Dear Hamed,

I think you can only use the TrilinosWrappers::SolverDirect classes with 
TrilinosWrappers::MPI::Vector and, of course, TrilinosWrappers::Vector.
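
For what it's worth, here is a minimal sketch of how I would call it; the 
matrix, vector, and communicator names are placeholders for your own 
assembled objects:

  #include <deal.II/lac/solver_control.h>
  #include <deal.II/lac/trilinos_solver.h>
  #include <deal.II/lac/trilinos_sparse_matrix.h>
  #include <deal.II/lac/trilinos_vector.h>

  // system_matrix       : TrilinosWrappers::SparseMatrix
  // solution, system_rhs : TrilinosWrappers::MPI::Vector
  SolverControl solver_control;
  TrilinosWrappers::SolverDirect::AdditionalData data(
    /*output_solver_details=*/false, /*solver_type=*/"Amesos_Klu");
  TrilinosWrappers::SolverDirect solver(solver_control, data);
  solver.solve(system_matrix, solution, system_rhs);

As far as I know, "Amesos_Klu" is the default solver_type; other strings 
only work if the corresponding Amesos package was enabled in your Trilinos 
build.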

Unfortunately, I have no experience with PETScWrappers::SparseDirectMUMPS 
<https://dealii.org/8.4.1/doxygen/deal.II/classPETScWrappers_1_1SparseDirectMUMPS.html>,
and I don't think we have included it in candi so far.
(You need to build MUMPS against your MPI compiler and point PETSc to it 
during the PETSc installation.)
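
Judging from the documentation, the calling sequence should look roughly 
like the following, but I have not tested this myself (again, the matrix, 
vector, and communicator names are placeholders):

  #include <deal.II/lac/solver_control.h>
  #include <deal.II/lac/petsc_solver.h>
  #include <deal.II/lac/petsc_parallel_sparse_matrix.h>
  #include <deal.II/lac/petsc_parallel_vector.h>

  // system_matrix       : PETScWrappers::MPI::SparseMatrix
  // solution, system_rhs : PETScWrappers::MPI::Vector
  SolverControl solver_control;
  PETScWrappers::SparseDirectMUMPS solver(solver_control, mpi_communicator);
  solver.set_symmetric_mode(false); // set to true for a symmetric system matrix
  solver.solve(system_matrix, solution, system_rhs);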

Best regards
  Uwe


On Tuesday, October 25, 2016 at 3:17:39 AM UTC+2, Hamed Babaei wrote:
>
> Hi friends,
>
> I am parallelizing a code similar to step-44, in which it is possible to 
> use either an iterative solver, SolverCG, or a direct solver, 
> SparseDirectUMFPACK. I have used the latter in the non-parallel code and 
> it works great.
> With iterative solvers like SolverCG I have convergence problems, so I 
> want to try a direct solver that works in parallel. My problem is that 
> my code doesn't recognize PETScWrappers::SparseDirectMUMPS 
> <https://dealii.org/8.4.1/doxygen/deal.II/classPETScWrappers_1_1SparseDirectMUMPS.html>
> nor TrilinosWrappers::SolverDirect 
> <https://www.dealii.org/8.4.0/doxygen/deal.II/classTrilinosWrappers_1_1SolverDirect.html>.
> I have installed deal.II and all of its dependencies (PETSc, Trilinos, 
> p4est, ...) via candi (https://github.com/koecher/candi). I was 
> wondering which direct solver works the same way as SparseDirectUMFPACK, 
> and how to make deal.II recognize them.
>
> Thanks,
> Hamed
>
