Hello Konrad,

I think PETSc's PCILU does not work in parallel. You could use the 
preconditioner PETScWrappers::PreconditionBlockJacobi, which applies an ILU 
preconditioner to the block owned by each process.

https://www.dealii.org/current/doxygen/deal.II/classPETScWrappers_1_1PreconditionBlockJacobi.html
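A minimal, untested sketch of what that could look like in a deal.II program 
(system_matrix, solution, system_rhs and mpi_communicator are placeholders for 
your own objects, and CG is just for illustration; for a non-symmetric system 
you would use GMRES instead):

    #include <deal.II/lac/petsc_precondition.h>
    #include <deal.II/lac/petsc_solver.h>
    #include <deal.II/lac/solver_control.h>

    // ... inside your solve() routine:
    dealii::SolverControl solver_control(1000, 1e-10 * system_rhs.l2_norm());

    // Block Jacobi: PETSc by default applies an ILU(0) factorization to the
    // diagonal block owned by each MPI process, so it works in parallel.
    dealii::PETScWrappers::PreconditionBlockJacobi preconditioner(system_matrix);

    dealii::PETScWrappers::SolverCG solver(solver_control, mpi_communicator);
    solver.solve(system_matrix, solution, system_rhs, preconditioner);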

You could also use HYPRE's parallel ILU through PETSc if you have it installed.

https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCHYPRE.html#PCHYPRE
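As far as I know, deal.II does not provide a wrapper class for hypre's parallel 
ILU (Euclid), so you would have to select it through PETSc itself, either with 
the command-line options -pc_type hypre -pc_hypre_type euclid or 
programmatically on the KSP/PC handle. A rough, untested sketch of the PETSc 
side (the configure options in your error output already show 
--download-hypre=yes, so hypre itself should be available):

    #include <petscksp.h>

    // Plain PETSc sketch (not deal.II): switch the preconditioner of an
    // existing KSP to hypre's Euclid, a parallel ILU.
    PetscErrorCode use_hypre_euclid(KSP ksp)
    {
      PC             pc;
      PetscErrorCode ierr;

      ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
      ierr = PCSetType(pc, PCHYPRE); CHKERRQ(ierr);        // needs PETSc built with hypre
      ierr = PCHYPRESetType(pc, "euclid"); CHKERRQ(ierr);  // "pilut" is another parallel ILU-type option
      return 0;
    }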

 Or any other preconditioner that works in parallel. 

On Friday, September 20, 2019 at 10:16:05 UTC+2, Konrad Simon wrote:
>
> Dear deal.ii community,
>
> I am using deal.II with PETSc and Trilinos. However, when I am using the 
> PETSc PreconditionILU I get an error that suggests that a solver package is 
> missing (with Trilinos it works). PETSc's PreconditionAMG works fine 
> (although not very efficiently for my problem). 
> Do I need to do any special configuration steps for PETSc? I followed the 
> instructions documented on the deal.II page on how to configure 
> PETSc: https://www.dealii.org/current/external-libs/petsc.html
>
> Best,
> Konrad
>
> This is the error:
>
> Running using PETSc. 
> Number of active cells: 262144 
> Total number of cells: 24865 (on 7 levels) 
> Number of degrees of freedom: 1609920 (811200+798720) 
> [0]PETSC ERROR: --------------------- Error Message 
> -------------------------------------------------------------- 
> [0]PETSC ERROR: See 
> http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for 
> possible LU and Cholesky solvers 
> [0]PETSC ERROR: Could not locate a solver package. Perhaps you must 
> ./configure with --download-<package> 
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
> for trouble shooting. 
> [0]PETSC ERROR: Petsc Release Version 3.9.4, Sep, 11, 2018  
> [0]PETSC ERROR: main on a x86_64 named thunder5 by u290231 Fri Sep 20 
> 10:00:23 2019 
> [0]PETSC ERROR: Configure options --with-shared-libraries=1 --with-x=0 
> --with-mpi=yes --download-hypre=yes --with-64-bit-indices 
> --with-debugging=yes --with-hypre=yes 
> [0]PETSC ERROR: #1 MatGetFactor() line 4328 in 
> /scratch/cen/numgeo/lib/petsc-3.9.4/src/mat/interface/matrix.c 
> [0]PETSC ERROR: #2 PCSetUp_ILU() line 142 in 
> /scratch/cen/numgeo/lib/petsc-3.9.4/src/ksp/pc/impls/factor/ilu/ilu.c 
> [0]PETSC ERROR: #3 PCSetUp() line 923 in 
> /scratch/cen/numgeo/lib/petsc-3.9.4/src/ksp/pc/interface/precon.c 
> --------------------------------------------------------- 
> TimerOutput objects finalize timed values printed to the 
> screen by communicating over MPI in their destructors. 
> Since an exception is currently uncaught, this 
> synchronization (and subsequent output) will be skipped to 
> avoid a possible deadlock. 
> --------------------------------------------------------- 
> WARNING! There are options you set that were not used! 
> WARNING! could be spelling mistake, etc! 
> Option left: name:-p value: ../MsFEComplex/parameter.in 
> ERROR: Uncaught exception in MPI_InitFinalize on proc 0. Skipping 
> MPI_Finalize() to avoid a deadlock. 
>
>
> ---------------------------------------------------- 
> Exception on processing:  
>
> -------------------------------------------------------- 
> An error occurred in line <421> of file 
> </scratch/cen/numgeo/lib_compile/dealii-9.1.1/source/lac/petsc_precondition.cc>
>  
> in function 
>    void dealii::PETScWrappers::PreconditionILU::initialize(const 
> dealii::PETScWrappers::MatrixBase&, const 
> dealii::PETScWrappers::PreconditionILU::AdditionalData&) 
> The violated condition was:  
>    ierr == 0 
> Additional information:  
> deal.II encountered an error while calling a PETSc function. 
> The description of the error provided by PETSc is "See 
> http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for 
> possible LU and Cholesky solvers". 
> The numerical value of the original error code is 92. 
> -------------------------------------------------------- 
>
> Aborting! 
> ---------------------------------------------------- 
> -------------------------------------------------------------------------- 
> Primary job  terminated normally, but 1 process returned 
> a non-zero exit code. Per user-direction, the job has been aborted. 
> -------------------------------------------------------------------------- 
> -------------------------------------------------------------------------- 
> mpirun detected that one or more processes exited with non-zero status, 
> thus causing 
> the job to be terminated. The first process to do so was: 
>
>  Process name: [[37149,1],0] 
>  Exit code:    1 
> --------------------------------------------------------------------------
>
