Matt:

> On Tue, Oct 27, 2015 at 11:13 AM, Hong <[email protected]> wrote:
>
>> Gary:
>> I tested your mat.bin using
>> petsc/src/ksp/ksp/examples/tutorials/ex10.c
>>
>>   ./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor_true_residual -ksp_view
>>   ...
>>   Mat Object: 1 MPI processes
>>     type: seqaij
>>     rows=588, cols=588
>>     total: nonzeros=11274, allocated nonzeros=11274
>>     total number of mallocs used during MatSetValues calls =0
>>       using I-node routines: found 291 nodes, limit used is 5
>>   Number of iterations = 0
>>   Residual norm 24.2487
>>
>> It does not converge, nor does it hang.
>> This is the default GMRES/ILU.
>>
>> Hong
>> As you said, the matrix is non-singular, so LU gives a solution:
>>
>>   ./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor_true_residual -pc_type lu
>>   0 KSP preconditioned resid norm 3.298891225772e+03 true resid norm 2.424871130596e+01 ||r(i)||/||b|| 1.000000000000e+00
>>   1 KSP preconditioned resid norm 1.918157196467e-12 true resid norm 5.039404549028e-13 ||r(i)||/||b|| 2.078215409241e-14
>>   Number of iterations = 1
>>   Residual norm < 1.e-12
>>
>> Is this the same matrix as you mentioned?
>
> Hong, could you run ILU on it as well?
>
> Thanks,
>
>    Matt
>
>> Hong
>>
>>> On Tue, Oct 27, 2015 at 9:10 AM, Matthew Knepley <[email protected]> wrote:
>>>
>>> On Tue, Oct 27, 2015 at 9:06 AM, Gary Rebt <[email protected]> wrote:
>>>
>>> Dear petsc-users,
>>>
>>> While using the FEniCS package to solve a simple Stokes flow problem, I
>>> have run into problems with PETSc preconditioners. In particular, I would
>>> like to use ILU (no parallel version) along with GMRES to solve my linear
>>> system, but the solver just hangs indefinitely
>>> at MatLUFactorNumeric_SeqAIJ_Inode without outputting anything. CPU usage
>>> is at 100%, but even for a tiny system (59x59 for the minimal test case),
>>> the solver does not manage to push through it after 30 minutes.
>>>
>>> PETSc version is 3.6 and the matrix for the minimal test case is here:
>>> http://pastebin.com/t3fvdkaS
>>>
>>> Hanging is a bug. We will check it out.
>>>
>>> I do not have any way to read in this ASCII. Can you output a binary
>>> version with
>>>
>>>   -mat_view binary:mat.bin
>>>
>>> Thanks,
>>>
>>>    Matt
>>>
>>> It contains zero diagonal entries and has a condition number of around
>>> 1e3, but it is definitely non-singular. Direct solvers manage to solve
>>> the system, as does GMRES without a preconditioner (although after many
>>> iterations for a 59x59 system).
>>>
>>> This will never work. Direct solvers work because they pivot away the
>>> zeros, but ILU is defined by having no pivoting.
>>>
>>> Thanks,
>>>
>>>    Matt
>>>
>>> Playing with the available options at
>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCILU.html
>>> did not solve the issue (even after activating diagonal_fill and/or
>>> nonzeros_along_diagonal), although sometimes error 71 is returned, which
>>> stands for "zero pivot detected". Are there other options that I have not
>>> considered? The default ILU factorization in MATLAB returns satisfactory
>>> results without errors, so surely it must be possible with PETSc?
>>>
>>> As for the choice of ILU, I agree it might be suboptimal in this setting,
>>> but I do need it for benchmarking purposes.
>>>
>>> Best regards,
>>>
>>> Gary
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which
>>> their experiments lead.
>>> -- Norbert Wiener
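Matt's point about pivoting can be seen in a few lines of NumPy. This is an illustrative sketch, not PETSc code: the function `lu_no_pivot` and the tiny matrix `K` are made up for the example. Like ILU, the factorization below fixes the elimination order up front, so a zero diagonal entry (typical of Stokes saddle-point systems) becomes a zero pivot, while a direct solver with partial pivoting (`numpy.linalg.solve`) handles the same matrix without trouble.

```python
import numpy as np

def lu_no_pivot(A):
    """In-place LU factorization WITHOUT pivoting.

    This is the position ILU is in: the elimination order is fixed in
    advance, so a zero on the diagonal cannot be swapped away and the
    factorization breaks down.
    """
    A = A.astype(float).copy()
    n = A.shape[0]
    for k in range(n):
        if A[k, k] == 0.0:
            raise ZeroDivisionError(f"zero pivot at step {k}")
        A[k+1:, k] /= A[k, k]                              # multipliers (L)
        A[k+1:, k+1:] -= np.outer(A[k+1:, k], A[k, k+1:])  # Schur update
    return A

# A tiny saddle-point-style matrix: non-singular, well conditioned,
# but with a zero on the diagonal -- like the 59x59 Stokes matrix.
K = np.array([[0.0, 1.0],
              [1.0, 2.0]])

try:
    lu_no_pivot(K)
except ZeroDivisionError as e:
    print("factorization broke down:", e)

# A direct solver pivots the zero away and has no such problem:
x = np.linalg.solve(K, np.array([1.0, 0.0]))
print(x)  # -> [-2.  1.]
```

Reordering the matrix so the diagonal is nonzero (what `-pc_factor_nonzeros_along_diagonal` attempts) lets the no-pivot factorization go through; whether the resulting incomplete factors are a *useful* preconditioner is a separate question.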
