Thank you, Mark.
However, doing this with my toy code
mpirun -n 1 ./testpreconditioner -pc_type gamg 
-pc_gamg_use_parallel_coarse_grid_solver -mg_coarse_pc_type jacobi 
-mg_coarse_ksp_type cg

I get 16 inf elements in the solution. Am I missing anything?

Thanks again

Marco Cisternino


From: Mark Adams <[email protected]>
Sent: Monday, 21 March 2022 17:31
To: Marco Cisternino <[email protected]>
Cc: [email protected]
Subject: Re: [petsc-users] Null space and preconditioners

And for GAMG you can use:

-pc_gamg_use_parallel_coarse_grid_solver -mg_coarse_pc_type jacobi 
-mg_coarse_ksp_type cg

Note that if you are using more than one MPI process you can use 'lu' instead of 
'jacobi'.

If GAMG converges fast enough, it can solve before the constant creeps in, and it 
then works without any cleaning in the KSP method.
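
If the command line is not convenient, those same options can also be set in the 
code before KSPSetFromOptions() is called; a rough sketch, assuming a recent PETSc 
where PetscOptionsSetValue() takes the options database as its first argument 
(ksp here is just a placeholder for your solver object):

  /* programmatic equivalent of the command-line flags above (sketch) */
  PetscOptionsSetValue(NULL, "-pc_gamg_use_parallel_coarse_grid_solver", NULL);
  PetscOptionsSetValue(NULL, "-mg_coarse_pc_type", "jacobi");
  PetscOptionsSetValue(NULL, "-mg_coarse_ksp_type", "cg");
  /* ... later, KSPSetFromOptions(ksp) picks these up */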

On Mon, Mar 21, 2022 at 12:06 PM Mark Adams <[email protected]> wrote:
The solution for Neumann problems can "float away" if the constant is not 
controlled in some way, because floating-point errors can introduce it even if 
your RHS is exactly orthogonal to it.

You should use a special coarse grid solver for GAMG but it seems to be working 
for you.

I have lost track of the simple way to have the KSP solver clean the constant 
out, which is what you want.
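
If I remember right, it goes through the null space attached to the operator, so 
that the Krylov method projects the constant out during the solve; a minimal 
sketch, assuming A is your assembled matrix and b your RHS (the names are 
placeholders, not your exact code):

  MatNullSpace nullsp;
  MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nullsp); /* null space = span{constant} */
  MatSetNullSpace(A, nullsp);      /* KSP then removes the constant from the solution */
  MatNullSpaceRemove(nullsp, b);   /* optionally make the RHS consistent as well */
  MatNullSpaceDestroy(&nullsp);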

Can someone help Marco?

Mark





On Mon, Mar 21, 2022 at 8:18 AM Marco Cisternino <[email protected]> wrote:
Good morning,
I’m observing an unexpected (to me) behaviour of my code.
I tried to reduce the problem to a toy code, attached here.
The toy code archive contains a small main, a matrix and a rhs.
The toy code solves the linear system and checks the norms and the mean of the 
solution.
The problem in the matrix and the RHS is the finite volume discretization of 
the pressure equation of an incompressible Navier-Stokes solver.
It has been made as tiny as possible (16 cells!).
It is important to say that it is an elliptic problem with homogeneous Neumann 
boundary conditions only; for this reason the toy code sets a null space 
containing the constant.
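
For reference, a quick way to double-check that the constant really is in the 
kernel of the matrix and that the RHS is compatible looks roughly like this 
(a sketch, not the exact toy code; A, b and nullsp stand for the matrix, RHS and 
null space the toy code already creates):

  PetscBool   isNull;
  PetscScalar bsum;
  MatNullSpaceTest(nullsp, A, &isNull);  /* checks that A applied to the constant is ~0 */
  VecSum(b, &bsum);                      /* with a constant null space, sum(b) should be ~0 */
  PetscPrintf(PETSC_COMM_WORLD, "null space ok: %d  sum(b) = %g\n",
              (int)isNull, (double)PetscRealPart(bsum));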

The unexpected (to me) behaviour is evident when launching the code with 
different preconditioners, using -pc_type <pctype>.
I tested PCNONE (“none”), PCGAMG (“gamg”) and PCILU (“ilu”). The default 
solver is KSPFGMRES.
Using the three PCs, I get three different solutions. It seems to me that they 
differ in their mean value, but GAMG is impressive.
PCNONE gives me the zero-mean solution I expected. What about the others?
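
For reference, the mean can be computed and subtracted from a solution to 
compare the different runs, roughly like this (a sketch with placeholder names; 
x is the solution Vec):

  PetscScalar xsum;
  PetscInt    n;
  VecSum(x, &xsum);
  VecGetSize(x, &n);
  VecShift(x, -xsum / (PetscReal)n);  /* remove the mean so solutions from different PCs are comparable */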

Monitoring the residuals, the ratio ||r||/||b|| shows convergence for PCNONE 
and PCILU (~10^-16), but it stalls for PCGAMG (~10^-4).
I cannot see why. Am I doing anything wrong, or am I thinking incorrectly about 
the expected behaviour?

Generalizing to larger meshes, the behaviour is similar.

Thank you for any help.

Marco Cisternino

