Thank you, Matt.
I rechecked and the mean value of the solution is 1.0e-15!!
My mistake: I was checking an integral average, which on an octree mesh is not 
exactly the same as the arithmetic mean.

Thanks again!

Marco Cisternino


From: Matthew Knepley <[email protected]>
Sent: Tuesday, March 22, 2022 15:22
To: Marco Cisternino <[email protected]>
Cc: Barry Smith <[email protected]>; [email protected]
Subject: Re: [petsc-users] Null space and preconditioners

On Tue, Mar 22, 2022 at 9:55 AM Marco Cisternino <[email protected]> wrote:
Thank you Barry!
No, there is no particular reason for FGMRES (some old tests showed shorter wall-times 
than GMRES), so I’m going to use GMRES.
I tried GMRES with GAMG, using PCSVD on the coarsest level, on real cases larger 
than the toy case. I cannot get a machine-epsilon mean value of the solution as 
in the toy case, but about 10e-5, which is much better than before. The rest of 
GAMG is in its default configuration. The mesh is much more complicated, being an 
octree with immersed geometries. I get the same behaviour using GMRES+ASM+ILU 
as well.

This does not seem possible. Setting the null space will remove the mean value. 
Something is not configured correctly here. Can you show -ksp_view
for this solve?

  Thanks,

    Matt

I cannot really see why the solution does not have zero mean value. I’m using 
rtol~1.0e-8, but I get the same mean value using rtol~1.0e-6; I tried 1.0e-12 too, 
and there is no improvement in the constant, which is definitely tiny but not zero.
Thanks.
Thanks.

Marco Cisternino

From: Barry Smith <[email protected]>
Sent: Monday, March 21, 2022 19:49
To: Marco Cisternino <[email protected]>
Cc: Mark Adams <[email protected]>; [email protected]
Subject: Re: [petsc-users] Null space and preconditioners
Subject: Re: [petsc-users] Null space and preconditioners


  Marco,

     I have confirmed your results.

     Urgg, it appears we have not documented this well. The removal of 
the null space only works for left-preconditioned solvers, and FGMRES only works 
with right preconditioning. Here is the reasoning.

     The Krylov space for left preconditioning is built from [r, BAr, (BA)^2 r, 
...] and the solution is built from this basis. If A has a null space spanned by 
n, then the left-preconditioned Krylov methods simply remove n from the "full" 
Krylov space after each application of the preconditioner B, so the resulting 
"reduced" Krylov space has no component in the direction of n; hence the solution 
built by GMRES naturally has no component in n.

     But with right preconditioning the Krylov space is [s, ABs, (AB)^2 s, ...]. 
We would need to remove B^-1 n from the Krylov space so that (A B) B^-1 n = 0. 
In general we have no way of applying B^-1 to a vector, so we cannot create the 
appropriate "reduced" Krylov space.
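
     Restating the argument in symbols (just a compact sketch of the above, 
using the same notation):

       K_left  = span{ r, BAr, (BA)^2 r, ... }   -> project n out of each new basis vector
       K_right = span{ s, ABs, (AB)^2 s, ... }   -> would need to project out B^-1 n,
                                                    since (AB)(B^-1 n) = A n = 0

     but B^-1 is not available as an operator, so that projection cannot be formed.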

     If I run with GMRES (which defaults to left preconditioning) and the options

       ./testPreconditioners -pc_type gamg -ksp_type gmres -ksp_monitor_true_residual -ksp_rtol 1.e-12 -ksp_view -mg_coarse_pc_type svd

  then the null space is handled correctly and the solution has Solution mean 
= 4.51028e-17

Is there any reason to use FGMRES instead of GMRES? You just cannot use GMRES 
as the smoother inside GAMG if you use GMRES on the outside, but for pressure 
equations you don't want to use such a strong smoother anyway.
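
  For example (a sketch only, using the standard PCMG/GAMG option names; the 
exact defaults may differ between PETSc versions), one could keep GMRES on the 
outside and explicitly select a lighter smoother on the levels:

       ./testPreconditioners -ksp_type gmres -pc_type gamg \
           -mg_levels_ksp_type chebyshev -mg_levels_pc_type sor \
           -mg_coarse_pc_type svd -ksp_monitor_true_residual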

  Barry

  I feel we should add some information to the user's manual on the removal of 
the null space when using right preconditioning, and maybe even add an error 
check in the code so that people don't fall into this trap. But I am not sure 
exactly what to do. When A and B are both symmetric, I think something special 
happens that doesn't require providing a null space; but I am not sure.





On Mar 21, 2022, at 12:41 PM, Marco Cisternino <[email protected]> wrote:

Thank you, Mark.
However, running my toy code with

  mpirun -n 1 ./testpreconditioner -pc_type gamg -pc_gamg_use_parallel_coarse_grid_solver -mg_coarse_pc_type jacobi -mg_coarse_ksp_type cg

I get 16 inf elements. Am I missing anything?

Thanks again

Marco Cisternino


From: Mark Adams <[email protected]>
Sent: Monday, March 21, 2022 17:31
To: Marco Cisternino <[email protected]>
Cc: [email protected]
Subject: Re: [petsc-users] Null space and preconditioners

And for GAMG you can use:

-pc_gamg_use_parallel_coarse_grid_solver -mg_coarse_pc_type jacobi 
-mg_coarse_ksp_type cg

Note that if you are using more than one MPI process you can use 'lu' instead of 
'jacobi'.

If GAMG converges fast enough, it can solve the problem before the constant creeps 
in, and then it works without cleaning the constant out in the KSP method.

On Mon, Mar 21, 2022 at 12:06 PM Mark Adams <[email protected]> wrote:
The solution for Neumann problems can "float away" if the constant is not 
controlled in some way because floating point errors can introduce it even if 
your RHS is exactly orthogonal to it.
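
For reference, the constant can also be projected out of a vector explicitly 
(a minimal sketch using the standard PETSc calls; I am assuming a null space 
object built from the constant, and that the right-hand-side Vec is called b):

  MatNullSpace nullsp;
  /* null space consisting of the constant vector only */
  MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nullsp);
  /* remove the component of b (or of the computed solution) along the constant */
  MatNullSpaceRemove(nullsp, b);
  MatNullSpaceDestroy(&nullsp);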

You should use a special coarse grid solver for GAMG but it seems to be working 
for you.

I have lost track of the simple way to have the KSP solver clean the constant 
out, which is what you want.

Can someone help Marco?

Mark





On Mon, Mar 21, 2022 at 8:18 AM Marco Cisternino <[email protected]> wrote:
Good morning,
I’m observing an unexpected (to me) behaviour of my code.
I tried to reduce the problem to a toy code, attached here.
The toy code archive contains a small main, a matrix and a RHS.
The toy code solves the linear system and checks the norms and the mean of the 
solution.
The problem encoded in the matrix and the RHS is the finite volume discretization 
of the pressure equation of an incompressible Navier-Stokes solver.
It has been cooked up to be as tiny as possible (16 cells!).
It is important to say that it is an elliptic problem with homogeneous Neumann 
boundary conditions only; for this reason the toy code sets a null space 
containing the constant (see the sketch below).
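
This is roughly how the constant null space is attached in the toy code (a 
minimal sketch, assuming the matrix is called A; the real code may differ in 
details):

  MatNullSpace constants;
  /* null space containing only the constant vector */
  MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &constants);
  /* attach it to the operator so the KSP can project it out */
  MatSetNullSpace(A, constants);
  MatNullSpaceDestroy(&constants);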

The unexpected (to me) behaviour becomes evident when launching the code with 
different preconditioners, using -pc_type <pctype>.
I tested PCNONE (“none”), PCGAMG (“gamg”) and PCILU (“ilu”). The default 
solver is KSPFGMRES.
Using the three PCs, I get 3 different solutions. It seems to me that they 
differ in the mean value, but GAMG is impressive.
PCNONE gives me the zero-mean solution I expected. What about the others?
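
The mean value check is essentially the following (a minimal sketch, assuming 
the solution vector is called x; the toy code may do it slightly differently):

  PetscScalar sum;
  PetscInt    n;
  VecSum(x, &sum);    /* sum of all entries of the solution */
  VecGetSize(x, &n);  /* global number of entries */
  PetscPrintf(PETSC_COMM_WORLD, "Solution mean = %g\n",
              (double)PetscRealPart(sum / (PetscReal)n));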

Asking for the residual monitor, the ratio ||r||/||b|| shows convergence for 
PCNONE and PCILU (~10^-16), but it stalls for PCGAMG (~10^-4).
I cannot see why. Am I doing anything wrong, or am I thinking incorrectly about 
the expected behaviour?
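
For reference, the monitoring is requested with the standard options (a sketch 
of the kind of run line I use):

  mpirun -n 1 ./testpreconditioner -pc_type gamg -ksp_monitor_true_residual -ksp_converged_reason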

Generalizing to larger meshes, the behaviour is similar.

Thank you for any help.

Marco Cisternino



--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
