Hi!

I have a Chorin pressure correction solver with a consistent pressure update, i.e.
the pressure solve is based on the Schur complement

E = -A10*ainv(A00)*A01

with A10 = divergence, A00 the mass matrix and A01 the gradient.

I have had this implemented with PETSc for a long time and it's working fine.
However, I've done the Schur complement manually, i.e. using a MatShell.
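
(For context, the manual version boils down to a MatShell whose multiply routine
chains the three factors; roughly like the sketch below, with hypothetical names,
not my actual code:

/* Rough sketch of the manual MatShell Schur complement (hypothetical names). */
typedef struct {
  Mat A01, A10;   /* gradient and divergence blocks        */
  KSP kspA00;     /* inner solver for the mass matrix A00  */
  Vec u, v;       /* velocity-sized work vectors           */
} SchurCtx;

static PetscErrorCode SchurMult(Mat E, Vec x, Vec y)
{
  SchurCtx *ctx;

  PetscFunctionBeginUser;
  PetscCall(MatShellGetContext(E, &ctx));
  PetscCall(MatMult(ctx->A01, x, ctx->u));          /* u = A01 x               */
  PetscCall(KSPSolve(ctx->kspA00, ctx->u, ctx->v)); /* v = inv(A00) u          */
  PetscCall(MatMult(ctx->A10, ctx->v, y));          /* y = A10 v               */
  PetscCall(VecScale(y, -1.0));                     /* y = -A10 inv(A00) A01 x */
  PetscFunctionReturn(0);
}

/* ... MatCreateShell(comm, nloc, nloc, N, N, &ctx, &E);
       MatShellSetOperation(E, MATOP_MULT, (void (*)(void))SchurMult); */
)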

I now wanted to see whether I could implement this using the PETSc facilities for 
the Schur complement, but I get a confusing error when I call 
MatSchurComplementGetPmat().

-----

Code snippet:

MatCreateSchurComplement(m_blocks[0], m_blocks[0], m_blocks[1], m_blocks[2],
                         nullptr, &E_operator);
< ... setup the KSP for A00 >
MatSchurComplementSetAinvType(E_operator, MAT_SCHUR_COMPLEMENT_AINV_DIAG);
MatView(E_operator, PETSC_VIEWER_STDOUT_WORLD);
MatSchurComplementGetPmat(E_operator, MAT_INITIAL_MATRIX, &E_pc);
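
(The elided KSP setup corresponds to the preonly + LU configuration visible in the
MatView output below; roughly something along these lines, via
MatSchurComplementGetKSP:

KSP kspA00;
PC  pcA00;
MatSchurComplementGetKSP(E_operator, &kspA00);
KSPSetType(kspA00, KSPPREONLY);
KSPGetPC(kspA00, &pcA00);
PCSetType(pcA00, PCLU);
KSPSetFromOptions(kspA00);
)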

-----

This yields the output (I cut out the matrix elements):
Mat Object: 1 MPI processes
  type: schurcomplement
  Schur complement A11 - A10 inv(A00) A01
  A11 = 0
  A10
    Mat Object: 1 MPI processes
      type: seqaij
  KSP of A00
    KSP Object: 1 MPI processes
      type: preonly
      maximum iterations=1000, initial guess is zero
      tolerances:  relative=1e-06, absolute=1e-20, divergence=1e+06
      left preconditioning
      using DEFAULT norm type for convergence test
    PC Object: 1 MPI processes
      type: lu
        out-of-place factorization
        tolerance for zero pivot 2.22045e-14
        matrix ordering: nd
        factor fill ratio given 5., needed 1.02768
          Factored matrix follows:
            Mat Object: 1 MPI processes
              type: seqaij
              rows=72, cols=72
              package used to perform factorization: petsc
              total: nonzeros=4752, allocated nonzeros=4752
                using I-node routines: found 22 nodes, limit used is 5
      linear system matrix = precond matrix:
      Mat Object: 1 MPI processes
        type: seqaij
        rows=72, cols=72
        total: nonzeros=4624, allocated nonzeros=5184
        total number of mallocs used during MatSetValues calls=0
          using I-node routines: found 24 nodes, limit used is 5
  A01
    Mat Object: 1 MPI processes
      type: seqaij

[0]PETSC ERROR: --------------------- Error Message 
--------------------------------------------------------------
[0]PETSC ERROR: Invalid argument
[0]PETSC ERROR: Wrong type of object: Parameter # 1
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.17.2, Jun 02, 2022
[0]PETSC ERROR: ../d/bin/Stokes on a linux-gnu-cxx-opt named akvalung by akva 
Fri Jun  3 14:48:06 2022
[0]PETSC ERROR: Configure options --with-mpi=0 --with-lapack-lib=-llapack 
--with-64-bit-indices=0 --with-shared-libraries=0 --COPTFLAGS=-O3 
--CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-blas-lib=-lblas --CFLAGS=-fPIC 
--CXXFLAGS=-fPIC --FFLAGS=-fPIC
[0]PETSC ERROR: #1 MatDestroy() at 
/home/akva/kode/petsc/petsc-3.17.2/src/mat/interface/matrix.c:1235
[0]PETSC ERROR: #2 MatCreateSchurComplementPmat() at 
/home/akva/kode/petsc/petsc-3.17.2/src/ksp/ksp/utils/schurm/schurm.c:763
[0]PETSC ERROR: #3 MatSchurComplementGetPmat_Basic() at 
/home/akva/kode/petsc/petsc-3.17.2/src/ksp/ksp/utils/schurm/schurm.c:785
[0]PETSC ERROR: #4 MatSchurComplementGetPmat() at 
/home/akva/kode/petsc/petsc-3.17.2/src/ksp/ksp/utils/schurm/schurm.c:835

where the error comes from the call to obtain the preconditioner matrix.
I don't see what I've done wrong; as far as I can see it's all following 
https://petsc.org/release/docs/manualpages/KSP/MatCreateSchurComplement.html#MatCreateSchurComplement


and
 
https://petsc.org/release/docs/manualpages/KSP/MatSchurComplementGetPmat.html#MatSchurComplementGetPmat

Looking into the code, it seems to try to call MatDestroy() on the Sp matrix, 
but since Sp has not been set up yet, this fails (schurm.c:763).
Removing that call as a test, it seems to succeed and I get the same solution 
as with my manual code.
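
(If it helps, a possible workaround that avoids patching PETSc, assuming
MAT_SCHUR_COMPLEMENT_AINV_DIAG means Sp = A11 - A10 inv(diag(A00)) A01 with
A11 = 0 here, would be to assemble that matrix explicitly myself; an untested
sketch:

Vec d;
Mat scaledA01, E_pc;
MatCreateVecs(m_blocks[0], &d, nullptr);
MatGetDiagonal(m_blocks[0], d);               /* d = diag(A00)                  */
VecReciprocal(d);                             /* d = 1 ./ diag(A00)             */
MatDuplicate(m_blocks[1], MAT_COPY_VALUES, &scaledA01);
MatDiagonalScale(scaledA01, d, nullptr);      /* inv(diag(A00)) * A01           */
MatMatMult(m_blocks[2], scaledA01, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &E_pc);
MatScale(E_pc, -1.0);                         /* E_pc = -A10 inv(diag(A00)) A01 */
MatDestroy(&scaledA01);
VecDestroy(&d);
)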

I'm sure I have done something stupid but I cannot see what, so any pointers 
would be appreciated.

cheers

arnem
