I have another question regarding MatCreateRedundantMatrix and MatCreateSubMatricesMPI. The former works for MPIAIJ and MPIDENSE and the latter only for MPIAIJ. Would it be possible to use MatCreateRedundantMatrix with a factored matrix and MatCreateSubMatricesMPI with dense and/or elemental matrices?
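For context, a minimal sketch of how the two routines are called for the currently supported types (illustrative only, not from the thread; the index sets, the error-checking style, and the choice of one sequential copy per rank are assumptions):

#include <petscmat.h>

/* Gather a full sequential copy of a parallel matrix A onto every rank
   (nsubcomm = communicator size gives one copy per process), and extract
   a submatrix living on a subcommunicator. isrow/iscol must be created
   by the caller. */
PetscErrorCode redundant_and_sub(Mat A, IS isrow, IS iscol)
{
  PetscErrorCode ierr;
  PetscMPIInt    size;
  Mat            Ared, *Asub;

  PetscFunctionBeginUser;
  ierr = MPI_Comm_size(PetscObjectComm((PetscObject)A), &size);CHKERRQ(ierr);
  ierr = MatCreateRedundantMatrix(A, size, MPI_COMM_NULL, MAT_INITIAL_MATRIX, &Ared);CHKERRQ(ierr);
  ierr = MatCreateSubMatricesMPI(A, 1, &isrow, &iscol, MAT_INITIAL_MATRIX, &Asub);CHKERRQ(ierr);
  /* ... use Ared and Asub[0] here ... */
  ierr = MatDestroy(&Ared);CHKERRQ(ierr);
  ierr = MatDestroySubMatrices(1, &Asub);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}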
Colin,
1) What equations are you solving?
2) In your second case, you set the outer ksp to preonly, so we cannot see the
ksp_monitor output for the (firedrake_0_) solver. Set it to gmres and see
whether you get output similar to your first case:
0 KSP preconditioned resid norm 4.985448866758e+00
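In option form, the suggestion is roughly the following (assuming firedrake_0_ is exactly the prefix of the outer solver, as the text above implies):

  -firedrake_0_ksp_type gmres
  -firedrake_0_ksp_monitor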
Got it. I could use -ksp_pc_side right to get the same behaviour with gmres: now it makes sense.
Thanks
Simone
From: Matthew Knepley
Sent: Monday, March 18, 2019 3:58:39 PM
To: Rossi, Simone
Cc: Justin Chang; petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] BJACOBI with FIELDSPLIT
Sorry, just to clarify, in the second case I see several *inner* iterations,
even though I'm using LU on a supposedly exact Schur complement as the
preconditioner for the Schur system.
From: petsc-users on behalf of Cotter, Colin J via petsc-users
Sent: 18 Mar
Dear petsc-users,
I'm solving a 2x2 block system, for which I can construct the Schur
complement analytically (through compatible FEM stuff),
which I can pass as the preconditioning matrix.
When using gmres on the outer iteration, and preonly+lu on the inner iterations
with a Schur complement
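For reference, the standard block factorisation behind a Schur-complement fieldsplit (textbook material, not quoted from the thread):

  \mathcal{A} = \begin{pmatrix} A & B \\ C & D \end{pmatrix},
  \qquad S = D - C A^{-1} B .

With exact solves for A and S, the full-factorisation variant reproduces \mathcal{A}^{-1} exactly, so the outer Krylov method should converge in a single iteration; the triangular variants give a preconditioned operator whose minimal polynomial has degree two, i.e. at most two outer iterations in exact arithmetic.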
On Mon, Mar 18, 2019 at 3:56 PM Rossi, Simone via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> To follow up on that: when would you want to use gmres instead of fgmres
> in the outer ksp?
>
The difference here is just that FGMRES is right-preconditioned by default,
so you do not get the extra
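The point about preconditioning sides, spelled out (standard KSP behaviour): left-preconditioned GMRES iterates on

  M^{-1} A x = M^{-1} b,

and its default monitor reports the preconditioned residual \| M^{-1}(b - A x_k) \|, while right-preconditioned GMRES and FGMRES iterate on

  A M^{-1} y = b, \qquad x = M^{-1} y,

so the monitor reports the true residual \| b - A x_k \|. That is why -ksp_pc_side right makes plain gmres report the same history as fgmres, which supports right preconditioning only.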
On Mon, Mar 18, 2019 at 3:18 PM Rossi, Simone via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> Dear all,
>
> I'm debugging my application in which I'm trying to use the FIELDSPLIT
> preconditioner for solving a 2x2 block matrix.
>
>
> Currently I'm testing the preconditioner on a decoupled system
To follow up on that: when would you want to use gmres instead of fgmres in the
outer ksp?
Thanks again for the help,
Simone
From: Rossi, Simone
Sent: Monday, March 18, 2019 3:43:04 PM
To: Justin Chang
Cc: Smith, Barry F.; petsc-users@mcs.anl.gov
Subject: Re: [
Thanks, using fgmres it does work as expected.
I thought gmres would do the same since I'm solving the subblocks "exactly".
Simone
From: Justin Chang
Sent: Monday, March 18, 2019 3:38:34 PM
To: Rossi, Simone
Cc: Smith, Barry F.; petsc-users@mcs.anl.gov
Subject
Use -ksp_type fgmres if your inner ksp solvers are gmres. Maybe that will
help?
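In option form the suggested change is just the outer Krylov method, leaving the fieldsplit setup untouched:

  -ksp_type fgmres

FGMRES tolerates a preconditioner that changes from one outer iteration to the next, which is exactly what happens when the inner solves are themselves GMRES iterations run to a tolerance rather than a fixed linear operator.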
On Mon, Mar 18, 2019 at 1:33 PM Rossi, Simone via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> Thanks Barry.
>
> Let me know if you can spot anything out of the ksp_view
>
>
> KSP Object: 1 MPI processes
>
> type
Thanks Barry.
Let me know if you can spot anything out of the ksp_view
KSP Object: 1 MPI processes
  type: gmres
    restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    happy breakdown tolerance 1e-30
  maximum iterations=5000, nonzero initial guess
Simone,
This is indeed surprising; given the block structure of the matrix and the
exact block solves, we'd expect the solver to converge after a single application
of the preconditioner. Please send the output of -ksp_view
Barry
Also if you are willing to share your test code we can try
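The expectation Barry describes can be written out: for the decoupled test problem the matrix is block diagonal,

  \mathcal{A} = \begin{pmatrix} A_1 & 0 \\ 0 & A_2 \end{pmatrix},

and with exact solves on both blocks any of the fieldsplit variants (additive or multiplicative) applies \mathcal{A}^{-1} exactly, so the preconditioned operator is the identity and the outer GMRES should converge in one iteration.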
Dear all,
I'm debugging my application in which I'm trying to use the FIELDSPLIT
preconditioner for solving a 2x2 block matrix.
Currently I'm testing the preconditioner on a decoupled system where I solve
two identical and independent Poisson problems. Using the default fieldsplit
type (multiplicative)
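A hypothetical reconstruction of that setup in run-time options (the fieldsplit_0_/fieldsplit_1_ prefixes are the PETSc defaults for an unnamed two-field split, and the inner tolerances are assumptions; only the overall shape, an outer gmres with fieldsplit and nearly exact inner solves, comes from the thread):

  -ksp_type gmres
  -pc_type fieldsplit
  -pc_fieldsplit_type multiplicative
  -fieldsplit_0_ksp_type gmres
  -fieldsplit_0_ksp_rtol 1e-12
  -fieldsplit_1_ksp_type gmres
  -fieldsplit_1_ksp_rtol 1e-12
  -ksp_view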