> On Aug 6, 2021, at 5:26 PM, Alfredo J Duarte Gomez <aduar...@utexas.edu> 
> wrote:
> 
> Good morning,
> 
> I am currently working on a PETSc application that will require a 
> preconditioner that uses several block matrices.
> 
> For now, I have a simple problem that I am solving on a DMDA structured 
> grid with two fields. For presentation purposes (I know PETSc does not use 
> this ordering), let's assume a vector ordering [u1,u2,...,uN,v1,v2,...,vN], 
> where u and v are my two fields with N grid points. The coupling 
> between these two fields is weak enough that an efficient preconditioner can 
> be formed as the matrix P = [A1, 0; 0, A2], where A1 (dependent on u only) and 
> A2 (dependent on v only) are block matrices of size NxN. Therefore, I only 
> require two linear solves of the reduced systems.
> 
> I am passing the preconditioner matrix P in the Jacobian function, and I hope 
> this strategy is what I am telling PETSc to do with the following block of 
> code:
> 
> ierr = SNESGetKSP(snes,&ksp);CHKERRQ(ierr);
> ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
> ierr = PCSetType(pc,PCFIELDSPLIT);CHKERRQ(ierr);
> ierr = DMCreateFieldDecomposition(dau,NULL,NULL,&fields,NULL);CHKERRQ(ierr);
> ierr = PCFieldSplitSetIS(pc,NULL,fields[0]);CHKERRQ(ierr);
> ierr = PCFieldSplitSetIS(pc,NULL,fields[1]);CHKERRQ(ierr);
> 
> Is this what is actually happening, or is the split also including some of 
> the zero blocks on P? 

You should also call PCFieldSplitSetType(pc,PC_COMPOSITE_ADDITIVE); then the 
preconditioned problem will look like

    [ KSPSolve(A1)       0        ] [ A1   A12 ]
    [      0        KSPSolve(A2)  ] [ A21  A22 ]

in other words, the preconditioner is

    approximate inverse of [ A1   0  ]
                           [ 0    A2 ]

the computation is done efficiently and never uses the zero blocks. 
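Putting the pieces together, a minimal sketch of the additive setup (assuming a SNES named `snes` and the DMDA `dau` from your snippet; the loop over fields is a generalization of your two explicit calls):

```c
/* Sketch: block-diagonal (additive) PCFIELDSPLIT setup.
   Assumes an existing SNES `snes` and DMDA `dau`. */
KSP            ksp;
PC             pc;
IS             *fields;
PetscInt       i, nfields;
PetscErrorCode ierr;

ierr = SNESGetKSP(snes,&ksp);CHKERRQ(ierr);
ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
ierr = PCSetType(pc,PCFIELDSPLIT);CHKERRQ(ierr);
/* additive => only the diagonal blocks of P are used */
ierr = PCFieldSplitSetType(pc,PC_COMPOSITE_ADDITIVE);CHKERRQ(ierr);
ierr = DMCreateFieldDecomposition(dau,&nfields,NULL,&fields,NULL);CHKERRQ(ierr);
for (i = 0; i < nfields; i++) {
  ierr = PCFieldSplitSetIS(pc,NULL,fields[i]);CHKERRQ(ierr);
}
/* the index sets are no longer needed once handed to the PC */
for (i = 0; i < nfields; i++) {ierr = ISDestroy(&fields[i]);CHKERRQ(ierr);}
ierr = PetscFree(fields);CHKERRQ(ierr);
```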

The default PCFieldSplitType is PC_COMPOSITE_MULTIPLICATIVE, where the 
preconditioned system looks like

    approximate inverse of [ A1    0  ]
                           [ A21   A2 ]

The preconditioner is applied efficiently by first (approximately) solving with 
A1, then applying A21 to that (approximate) solution, subtracting the result 
from the right-hand side of the second block, and then (approximately) solving 
with A2. 

Note that if you are using DMDA you don't need to write the above code: you can 
use -pc_type fieldsplit -pc_fieldsplit_type additive and it will split the 
fields as you would like.
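For the options-only route, the whole setup reduces to command-line flags (the executable name `./app` is a placeholder for your program):

```shell
# PCFIELDSPLIT picks up the two DMDA fields automatically;
# -snes_view prints the solver tree so you can confirm what was built.
./app -pc_type fieldsplit -pc_fieldsplit_type additive -snes_view
```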


> 
> Second, for a future application, I will need a slightly more complicated 
> strategy. It will require solving a matrix similar to P as specified above 
> with more fields (block diagonal in the fields), and then using the answer 
> to those independent systems for smaller local solves. In summary, if I 
> have M fields and N grid points, I will solve M systems of size N, 
> followed by using the solution as the right-hand side to solve N systems 
> of size M.

    It sounds like you want a block-diagonal preconditioner (with one block per 
field) in one ordering, then a change of ordering and another block-diagonal 
preconditioner with one block for each grid point. PCFIELDSPLIT cannot do this 
by itself since it basically works with a single ordering.

  You might be able to combine multiple preconditioners using PCCOMPOSITE that 
does a PCFIELDSPLIT then a PCPBJACOBI. Again you have a choice of additive or 
multiplicative formulation. You should not need to use PCSHELL. In fact you 
should not have to write any code; you should be able to control it completely 
from the options database with, maybe, -pc_type composite -pc_composite_pcs 
fieldsplit,pbjacobi -pc_composite_type additive

You can also control the solvers used on the inner solves from the options 
database. If you run with -ksp_view it will show the options prefix for each 
inner solve, and you can use those prefixes to control the field solvers, for 
example, using GAMG for one of the fields in the PCFIELDSPLIT.
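As a sketch, the composite variant might look like the following (the exact option prefixes here are a guess at the usual defaults; run with -ksp_view and use whatever prefixes it reports, and `./app` is again a placeholder):

```shell
# Composite preconditioner: a fieldsplit stage followed by point-block Jacobi.
# Inner PCs of PCCOMPOSITE typically get sub_0_, sub_1_, ... prefixes.
./app -pc_type composite -pc_composite_pcs fieldsplit,pbjacobi \
      -pc_composite_type additive \
      -sub_0_pc_fieldsplit_type additive \
      -sub_0_fieldsplit_0_pc_type gamg \
      -ksp_view
```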


> 
> Is this something that PCFIELDSPLIT can accommodate? Or will I have to 
> implement my own PCSHELL?
> 
> Thank you,
> 
> -Alfredo
> 
> -- 
> Alfredo Duarte
> Graduate Research Assistant
> The University of Texas at Austin
