Interesting! And that would fit into configuring PFLOTRAN via its input decks
(i.e., we could also provide ASM instead of Block Jacobi).
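
For example (a sketch only, reusing the option names from Barry's example
below with PETSc's PCASM in place of PCBJACOBI), the input deck would then
map to

   -pc_composite_pcs galerkin,asm

instead of

   -pc_composite_pcs galerkin,bjacobi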

Thanks a lot!


On 28/06/17 17:31, Barry Smith wrote:
>> On Jun 28, 2017, at 2:07 AM, Robert Annewandter <robert.annewand...@opengosim.com> wrote:
>>
>> Thank you Barry!
>>
>> We'd like to hard-wire it into PFLOTRAN, with CPR-AMG Block Jacobi two-stage 
>> preconditioning potentially becoming the standard solver strategy. 
>    Understood. Note that you can embed the options into the program with 
> PetscOptionsSetValue() so they don't need to be on the command line.
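>
>    A minimal sketch of that, in PFLOTRAN's Fortran (assuming a PETSc version 
> where PetscOptionsSetValue() takes the options database as its first 
> argument, PETSC_NULL_OPTIONS for the global one; older releases omit that 
> argument). The values must be in the database before KSPSetFromOptions() 
> is called:
>
>    call PetscOptionsSetValue(PETSC_NULL_OPTIONS, '-pc_type', 'composite', ierr); CHKERRQ(ierr)
>    call PetscOptionsSetValue(PETSC_NULL_OPTIONS, '-pc_composite_type', 'multiplicative', ierr); CHKERRQ(ierr)
>    call PetscOptionsSetValue(PETSC_NULL_OPTIONS, '-pc_composite_pcs', 'galerkin,bjacobi', ierr); CHKERRQ(ierr)
>    call KSPSetFromOptions(solver%ksp, ierr); CHKERRQ(ierr)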
>
>     Barry
>
>> Using the options database is a great starting point for reverse-engineering the issue!
>>
>> Thanks!
>> Robert
>>
>>
>>
>>
>> On 27/06/17 23:45, Barry Smith wrote:
>>>    It is difficult, if not impossible at times, to get all the options where 
>>> you want them using the function-call interface. On the other hand, it is 
>>> generally easy (if there are no inner PCSHELLs) to do this via the options 
>>> database:
>>>
>>>    -pc_type composite
>>>    -pc_composite_type multiplicative
>>>    -pc_composite_pcs galerkin,bjacobi
>>>
>>>    -sub_0_galerkin_ksp_type preonly
>>>    -sub_0_galerkin_pc_type none
>>>
>>>    -sub_1_sub_pc_factor_shift_type inblocks
>>>    -sub_1_sub_pc_factor_zero_pivot zpiv
>>>
>>>
>>>
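>>>    The prefixes compose mechanically: each sub-PC of the composite gets 
>>> sub_<n>_, PCGALERKIN's inner solver appends galerkin_, and the block 
>>> solvers inside PCBJACOBI append sub_. A sketch of the same options on the 
>>> command line (executable name taken from the logs below; -ksp_view added 
>>> only to verify the resulting nested configuration):
>>>
>>>    mpiexec -n 2 ./pflotran -pc_type composite -pc_composite_type multiplicative \
>>>        -pc_composite_pcs galerkin,bjacobi -ksp_view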
>>>
>>>> On Jun 27, 2017, at 11:24 AM, Robert Annewandter <robert.annewand...@opengosim.com> wrote:
>>>>
>>>> Dear PETSc folks,
>>>>
>>>>
>>>> I want a Block Jacobi PC to be the second PC in a two-stage 
>>>> preconditioning scheme implemented via multiplicative PCCOMPOSITE, with 
>>>> the outermost KSP an FGMRES.
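>>>>
>>>> For clarity, the intended nesting (matching the code below) is:
>>>>
>>>>   FGMRES (outer KSP)
>>>>     PCCOMPOSITE, multiplicative
>>>>       sub 0: PCGALERKIN -> inner KSPPREONLY + PCNONE (stand-in for testing)
>>>>       sub 1: PCKSP -> inner KSPPREONLY + PCBJACOBI -> per-block sub-KSPs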
>>>>
>>>>
>>>> However, PCBJacobiGetSubKSP (
>>>> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCBJacobiGetSubKSP.html#PCBJacobiGetSubKSP
>>>> ) requires KSPSetUp (or PCSetUp) to be called first on its parent KSP, 
>>>> which I am struggling to do successfully. I wonder which KSP (or PC) that is.
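>>>>
>>>> For reference, a minimal sketch of the call order the man page asks for 
>>>> (names are placeholders; it assumes the operator has already been attached 
>>>> with KSPSetOperators, without which KSPSetUp cannot complete):
>>>>
>>>> call KSPSetOperators(parent_ksp, A, A, ierr); CHKERRQ(ierr)
>>>> call KSPSetUp(parent_ksp, ierr); CHKERRQ(ierr)
>>>> call PCBJacobiGetSubKSP(bjac_pc, nsub, first, sub_ksps, ierr); CHKERRQ(ierr)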
>>>>
>>>>
>>>> This is how I attempt to do it (using PCKSP to provide a parent KSP for 
>>>> PCBJacobiGetSubKSP):
>>>>
>>>>
>>>> call KSPGetPC(solver%ksp, solver%pc, ierr); CHKERRQ(ierr)
>>>> call PCSetType(solver%pc, PCCOMPOSITE, ierr); CHKERRQ(ierr)
>>>> call PCCompositeSetType(solver%pc, PC_COMPOSITE_MULTIPLICATIVE, ierr); CHKERRQ(ierr)
>>>>
>>>>
>>>> ! 1st Stage 
>>>> call PCCompositeAddPC(solver%pc, PCGALERKIN, ierr); CHKERRQ(ierr)
>>>> call PCCompositeGetPC(solver%pc, 0, T1, ierr); CHKERRQ(ierr)
>>>>
>>>>
>>>> ! KSPPREONLY-PCNONE for testing
>>>> call PCGalerkinGetKSP(T1, Ap_ksp, ierr); CHKERRQ(ierr)
>>>> call KSPSetType(Ap_ksp, KSPPREONLY, ierr); CHKERRQ(ierr)
>>>> call KSPGetPC(Ap_ksp, Ap_pc, ierr); CHKERRQ(ierr)
>>>> call PCSetType(Ap_pc, PCNONE, ierr); CHKERRQ(ierr)
>>>>
>>>>
>>>> ! 2nd Stage
>>>> call PCCompositeAddPC(solver%pc, PCKSP, ierr); CHKERRQ(ierr)
>>>> call PCCompositeGetPC(solver%pc, 1, T2, ierr); CHKERRQ(ierr)
>>>> call PCKSPGetKSP(T2, BJac_ksp, ierr); CHKERRQ(ierr)
>>>> call KSPSetType(BJac_ksp, KSPPREONLY, ierr); CHKERRQ(ierr)
>>>> call KSPGetPC(BJac_ksp, BJac_pc, ierr); CHKERRQ(ierr)
>>>> call PCSetType(BJac_pc, PCBJACOBI, ierr); CHKERRQ(ierr)
>>>>
>>>>
>>>> call KSPSetUp(solver%ksp, ierr); CHKERRQ(ierr)
>>>> ! call KSPSetUp(BJac_ksp, ierr); CHKERRQ(ierr)
>>>> ! call PCSetUp(T2, ierr); CHKERRQ(ierr)
>>>> ! call PCSetUp(BJac_pc, ierr); CHKERRQ(ierr)
>>>>
>>>>
>>>> call PCBJacobiGetSubKSP(BJac_pc, nsub_ksp, first_sub_ksp, PETSC_NULL_KSP, ierr); CHKERRQ(ierr)
>>>> allocate(sub_ksps(nsub_ksp))
>>>> call PCBJacobiGetSubKSP(BJac_pc, nsub_ksp, first_sub_ksp, sub_ksps, ierr); CHKERRQ(ierr)
>>>> do i = 1, nsub_ksp
>>>>   call KSPGetPC(sub_ksps(i), BJac_pc_sub, ierr); CHKERRQ(ierr)
>>>>   call PCFactorSetShiftType(BJac_pc_sub, MAT_SHIFT_INBLOCKS, ierr); CHKERRQ(ierr)
>>>>   call PCFactorSetZeroPivot(BJac_pc_sub, solver%linear_zero_pivot_tol, ierr); CHKERRQ(ierr)
>>>> end do
>>>> deallocate(sub_ksps)
>>>> nullify(sub_ksps)
>>>>
>>>>
>>>> Is using PCKSP a good idea at all? 
>>>>
>>>>
>>>> With KSPSetUp(solver%ksp) -> FGMRES
>>>>
>>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>> [0]PETSC ERROR: Object is in wrong state
>>>> [0]PETSC ERROR: You requested a vector from a KSP that cannot provide one
>>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.7.5-3167-g03c0fad  GIT Date: 2017-03-30 14:27:53 -0500
>>>> [0]PETSC ERROR: pflotran on a debug_g-6.2 named mother by pujjad Tue Jun 27 16:55:14 2017
>>>> [0]PETSC ERROR: Configure options --download-mpich=yes --download-hdf5=yes --download-fblaslapack=yes --download-metis=yes --download-parmetis=yes --download-eigen=yes --download-hypre=yes --download-superlu_dist=yes --download-superlu=yes --with-cc=gcc-6 --with-cxx=g++-6 --with-fc=gfortran-6 PETSC_ARCH=debug_g-6.2 PETSC_DIR=/home/pujjad/Repositories/petsc
>>>> [0]PETSC ERROR: #1 KSPCreateVecs() line 939 in /home/pujjad/Repositories/petsc/src/ksp/ksp/interface/iterativ.c
>>>> [0]PETSC ERROR: #2 KSPSetUp_GMRES() line 85 in /home/pujjad/Repositories/petsc/src/ksp/ksp/impls/gmres/gmres.c
>>>> [0]PETSC ERROR: #3 KSPSetUp_FGMRES() line 41 in /home/pujjad/Repositories/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c
>>>> [0]PETSC ERROR: #4 KSPSetUp() line 338 in /home/pujjad/Repositories/petsc/src/ksp/ksp/interface/itfunc.c
>>>> application called MPI_Abort(MPI_COMM_WORLD, 73) - process 0
>>>> [mpiexec@mother] handle_pmi_cmd (./pm/pmiserv/pmiserv_cb.c:52): Unrecognized PMI command: abort | cleaning up processes
>>>> [mpiexec@mother] control_cb (./pm/pmiserv/pmiserv_cb.c:289): unable to process PMI command
>>>> [mpiexec@mother] HYDT_dmxu_poll_wait_for_event (./tools/demux/demux_poll.c:77): callback returned error status
>>>> [mpiexec@mother] HYD_pmci_wait_for_completion (./pm/pmiserv/pmiserv_pmci.c:181): error waiting for event
>>>> [mpiexec@mother] main (./ui/mpich/mpiexec.c:405): process manager error waiting for completion
>>>>
>>>>
>>>>
>>>> With KSPSetUp(BJac_ksp) -> KSPPREONLY
>>>>
>>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>> [0]PETSC ERROR: Arguments are incompatible
>>>> [0]PETSC ERROR: Both n and N cannot be PETSC_DECIDE
>>>>   likely a call to VecSetSizes() or MatSetSizes() is wrong.
>>>> See http://www.mcs.anl.gov/petsc/documentation/faq.html#split
>>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.7.5-3167-g03c0fad  GIT Date: 2017-03-30 14:27:53 -0500
>>>> [0]PETSC ERROR: pflotran on a debug_g-6.2 named mother by pujjad Tue Jun 27 16:52:57 2017
>>>> [0]PETSC ERROR: Configure options --download-mpich=yes --download-hdf5=yes --download-fblaslapack=yes --download-metis=yes --download-parmetis=yes --download-eigen=yes --download-hypre=yes --download-superlu_dist=yes --download-superlu=yes --with-cc=gcc-6 --with-cxx=g++-6 --with-fc=gfortran-6 PETSC_ARCH=debug_g-6.2 PETSC_DIR=/home/pujjad/Repositories/petsc
>>>> [0]PETSC ERROR: #1 PetscSplitOwnership() line 77 in /home/pujjad/Repositories/petsc/src/sys/utils/psplit.c
>>>> [0]PETSC ERROR: #2 PetscLayoutSetUp() line 137 in /home/pujjad/Repositories/petsc/src/vec/is/utils/pmap.c
>>>> [0]PETSC ERROR: #3 VecCreate_Seq_Private() line 847 in /home/pujjad/Repositories/petsc/src/vec/vec/impls/seq/bvec2.c
>>>> [0]PETSC ERROR: #4 VecCreateSeqWithArray() line 899 in /home/pujjad/Repositories/petsc/src/vec/vec/impls/seq/bvec2.c
>>>> [0]PETSC ERROR: #5 PCSetUp_BJacobi_Singleblock() line 786 in /home/pujjad/Repositories/petsc/src/ksp/pc/impls/bjacobi/bjacobi.c
>>>> [0]PETSC ERROR: #6 PCSetUp_BJacobi() line 136 in /home/pujjad/Repositories/petsc/src/ksp/pc/impls/bjacobi/bjacobi.c
>>>> [0]PETSC ERROR: #7 PCSetUp() line 924 in /home/pujjad/Repositories/petsc/src/ksp/pc/interface/precon.c
>>>> [0]PETSC ERROR: #8 KSPSetUp() line 379 in /home/pujjad/Repositories/petsc/src/ksp/ksp/interface/itfunc.c
>>>> [mpiexec@mother] handle_pmi_cmd (./pm/pmiserv/pmiserv_cb.c:52): Unrecognized PMI command: abort | cleaning up processes
>>>> [mpiexec@mother] control_cb (./pm/pmiserv/pmiserv_cb.c:289): unable to process PMI command
>>>> [mpiexec@mother] HYDT_dmxu_poll_wait_for_event (./tools/demux/demux_poll.c:77): callback returned error status
>>>> [mpiexec@mother] HYD_pmci_wait_for_completion (./pm/pmiserv/pmiserv_pmci.c:181): error waiting for event
>>>> [mpiexec@mother] main (./ui/mpich/mpiexec.c:405): process manager error waiting for completion
>>>>
>>>>
>>>>
>>>> With PCSetUp(T2) -> PCKSP
>>>>
>>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>> [0]PETSC ERROR: Object is in wrong state
>>>> [0]PETSC ERROR: Matrix must be set first
>>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.7.5-3167-g03c0fad  GIT Date: 2017-03-30 14:27:53 -0500
>>>> [0]PETSC ERROR: pflotran on a debug_g-6.2 named mother by pujjad Tue Jun 27 16:51:23 2017
>>>> [0]PETSC ERROR: Configure options --download-mpich=yes --download-hdf5=yes --download-fblaslapack=yes --download-metis=yes --download-parmetis=yes --download-eigen=yes --download-hypre=yes --download-superlu_dist=yes --download-superlu=yes --with-cc=gcc-6 --with-cxx=g++-6 --with-fc=gfortran-6 PETSC_ARCH=debug_g-6.2 PETSC_DIR=/home/pujjad/Repositories/petsc
>>>> [0]PETSC ERROR: #1 PCSetUp() line 888 in /home/pujjad/Repositories/petsc/src/ksp/pc/interface/precon.c
>>>> application called MPI_Abort(MPI_COMM_WORLD, 73) - process 0
>>>> [mpiexec@mother] handle_pmi_cmd (./pm/pmiserv/pmiserv_cb.c:52): Unrecognized PMI command: abort | cleaning up processes
>>>> [mpiexec@mother] control_cb (./pm/pmiserv/pmiserv_cb.c:289): unable to process PMI command
>>>> [mpiexec@mother] HYDT_dmxu_poll_wait_for_event (./tools/demux/demux_poll.c:77): callback returned error status
>>>> [mpiexec@mother] HYD_pmci_wait_for_completion (./pm/pmiserv/pmiserv_pmci.c:181): error waiting for event
>>>> [mpiexec@mother] main (./ui/mpich/mpiexec.c:405): process manager error waiting for completion
>>>>
>>>>
>>>>
>>>> With PCSetUp(BJac_pc) -> PCBJACOBI
>>>>
>>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>> [0]PETSC ERROR: Object is in wrong state
>>>> [0]PETSC ERROR: Matrix must be set first
>>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.7.5-3167-g03c0fad  GIT Date: 2017-03-30 14:27:53 -0500
>>>> [0]PETSC ERROR: pflotran on a debug_g-6.2 named mother by pujjad Tue Jun 27 16:42:10 2017
>>>> [0]PETSC ERROR: Configure options --download-mpich=yes --download-hdf5=yes --download-fblaslapack=yes --download-metis=yes --download-parmetis=yes --download-eigen=yes --download-hypre=yes --download-superlu_dist=yes --download-superlu=yes --with-cc=gcc-6 --with-cxx=g++-6 --with-fc=gfortran-6 PETSC_ARCH=debug_g-6.2 PETSC_DIR=/home/pujjad/Repositories/petsc
>>>> [0]PETSC ERROR: #1 PCSetUp() line 888 in /home/pujjad/Repositories/petsc/src/ksp/pc/interface/precon.c
>>>> [mpiexec@mother] handle_pmi_cmd (./pm/pmiserv/pmiserv_cb.c:52): Unrecognized PMI command: abort | cleaning up processes
>>>> [mpiexec@mother] control_cb (./pm/pmiserv/pmiserv_cb.c:289): unable to process PMI command
>>>> [mpiexec@mother] HYDT_dmxu_poll_wait_for_event (./tools/demux/demux_poll.c:77): callback returned error status
>>>> [mpiexec@mother] HYD_pmci_wait_for_completion (./pm/pmiserv/pmiserv_pmci.c:181): error waiting for event
>>>> [mpiexec@mother] main (./ui/mpich/mpiexec.c:405): process manager error waiting for completion
>>>>
>>>>
>>>>
>>>> Grateful for any help!
>>>> Robert
>>>>
>>>>
>>>>
