By default, if SNESSetJacobian() is not called with a function pointer, PETSc 
attempts to compute the Jacobian matrix explicitly with finite differences and 
coloring. That makes no sense for a shell matrix, hence the error message 
below regarding MatFDColoringCreate().

  DMSNESSetJacobianLocal() calls SNESSetJacobian() with the function pointer 
SNESComputeJacobian_DMLocal(), which is why the error no longer triggers in your 
code.

   You can instead provide your own function to SNESSetJacobian() and thus avoid 
calling DMSNESSetJacobianLocal(). What that function does depends on how you want 
to record the "base" vector, that is, the u at which your matrix-free multiply 
routine applies the Jacobian matrix-vector product J(u)*x. This u is passed to 
the function provided to SNESSetJacobian().

   Barry


> On Jan 10, 2024, at 6:20 AM, Yi Hu <y...@mpie.de> wrote:
> 
> Thanks for the clarification. The global-to-local process is much clearer to 
> me now after checking the examples, e.g. ksp/ksp/tutorials/ex14f.F90.
>  
> And for using Vec locally, I followed your advice of VecGet.. and VecRestore… 
> In fact I used DMDAVecGetArrayReadF90() and some other relevant subroutines. 
>  
> Regarding your comment on DMSNESSetJacobianLocal(): it seems that I need to 
> use both SNESSetJacobian() and then DMSNESSetJacobianLocal() to get things 
> working. When I call only SNESSetJacobian(), it does not work, i.e. the 
> following fails:
>  
> …… 
>   call DMDASNESSetFunctionLocal(DM_mech,INSERT_VALUES,formResidual,PETSC_NULL_SNES,err_PETSc)
>   CHKERRQ(err_PETSc)
>   call MatCreateShell(PETSC_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE,&
>                       9*product(cells(1:2))*cells3,9*product(cells(1:2))*cells3,&
>                       0,Jac_PETSc,err_PETSc)
>   CHKERRQ(err_PETSc)
>   call MatShellSetOperation(Jac_PETSc,MATOP_MULT,GK_op,err_PETSc)
>   CHKERRQ(err_PETSc)
>   call SNESSetJacobian(SNES_mech,Jac_PETSc,Jac_PETSc,PETSC_NULL_FUNCTION,0,err_PETSc)
>   CHKERRQ(err_PETSc)
>   !call DMSNESSetJacobianLocal(DM_mech,formJacobian,PETSC_NULL_SNES,err_PETSc)
>   !CHKERRQ(err_PETSc)
>   call SNESSetConvergenceTest(SNES_mech,converged,PETSC_NULL_SNES,PETSC_NULL_FUNCTION,err_PETSc)
>   CHKERRQ(err_PETSc)
>   call SNESSetDM(SNES_mech,DM_mech,err_PETSc)
>   CHKERRQ(err_PETSc)
> ……
>  
> It gives me the message:
>  
> [0]PETSC ERROR: No support for this operation for this object type
> [0]PETSC ERROR: Code not yet written for matrix type shell
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.16.4, Feb 02, 2022
> [0]PETSC ERROR: Configure options PETSC_ARCH=linux-gnu --with-fortran-bindings --with-mpi-f90module-visibility=0 --download-fftw --download-hdf5 --download-hdf5-fortran-bindings --download-fblaslapack --download-ml --download-zlib
> [0]PETSC ERROR: #1 MatFDColoringCreate() at /home/yi/app/petsc-3.16.4/src/mat/matfd/fdmatrix.c:471
> [0]PETSC ERROR: #2 SNESComputeJacobian_DMDA() at /home/yi/app/petsc-3.16.4/src/snes/utils/dmdasnes.c:173
> [0]PETSC ERROR: #3 SNESComputeJacobian() at /home/yi/app/petsc-3.16.4/src/snes/interface/snes.c:2864
> [0]PETSC ERROR: #4 SNESSolve_NEWTONLS() at /home/yi/app/petsc-3.16.4/src/snes/impls/ls/ls.c:222
> [0]PETSC ERROR: #5 SNESSolve() at /home/yi/app/petsc-3.16.4/src/snes/interface/snes.c:4809
> [0]PETSC ERROR: #6 User provided function() at User file:0
> [0]PETSC ERROR: #7 VecSetErrorIfLocked() at /home/yi/app/petsc-3.16.4/include/petscvec.h:623
> [0]PETSC ERROR: #8 VecGetArray() at /home/yi/app/petsc-3.16.4/src/vec/vec/interface/rvector.c:1769
> [0]PETSC ERROR: #9 User provided function() at User file:0
> [0]PETSC ERROR: #10 MatFDColoringCreate() at /home/yi/app/petsc-3.16.4/src/mat/matfd/fdmatrix.c:471
> [0]PETSC ERROR: #11 SNESComputeJacobian_DMDA() at /home/yi/app/petsc-3.16.4/src/snes/utils/dmdasnes.c:173
> [0]PETSC ERROR: #12 SNESComputeJacobian() at /home/yi/app/petsc-3.16.4/src/snes/interface/snes.c:2864
> [0]PETSC ERROR: #13 SNESSolve_NEWTONLS() at /home/yi/app/petsc-3.16.4/src/snes/impls/ls/ls.c:222
> [0]PETSC ERROR: #14 SNESSolve() at /home/yi/app/petsc-3.16.4/src/snes/interface/snes.c:4809
>  
> It seems that I have to use DMSNESSetJacobianLocal() to "activate" the use 
> of my shell matrix, even though the formJacobian() passed to 
> DMSNESSetJacobianLocal() does nothing.
>  
> Best wishes,
> Yi
>  
>  
>  
> From: Barry Smith <bsm...@petsc.dev> 
> Sent: Tuesday, January 9, 2024 4:49 PM
> To: Yi Hu <y...@mpie.de>
> Cc: petsc-users <petsc-users@mcs.anl.gov>
> Subject: Re: [petsc-users] SNES seems not use my matrix-free operation
>  
> However, my GK_op (which overrides MATOP_MULT) gives me some problems. 
> It is entered but then crashes with a segmentation violation error, so I 
> guess my indices may be wrong. I wonder whether I need to use the local Vec 
> (of dF), and whether my output Vec also has to be in the correct shape (i.e. 
> after the calculation I need to transform back into a Vec)? As you can see 
> here, my dF is a tensor defined on every grid point.
>  
>  
>    The input for the matrix-vector product is a global vector, as is the 
> result (unlike the arguments to DMSNESSetJacobianLocal()).
>  
>     This means that your MATOP_MULT function needs to do the 
> DMGlobalToLocal() vector operation first, then the "unwrapping" from the 
> vector to the array format, at the beginning of the routine. Similarly, it 
> needs to "unwrap" the result vector as an array. See 
> src/snes/tutorials/ex14f.F90, in particular the code block
>  
>       PetscCall(DMGlobalToLocalBegin(da,X,INSERT_VALUES,localX,ierr))
>       PetscCall(DMGlobalToLocalEnd(da,X,INSERT_VALUES,localX,ierr))
>  
> !  Get pointers to vector data
>  
>       PetscCall(VecGetArrayReadF90(localX,xx,ierr))
>       PetscCall(VecGetArrayF90(F,ff,ierr))
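>  
> At the end of the multiply routine the arrays are returned with the matching 
> restore calls (a sketch using the same names as the snippet above):
>  
>       PetscCall(VecRestoreArrayReadF90(localX,xx,ierr))
>       PetscCall(VecRestoreArrayF90(F,ff,ierr))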
>  
>   Barry
>  
> You really shouldn't be using DMSNESSetJacobianLocal() for your code. 
> Basically, all DMSNESSetJacobianLocal() gives you is automatic handling of 
> the global-to-local mapping and the unwrapping of the vector to an array, 
> and that does not work for shell matrices.
>  
> 
> 
> On Jan 9, 2024, at 6:30 AM, Yi Hu <y...@mpie.de> wrote:
>  
> Dear Barry,
>  
> Thanks for your help. 
>  
> It works when I first call SNESSetJacobian() with my created shell matrix Jac 
> in the main program (or module) and then DMSNESSetJacobianLocal() to associate 
> my DM with a dummy formJacobian callback (which does nothing). My SNES now 
> recognizes my shell matrix and performs my customized operation.
>  
> However, my GK_op (which overrides MATOP_MULT) gives me some problems. 
> It is entered but then crashes with a segmentation violation error, so I 
> guess my indices may be wrong. I wonder whether I need to use the local Vec 
> (of dF), and whether my output Vec also has to be in the correct shape (i.e. 
> after the calculation I need to transform back into a Vec)? As you can see 
> here, my dF is a tensor defined on every grid point.
>  
> Best wishes,
> Yi
>  
> From: Barry Smith <bsm...@petsc.dev> 
> Sent: Monday, January 8, 2024 6:41 PM
> To: Yi Hu <y...@mpie.de>
> Cc: petsc-users@mcs.anl.gov
> Subject: Re: [petsc-users] SNES seems not use my matrix-free operation
>  
>  
>    "formJacobian" should not be __creating__ the matrices. Here "form" means 
> computing the numerical values in the matrix (or when using a shell matrix it 
> means keeping a copy of X so that your custom matrix-free multiply knows the 
> base location where the matrix free Jacobian-vector products are computed.)
>  
>    You create the shell matrices up in your main program and pass them in 
> with SNESSetJacobian(). 
>  
>     Try first calling SNESSetJacobian() to provide the matrices (provide a 
> dummy function argument) and then call DMSNESSetJacobianLocal() to provide 
> your "formjacobian"  function (that does not create the matrices).
>  
>    Barry
>  
>  
>    Yes, "form" is a bad word that should not have been used in our code.
>  
>  
> 
> 
> 
> On Jan 8, 2024, at 12:24 PM, Yi Hu <y...@mpie.de> wrote:
>  
> Dear PETSc Experts,
>  
> I am implementing a matrix-free Jacobian for my SNES solver in Fortran 
> (command line options -snes_type newtonls -ksp_type gmres).
>  
> In the main program, I define my residual and matrix-free Jacobian like 
> the following:
>  
> …
> call DMDASNESSetFunctionLocal(DM_mech, INSERT_VALUES, formResidual, PETSC_NULL_SNES, err_PETSc)
> call DMSNESSetJacobianLocal(DM_mech, formJacobian, PETSC_NULL_SNES, err_PETSc)
> …
>  
> subroutine formJacobian(residual_subdomain,F,Jac_pre,Jac,dummy,err_PETSc)
>  
> #include <petsc/finclude/petscmat.h>
>   use petscmat
>   implicit none
>   DMDALocalInfo, dimension(DMDA_LOCAL_INFO_SIZE) :: &
>     residual_subdomain  !< DMDA info (needs to be named "in" for macros like XRANGE to work)
>   real(pREAL), dimension(3,3,cells(1),cells(2),cells3), intent(in) :: &
>     F                   !< deformation gradient field
>   Mat                                  :: Jac, Jac_pre
>   PetscObject                          :: dummy
>   PetscErrorCode                       :: err_PETSc
>   PetscInt                             :: N_dof ! global number of DoF, maybe only a placeholder
>  
>   N_dof = 9*product(cells(1:2))*cells3
>  
>   print*, 'in my jac'
>  
>   call MatCreateShell(PETSC_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE,N_dof,N_dof,0,Jac,err_PETSc)
>   CHKERRQ(err_PETSc)
>   call MatShellSetOperation(Jac,MATOP_MULT,GK_op,err_PETSc)
>   CHKERRQ(err_PETSc)
>  
>   print*, 'in my jac'
>  
>   ! for jac preconditioner
>   call MatCreateShell(PETSC_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE,N_dof,N_dof,0,Jac_pre,err_PETSc)
>   CHKERRQ(err_PETSc)
>   call MatShellSetOperation(Jac_pre,MATOP_MULT,GK_op,err_PETSc)
>   CHKERRQ(err_PETSc)
>  
>   print*, 'in my jac'
>  
> end subroutine formJacobian
>  
> subroutine GK_op(Jac,dF,output,err_PETSc)
>   real(pREAL), dimension(3,3,cells(1),cells(2),cells3), intent(in) :: &
>     dF
>   real(pREAL), dimension(3,3,cells(1),cells(2),cells3), intent(out) :: &
>     output
>   real(pREAL), dimension(3,3) :: &
>     deltaF_aim = 0.0_pREAL
>  
>   Mat                                  :: Jac
>   PetscErrorCode                       :: err_PETSc
>  
>   integer :: i, j, k, e
>  
>   … a lot of calculations …
>  
>   print*, 'in GK op'
>  
> end subroutine GK_op
>  
> The first question is that it seems I still need to explicitly define the 
> interfaces of MatCreateShell() and MatShellSetOperation() to properly use 
> them, even though I include them via "use petscmat". This is a little 
> strange to me, since some examples do not perform this step.
>  
> The main issue is that I can build my own Jacobian from my callback 
> function formJacobian, and I can confirm my Jacobian is a shell matrix (by 
> MatView). However, my customized operator GK_op is not called when solving 
> the nonlinear system (it does not print my "in GK op"). When I monitor my 
> SNES, it gives me conventional output that does not mention my matrix-free 
> operations. So I guess my customized MATOP_MULT may not be associated with 
> the Jacobian, or my configuration is somehow wrong. Could you help me solve 
> this issue?
>  
> Thanks,
> Yi
>  
