I hope this is the correct way to reply to the "Having trouble with basic 
Fortran-PETSc interoperability" thread I started earlier. If not, please 
correct me.

Thank you everyone for your replies; they were very helpful.

I now have what appears to be a working example using the suggested updates, 
specifically VecGetArray() and VecRestoreArray(), with the sequential vector 
retrieved from PETSc using the information from the FAQ.  <<Example code 
attached>>

I have a question about this example to make sure I am writing reasonably 
efficient code. It seems I have to create an additional PETSc vector, 
'out_seq', which is essentially a copy of the PETSc vector 'v1p'; this is not 
the most efficient use of memory. It also seems there is no way around this 
extra 'out_seq' vector, because there needs to be a place to aggregate the 
data from the various processes. Is this a reasonable use of PETSc, or is 
there a more efficient way? Note that I am trying to interface my existing 
code base with PETSc in order to use the solvers, and this may simply be the 
performance trade-off for not developing my program fully within the PETSc 
ecosystem.
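
For reference, the aggregation step in question, taken from the attached 
example (VecScatterCreateToAll is what creates the extra sequential copy):

    !Gather the distributed vector into a sequential copy on every process
    PetscCallA(VecScatterCreateToAll(v1p,ctx,out_seq,ierr))
    PetscCallA(VecScatterBegin(ctx,v1p,out_seq,INSERT_VALUES,SCATTER_FORWARD,ierr))
    PetscCallA(VecScatterEnd(ctx,v1p,out_seq,INSERT_VALUES,SCATTER_FORWARD,ierr))
    PetscCallA(VecScatterDestroy(ctx,ierr))

I have also seen VecScatterCreateToZero mentioned, which I believe would at 
least confine the extra copy to rank 0 when only one process needs the full 
vector, but I have not tested that.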

I have another question that is only tangentially related to this topic. 
Should I ask it in this thread or create a new topic?

Michael
________________________________
From: petsc-users <[email protected]> on behalf of 
[email protected] <[email protected]>
Sent: Saturday, January 3, 2026 1:00 PM
To: [email protected] <[email protected]>
Subject: petsc-users Digest, Vol 205, Issue 2


Send petsc-users mailing list submissions to
        [email protected]

To subscribe or unsubscribe via the World Wide Web, visit
        https://lists.mcs.anl.gov/mailman/listinfo/petsc-users
or, via email, send a message with subject or body 'help' to
        [email protected]

You can reach the person managing the list at
        [email protected]

When replying, please edit your Subject line so it is more specific
than "Re: Contents of petsc-users digest..."


Today's Topics:

   1. Re:  Having trouble with basic Fortran-PETSc interoperability
      (Barry Smith)
   2. Re:  Having trouble with basic Fortran-PETSc interoperability
      (Matthew Knepley)


----------------------------------------------------------------------

Message: 1
Date: Fri, 2 Jan 2026 16:33:27 -0500
From: Barry Smith <[email protected]>
To: Michael Whitten <[email protected]>
Cc: "[email protected]" <[email protected]>
Subject: Re: [petsc-users] Having trouble with basic Fortran-PETSc
        interoperability
Message-ID: <[email protected]>
Content-Type: text/plain; charset="utf-8"


  VecGetValues() uses 0 based indexing in both Fortran and C.

   You usually don't want to use VecGetValues() and VecSetValues(), since they 
result in two copies of the arrays and copy entire arrays back and forth.

  You can avoid copying between PETSc vectors and your arrays by using 
VecGetArray(), VecGetArrayWrite(), and VecGetArrayRead(). You can also use 
VecCreateMPIWithArray() to create a PETSc vector around your own array, for 
example as input for the right-hand side of a KSP. These arrays start their 
indexing at the Fortran default of 1.
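
  For example, something along these lines (an untested sketch; 'nlocal' and 
'myarr' are placeholders for your local length and your existing Fortran 
array):

      PetscInt bs
      Vec b
      bs = 1
      ! Wrap the existing Fortran array 'myarr' (nlocal entries on this rank)
      ! in a parallel PETSc vector without copying the data
      PetscCallA(VecCreateMPIWithArray(PETSC_COMM_WORLD,bs,nlocal,PETSC_DECIDE,myarr,b,ierr))
      ! ... use b, e.g. as the right-hand side passed to KSPSolve() ...
      PetscCallA(VecDestroy(b,ierr))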


  Barry



> On Jan 2, 2026, at 2:42 PM, Michael Whitten via petsc-users 
> <[email protected]> wrote:
>
> Hi PETSc mailing list users,
>
> I have managed to install PETSc and run some PETSc examples and little test 
> codes of my own in Fortran. I am now trying to make PETSc work with my 
> existing Fortran code. I have tried to build little test examples of the 
> functionality that I can then incorporate into my larger code base. However, 
> I am having trouble just passing vectors back and forth between PETSc and 
> Fortran.
>
> I have attached a minimum semi-working example that can be compiled with the 
> standard 'Makefile.user'. It throws an error when I try to copy the PETSc 
> vector back to a Fortran vector using VecGetValues(). I get that it can only 
> access values of the array on the local process but how do I fix this? Is 
> this even the right approach?
>
> In the final implementation I want to be able to assemble my matrix and 
> vector, convert them to PETSc data structures, use PETSc to solve, and then 
> convert the solution vector back to Fortran and return. I want to be able to 
> do this with both the linear and nonlinear solvers. It seems like this is 
> what PETSc is, in part, built to do. Is this a reasonable expectation to 
> achieve? Is this a reasonable use case for PETSc?
>
> Thanks in advance for any help you can offer.
>
> best,
> Michael
> <test.F90>


------------------------------

Message: 2
Date: Fri, 2 Jan 2026 17:49:30 -0500
From: Matthew Knepley <[email protected]>
To: Michael Whitten <[email protected]>
Cc: "[email protected]" <[email protected]>
Subject: Re: [petsc-users] Having trouble with basic Fortran-PETSc
        interoperability
Message-ID:
        <CAMYG4GmKRM1=cok6jwbo6itwvpkksepeoh1q1cllkgr2qhk...@mail.gmail.com>
Content-Type: text/plain; charset="utf-8"

On Fri, Jan 2, 2026 at 2:48 PM Michael Whitten via petsc-users <
[email protected]> wrote:

> Hi PETSc mailing list users,
>
> I have managed to install PETSc and run some PETSc examples and little
> test codes of my own in Fortran. I am now trying to make PETSc work with my
> existing Fortran code. I have tried to build little test examples of the
> functionality that I can then incorporate into my larger code base.
> However, I am having trouble just passing vectors back and forth between
> PETSc and Fortran.
>
> I have attached a minimum semi-working example that can be compiled with
> the standard 'Makefile.user'. It throws an error when I try to copy the
> PETSc vector back to a Fortran vector using VecGetValues(). I get that it
> can only access values of the array on the local process but how do I fix
> this? Is this even the right approach?
>

No. You should just call VecGetArray() to get back an F90 pointer to the
values. This is much more convenient.
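
For example (a quick sketch, assuming a Vec 'x' you already have):

  PetscScalar, pointer :: xptr(:)
  PetscCallA(VecGetArray(x,xptr,ierr))
  ! xptr now aliases the local values of x; read or write them in place
  xptr = 2.0*xptr
  PetscCallA(VecRestoreArray(x,xptr,ierr))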


> In the final implementation I want to be able to assemble my matrix and
> vector, convert them to PETSc data structures, use PETSc to solve, and then
> convert the solution vector back to Fortran and return. I want to be able
> to do this with both the linear and nonlinear solvers. It seems like this
> is what PETSc is, in part, built to do. Is this a reasonable expectation to
> achieve? Is this a reasonable use case for PETSc?
>

Yes, that should work when you get the array directly.

  Thanks,

     Matt


> Thanks in advance for any help you can offer.
>
> best,
> Michael
>


--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

------------------------------

Subject: Digest Footer

_______________________________________________
petsc-users mailing list
[email protected]
https://lists.mcs.anl.gov/mailman/listinfo/petsc-users


------------------------------

End of petsc-users Digest, Vol 205, Issue 2
*******************************************
program main

#include <petsc/finclude/petscvec.h>
    use petscvec
    implicit none

    PetscErrorCode ierr
    PetscInt i
    Vec v1p
    PetscScalar, pointer :: v1ptr(:), v2ptr(:)

    Vec out_seq
    VecScatter ctx

    integer :: n
    real(8), allocatable, target :: v1(:), v2(:)

    n = 10

    allocate(v1(n),v2(n),source=0.0d0)

    v1 = 1.0d0

    PetscCallA(PetscInitialize(ierr))

    !Create PETSc vector
    PetscCallA(VecCreate(PETSC_COMM_WORLD,v1p,ierr))
    PetscCallA(VecSetSizes(v1p,PETSC_DECIDE,n,ierr))
    PetscCallA(VecSetFromOptions(v1p,ierr))

    !Copy the Fortran array into the PETSc vector's storage via VecGetArray;
    !this avoids the extra buffering of VecSetValues, though it still
    !performs one copy
    PetscCallA(VecGetArray(v1p,v1ptr,ierr))
    v1ptr = v1
    PetscCallA(VecRestoreArray(v1p,v1ptr,ierr))

    !Finish assembling the vector (only required after VecSetValues();
    !it is a no-op here but harmless)
    PetscCallA(VecAssemblyBegin(v1p,ierr))
    PetscCallA(VecAssemblyEnd(v1p,ierr))


    !Call PETSc operations, e.g. solver, etc.


    !Print PETSc vector, debugging only
    PetscCallA(VecView(v1p,PETSC_VIEWER_STDOUT_WORLD,ierr))

    !Create sequential PETSc vector from parallel processes
    PetscCallA(VecScatterCreateToAll(v1p,ctx,out_seq,ierr))
    PetscCallA(VecScatterBegin(ctx,v1p,out_seq,INSERT_VALUES,SCATTER_FORWARD,ierr))
    PetscCallA(VecScatterEnd(ctx,v1p,out_seq,INSERT_VALUES,SCATTER_FORWARD,ierr))
    PetscCallA(VecScatterDestroy(ctx,ierr))

    !Get sequential vector back from PETSc.
    !Using a separate array v2 and pointer v2ptr
    !here to ensure data is being moved from the
    !PETSc vector to the Fortran array
    PetscCallA(VecGetArray(out_seq,v2ptr,ierr))
    v2 = v2ptr
    PetscCallA(VecRestoreArray(out_seq,v2ptr,ierr))

    !Print Fortran vector
    print *, "v2: ", v2

    !PETSc clean up; destroy the vectors to avoid leak warnings at finalize
    PetscCallA(VecDestroy(v1p,ierr))
    PetscCallA(VecDestroy(out_seq,ierr))
    PetscCallA(PetscFinalize(ierr))


end program main
