Re: [petsc-users] Using PETSc GPU backend

2023-06-30 Thread Matthew Knepley
On Fri, Jun 30, 2023 at 1:13 AM Ng, Cho-Kuen via petsc-users <petsc-users@mcs.anl.gov> wrote: > Mark, > > The application code reads in parameters from an input file, where we can > put the PETSc runtime options. Then we pass the options to > PetscInitialize(...). Does that sound right? > PETSc…

Re: [petsc-users] Fortran alternative for DMDAGetElements?

2023-06-30 Thread Ngoc Mai Monica Huynh
Hi, I have no problem now in compiling, thank you for providing the Fortran interface. I have a follow-up question. When running the code, I get this error, which I'm pretty sure is related to DMDAGetElements(), since up to that line everything works fine. [0]PETSC ERROR: --…

Re: [petsc-users] Fortran alternative for DMDAGetElements?

2023-06-30 Thread Matthew Knepley
On Fri, Jun 30, 2023 at 6:47 AM Ngoc Mai Monica Huynh <ngocmaimonica.hu...@unipv.it> wrote: > Hi, > > I have no problem now in compiling, thank you for providing the Fortran > interface. > I have a follow-up question. > When running the code, I get this error, which I'm pretty sure is > related…

[petsc-users] PCMG with PCREDISTRIBUTE

2023-06-30 Thread Carl-Johan Thore via petsc-users
Hi, I'm trying to run an iterative solver (FGMRES, for example) with PCMG as preconditioner. The setup of PCMG is done roughly as in ex42 of the PETSc tutorials (https://petsc.org/main/src/ksp/ksp/tutorials/ex42.c.html). Since I have many locked degrees of freedom, I would like to use PCREDISTRIBUTE…
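The combination described here could be wired up along the following lines. This is only a sketch (the helper name and the assumption that A, b, and x are created and assembled elsewhere are mine); the key point is that the inner solver PCREDISTRIBUTE applies to the reduced system is configured through the -redistribute_ options prefix:

```c
#include <petsc.h>

/* ksp, A, b, x are assumed to be created and assembled elsewhere */
PetscErrorCode solve_redistributed(KSP ksp, Mat A, Vec b, Vec x)
{
  PC pc;

  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPFGMRES));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCREDISTRIBUTE));
  /* The inner solve on the reduced (locked-DOFs-removed) system picks up
     options with the redistribute_ prefix, e.g.
       -redistribute_ksp_type preonly -redistribute_pc_type mg */
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSolve(ksp, b, x));
  return PETSC_SUCCESS;
}
```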

Re: [petsc-users] Fortran alternative for DMDAGetElements?

2023-06-30 Thread Ngoc Mai Monica Huynh
Yes, it compiles and runs correctly. Monica > On 30 Jun 2023, at 12:50, Matthew Knepley wrote: > > On Fri, Jun 30, 2023 at 6:47 AM Ngoc Mai Monica Huynh > <ngocmaimonica.hu...@unipv.it> wrote: > Hi, > > I have no problem now in compiling, thank you for providing the Fortran > int…

Re: [petsc-users] Fortran alternative for DMDAGetElements?

2023-06-30 Thread Matthew Knepley
On Fri, Jun 30, 2023 at 8:38 AM Ngoc Mai Monica Huynh <ngocmaimonica.hu...@unipv.it> wrote: > Yes, it compiles and runs correctly > Okay, then we try to alter that example until it looks like your test. One thing is the #include at the top. Do you have that in your code? If Fortran does not find…

Re: [petsc-users] PCMG with PCREDISTRIBUTE

2023-06-30 Thread Barry Smith
ex42.c directly provides the interpolation/restriction needed to move between levels in the loop

for (k = 1; k < nlevels; k++) {
  PetscCall(DMCreateInterpolation(da_list[k - 1], da_list[k], &R, NULL));
  PetscCall(PCMGSetInterpolation(pc, k, R));
  PetscCall(MatDestroy(&R));
}

The…
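For context, a sketch of the surrounding PCMG setup this loop typically sits in. The helper name and the assumption of a coarse-to-fine DMDA hierarchy da_list[0..nlevels-1], built elsewhere, are mine:

```c
#include <petsc.h>

/* Attach interpolation between consecutive DMDA levels to PCMG.
   da_list[0] is the coarsest DM, da_list[nlevels-1] the finest. */
PetscErrorCode setup_mg(PC pc, DM *da_list, PetscInt nlevels)
{
  Mat R;

  PetscCall(PCSetType(pc, PCMG));
  PetscCall(PCMGSetLevels(pc, nlevels, NULL));
  for (PetscInt k = 1; k < nlevels; k++) {
    PetscCall(DMCreateInterpolation(da_list[k - 1], da_list[k], &R, NULL));
    /* PCMG uses the transpose of R for restriction unless told otherwise */
    PetscCall(PCMGSetInterpolation(pc, k, R));
    PetscCall(MatDestroy(&R));
  }
  return PETSC_SUCCESS;
}
```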

Re: [petsc-users] Fortran alternative for DMDAGetElements?

2023-06-30 Thread Ngoc Mai Monica Huynh
Yes, I have the #include at the top of the code. Thank you very much for your help. I’ll let you know if I have any improvements from my side. Looking forward to hearing from you. Thanks, Monica > On 30 Jun 2023, at 15:08, Matthew Knepley wrote: > > On Fri, Jun 30, 2023 at 8:38 AM Ngoc

Re: [petsc-users] Using PETSc GPU backend

2023-06-30 Thread Mark Adams
PetscCall(PetscInitialize(&argc, &argv, NULL, help)); gives us the args, and you run: a.out -mat_type aijcusparse -vec_type cuda -log_view -options_left Mark On Fri, Jun 30, 2023 at 6:16 AM Matthew Knepley wrote: > On Fri, Jun 30, 2023 at 1:13 AM Ng, Cho-Kuen via petsc-users > <petsc-users@mcs…
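A minimal main, as a sketch, showing how PetscCall(PetscInitialize(&argc, &argv, NULL, help)) forwards the command line to the options database; any object later created with a *SetFromOptions() call then honors flags like -vec_type cuda:

```c
#include <petsc.h>

static const char help[] = "Options-database sketch.\n";

int main(int argc, char **argv)
{
  Vec x;

  /* argc/argv are handed to PETSc here, so command-line flags such as
     -vec_type cuda -log_view -options_left land in the options database */
  PetscCall(PetscInitialize(&argc, &argv, NULL, help));

  PetscCall(VecCreate(PETSC_COMM_WORLD, &x));
  PetscCall(VecSetSizes(x, PETSC_DECIDE, 100));
  PetscCall(VecSetFromOptions(x)); /* honors -vec_type */
  PetscCall(VecSet(x, 1.0));

  PetscCall(VecDestroy(&x));
  PetscCall(PetscFinalize()); /* -options_left reporting happens here */
  return 0;
}
```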

Re: [petsc-users] Fortran alternative for DMDAGetElements?

2023-06-30 Thread Barry Smith
I glued your code fragment into a stand-alone program and it runs fine for me on 16 ranks. Does this simple program run for you?

program main
#include <petsc/finclude/petsc.h>
      use petsc
      implicit none
      integer ierr
      MPI_Comm comm
      DM da3d
      ISLocalToGlobalMapping map
      PetscInt…
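For comparison, the C calls that this Fortran interface wraps; a sketch (the helper name is mine), assuming a DMDA da that has already been created and set up:

```c
#include <petsc.h>

/* Walk the local elements of a DMDA, as DMDAGetElements exposes them */
PetscErrorCode list_elements(DM da)
{
  PetscInt        nel, nen; /* local element count, nodes per element */
  const PetscInt *e;        /* connectivity in local node numbering   */

  PetscCall(DMDAGetElements(da, &nel, &nen, &e));
  for (PetscInt i = 0; i < nel; i++) {
    /* e[i*nen .. i*nen + nen - 1] are the local vertex indices of element i */
  }
  PetscCall(DMDARestoreElements(da, &nel, &nen, &e));
  return PETSC_SUCCESS;
}
```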

Re: [petsc-users] PCMG with PCREDISTRIBUTE

2023-06-30 Thread Barry Smith
Oh, I forgot to mention: you should first check that PCMG works quite well for the full system (without PCREDISTRIBUTE); the convergence on the redistributed system (assuming you did all the work to get PCMG to work for you) should be very similar to (but not measurably better than) the…

Re: [petsc-users] Using PETSc GPU backend

2023-06-30 Thread Barry Smith
Note that options like -mat_type aijcusparse -vec_type cuda only work if the program is set up to allow runtime swapping of matrix and vector types. If you have a call to MatCreateMPIAIJ() or another specific type, then these options do nothing, but because Mark had you use -options_left t…
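A sketch of the distinction being made here (helper names and the preallocation numbers are placeholders of mine): the first creation path hardwires the matrix type, the second defers it to the options database:

```c
#include <petsc.h>

/* Hardwired: always an MPIAIJ host matrix; -mat_type aijcusparse is ignored.
   The preallocation counts (7, 3) are arbitrary placeholders. */
PetscErrorCode create_fixed(MPI_Comm comm, PetscInt n, Mat *A)
{
  PetscCall(MatCreateAIJ(comm, PETSC_DECIDE, PETSC_DECIDE, n, n, 7, NULL, 3, NULL, A));
  return PETSC_SUCCESS;
}

/* Swappable: type chosen at run time, so -mat_type aijcusparse takes effect */
PetscErrorCode create_swappable(MPI_Comm comm, PetscInt n, Mat *A)
{
  PetscCall(MatCreate(comm, A));
  PetscCall(MatSetSizes(*A, PETSC_DECIDE, PETSC_DECIDE, n, n));
  PetscCall(MatSetFromOptions(*A)); /* honors -mat_type */
  PetscCall(MatSetUp(*A));
  return PETSC_SUCCESS;
}
```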

Re: [petsc-users] PCMG with PCREDISTRIBUTE

2023-06-30 Thread Carl-Johan Thore via petsc-users
Thanks for the quick reply and the suggestions! " ... you should first check that the PCMG works quite well " Yes, the PCMG works very well for the full system. "I am guessing that your code is slightly different than ex42.c because you take the interpolation matrix provided by the DM and give

Re: [petsc-users] PCMG with PCREDISTRIBUTE

2023-06-30 Thread Matthew Knepley
On Fri, Jun 30, 2023 at 10:16 AM Carl-Johan Thore via petsc-users < petsc-users@mcs.anl.gov> wrote: > Thanks for the quick reply and the suggestions! > > > > “ … you should first check that the PCMG works quite well “ > > > > Yes, the PCMG works very well for the full system. > > > > “I am guessin

Re: [petsc-users] Fortran alternative for DMDAGetElements?

2023-06-30 Thread Ngoc Mai Monica Huynh
Hi, yes, it runs, and now my code does too. I moved it from the extension .F to .F90. (With my older codes the .F extension still works fine, but not with this one.) Thank you for the patience and support! Monica > On 30 Jun 2023, at 15:54, Barry Smith wrote: > > > I glued your code fragme…

Re: [petsc-users] Fortran alternative for DMDAGetElements?

2023-06-30 Thread Matthew Knepley
On Fri, Jun 30, 2023 at 10:48 AM Ngoc Mai Monica Huynh <ngocmaimonica.hu...@unipv.it> wrote: > Hi, > yes, it runs and now also my code. > I moved it from the extension .F to .F90. > (With my older codes the extension .F still works fine, but not with this > one) > Yes, you need .F90 to properly…

Re: [petsc-users] Using PETSc GPU backend

2023-06-30 Thread Ng, Cho-Kuen via petsc-users
Barry, Mark and Matt, Thank you all for the suggestions. I will modify the code so we can pass runtime options. Cho From: Barry Smith Sent: Friday, June 30, 2023 7:01 AM To: Mark Adams Cc: Matthew Knepley ; Ng, Cho-Kuen ; petsc-users@mcs.anl.gov Subject: Re:

Re: [petsc-users] Using PETSc GPU backend

2023-06-30 Thread Ng, Cho-Kuen via petsc-users
Paul, Thank you for your suggestion. I will try different spack install specifications. Cho From: Grosse-Bley, Paul Leonard Sent: Friday, June 30, 2023 4:07 AM To: Ng, Cho-Kuen Cc: petsc-users@mcs.anl.gov Subject: Re: [petsc-users] Using PETSc GPU backend Hi

Re: [petsc-users] PCMG with PCREDISTRIBUTE

2023-06-30 Thread Barry Smith
> On Jun 30, 2023, at 10:22 AM, Matthew Knepley wrote: > > On Fri, Jun 30, 2023 at 10:16 AM Carl-Johan Thore via petsc-users > <petsc-users@mcs.anl.gov> wrote: >> Thanks for the quick reply and the suggestions! >> >> “ … you should first check that the PCMG works quite well ”…

[petsc-users] Smaller assemble time with increasing processors

2023-06-30 Thread Runfeng Jin
Hello! When I use PETSc to build an SBAIJ matrix, I notice a strange thing: when I increase the number of processors, the assembly time becomes smaller. All runs use exactly the same matrix. The assembly time mainly arises from message passing, because I use a dynamic workload so it is random whic…

Re: [petsc-users] Smaller assemble time with increasing processors

2023-06-30 Thread Barry Smith
You cannot look just at the VecAssemblyEnd() time; that will very likely give the wrong impression of the total time it takes to put the values in. You need to register a new event and put a PetscLogEventBegin() just before you start generating the vector entries and calling VecSetValues() an…
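A sketch of the event registration being suggested, timing the whole generate/set/assemble sequence rather than only VecAssemblyEnd(); the event and class names, and the stand-in entry generation, are made up:

```c
#include <petsc.h>

/* Time entry generation + VecSetValues + assembly as one logged event */
PetscErrorCode timed_assembly(Vec v)
{
  PetscClassId  classid;
  PetscLogEvent fill_event;
  PetscInt      lo, hi;

  PetscCall(PetscClassIdRegister("UserAssembly", &classid));
  PetscCall(PetscLogEventRegister("VecFill", classid, &fill_event));

  PetscCall(PetscLogEventBegin(fill_event, 0, 0, 0, 0));
  PetscCall(VecGetOwnershipRange(v, &lo, &hi));
  for (PetscInt i = lo; i < hi; i++) {
    PetscScalar val = (PetscScalar)i; /* stand-in for the real entry generation */
    PetscCall(VecSetValues(v, 1, &i, &val, INSERT_VALUES));
  }
  PetscCall(VecAssemblyBegin(v));
  PetscCall(VecAssemblyEnd(v));
  PetscCall(PetscLogEventEnd(fill_event, 0, 0, 0, 0));
  return PETSC_SUCCESS;
}
```

With -log_view, the "VecFill" line then reports the full cost of putting the values in, not just the final communication.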

Re: [petsc-users] PCMG with PCREDISTRIBUTE

2023-06-30 Thread Carl-Johan Thore via petsc-users
“Possibly, but if you are doing FD, then there is built-in topology in DMDA that is not present in Plex, so finding the neighbors in the right order is harder (possible, but harder, we address this in some new work that is not yet merged). There is also structured adaptive support with DMForest,

Re: [petsc-users] PCMG with PCREDISTRIBUTE

2023-06-30 Thread Matthew Knepley
On Fri, Jun 30, 2023 at 12:08 PM Carl-Johan Thore wrote: > “Possibly, but if you are doing FD, then there is built-in topology in > DMDA that is not present in Plex, so > > finding the neighbors in the right order is harder (possible, but harder, > we address this in some new work that is not yet

[petsc-users] Inquiry about reading the P2 tetrahedron mesh from GMSH

2023-06-30 Thread neil liu
Dear PETSc developers, I am reading a P2 mesh from GMSH and used DMFieldGetClosure_Internal to check the coordinates for each tetrahedron; they seem reasonable. But when I tried DMGetCoordinates(dm, &global), it seems the vector global is not consistent with the node numbering. Then what is global he…
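One way to inspect per-cell coordinates without assuming the global coordinate Vec follows GMSH node numbering is to go through the coordinate DM and take the closure of each cell. This is a sketch under my reading of the Plex coordinate layout (helper name mine):

```c
#include <petsc.h>

/* Print how many coordinate dofs a given cell carries, gathered in
   Plex closure ordering rather than by global node number */
PetscErrorCode print_cell_coords(DM dm, PetscInt cell)
{
  DM           cdm;
  Vec          coords;
  PetscScalar *a = NULL;
  PetscInt     n;

  PetscCall(DMGetCoordinateDM(dm, &cdm));
  PetscCall(DMGetCoordinatesLocal(dm, &coords));
  PetscCall(DMPlexVecGetClosure(cdm, NULL, coords, cell, &n, &a));
  PetscCall(PetscPrintf(PETSC_COMM_SELF, "cell %" PetscInt_FMT ": %" PetscInt_FMT " coordinate dofs\n", cell, n));
  PetscCall(DMPlexVecRestoreClosure(cdm, NULL, coords, cell, &n, &a));
  return PETSC_SUCCESS;
}
```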

[petsc-users] Fwd: Smaller assemble time with increasing processors

2023-06-30 Thread Runfeng Jin
Hi, thanks for your reply. I tried using PetscLogEvent(), and the result shows the same conclusion. What I have done is:

PetscLogEvent Mat_assemble_event, Mat_setvalue_event, Mat_setAsse_event;
PetscClassId classid;
PetscLogDouble user_event_flops;
PetscClassIdR…