Re: [petsc-users] About PC hpddm

2024-05-09 Thread Ng, Cho-Kuen via petsc-users
Try: spack install slepc+hpddm ^petsc+hpddm Satish On Thu, 9 May 2024, Ng, Cho-Kuen via petsc-users wrote: > Pierre, > > petsc and slepc libraries are found in the spack directory, but > libhpddm_petsc is not. So it is not built during the spack installation …
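
Satish's suggested spec builds SLEPc with the hpddm variant against an hpddm-enabled PETSc, which is normally where libhpddm_petsc comes from. A minimal sketch of the install, assuming the standard spack package names (the load step is illustrative and not part of the original reply):

    spack install slepc+hpddm ^petsc+hpddm
    spack load slepc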

Re: [petsc-users] About PC hpddm

2024-05-09 Thread Ng, Cho-Kuen via petsc-users
From: Pierre Jolivet Sent: Wednesday, May 8, 2024 11:01 PM To: Ng, Cho-Kuen Cc: petsc-users@mcs.anl.gov Subject: Re: [petsc-users] About PC hpddm On 9 May 2024, at 6:31 AM, Ng, Cho-Kuen via petsc-users wrote: …

[petsc-users] About PC hpddm

2024-05-08 Thread Ng, Cho-Kuen via petsc-users
I used spack to install petsc with hpddm as follows. o spack install petsc+hpddm o spack install slepc ^petsc+hpddm I got the following runtime error about the library hpddm_petsc not being found. argv0 -ksp_converged_reason -ksp_view_final_residual -ksp_type gmres -pc_type hpddm -ksp_monitor -log_view …
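
Note that the second install line above lacks the +hpddm variant on slepc; the reply above (spack install slepc+hpddm ^petsc+hpddm) adds it. For reference, the run that triggers the error would look something like this (argv0 stands for the application executable, as in the message; the option list is the one quoted above):

    ./argv0 -ksp_type gmres -pc_type hpddm -ksp_monitor -ksp_converged_reason -ksp_view_final_residual -log_view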

Re: [petsc-users] Using PETSc GPU backend

2024-04-13 Thread Ng, Cho-Kuen via petsc-users
… High-frequency Helmholtz is very hard; low-frequency is doable by using a larger coarse grid (you have a tiny coarse grid). On Sat, Apr 13, 2024 at 3:04 PM Matthew Knepley <knep...@gmail.com> wrote: On Fri, Apr 12, 2024 at 8:19 PM Ng, Cho-Kuen via petsc-users <petsc-users@mcs.anl.gov> wrote: …

Re: [petsc-users] Using PETSc GPU backend

2024-04-12 Thread Ng, Cho-Kuen via petsc-users
Subject: Re: [petsc-users] Using PETSc GPU backend > Can petsc show the number of GPUs used? -device_view Best regards, Jacob Faibussowitsch (Jacob Fai - booss - oh - vitch) > On Aug 12, 2023, at 00:53, Ng, Cho-Kuen via petsc-users > <petsc-users@mcs.anl.gov> wrote: >
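
Jacob's answer to "Can petsc show the number of GPUs used?" is the -device_view option. A minimal illustration, assuming a Perlmutter-style launch (the srun line and process count are placeholders, not from the thread):

    srun -n 4 ./app -device_view
    # PETSc reports the device(s) it has initialized for the run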

Re: [petsc-users] Using PETSc GPU backend

2024-04-12 Thread Ng, Cho-Kuen via petsc-users
… Best regards, Jacob Faibussowitsch (Jacob Fai - booss - oh - vitch) > On Aug 12, 2023, at 00:53, Ng, Cho-Kuen via petsc-users > wrote: > > Barry, > > I tried again today on Perlmutter and running on multiple GPU nodes worked. > Likely, I had messed up something the other day …

Re: [petsc-users] Using PETSc GPU backend

2023-08-12 Thread Ng, Cho-Kuen via petsc-users
… Jacob Faibussowitsch (Jacob Fai - booss - oh - vitch) > On Aug 12, 2023, at 00:53, Ng, Cho-Kuen via petsc-users > wrote: > > Barry, > > I tried again today on Perlmutter and running on multiple GPU nodes worked. > Likely, I had messed up something the other day. Also, I was able …

Re: [petsc-users] Using PETSc GPU backend

2023-08-11 Thread Ng, Cho-Kuen via petsc-users
… use -options_left and the program will tell you at the end that it did not use the option, so you will know. On Jun 30, 2023, at 9:30 AM, Mark Adams <mfad...@lbl.gov> wrote: PetscCall(PetscInitialize(&argc, &argv, NULL, help)); gives us the args and you run: a.out -mat_type aijcusparse …
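
Mark's point is that once argc/argv reach PetscInitialize(), every PETSc object created with a *SetFromOptions() call can be redirected at runtime, e.g. onto the GPU with -mat_type aijcusparse -vec_type cuda. A self-contained sketch of what such a driver might look like (an illustrative example, not the application code discussed in the thread):

    #include <petscksp.h>

    int main(int argc, char **argv)
    {
      Mat      A;
      Vec      x, b;
      KSP      ksp;
      PetscInt i, n = 100, Istart, Iend;

      /* Handing argc/argv to PetscInitialize() is what lets options such as
         -mat_type aijcusparse -vec_type cuda -log_view -options_left take effect. */
      PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

      PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
      PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n));
      PetscCall(MatSetFromOptions(A));   /* honors -mat_type aijcusparse */
      PetscCall(MatSetUp(A));
      PetscCall(MatGetOwnershipRange(A, &Istart, &Iend));
      for (i = Istart; i < Iend; i++) {  /* simple 1D Laplacian */
        if (i > 0)     PetscCall(MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES));
        if (i < n - 1) PetscCall(MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES));
        PetscCall(MatSetValue(A, i, i, 2.0, INSERT_VALUES));
      }
      PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
      PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

      /* vectors take the matrix's vector type (CUDA vectors for aijcusparse) */
      PetscCall(MatCreateVecs(A, &x, &b));
      PetscCall(VecSet(b, 1.0));

      PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
      PetscCall(KSPSetOperators(ksp, A, A));
      PetscCall(KSPSetFromOptions(ksp)); /* honors -ksp_type, -pc_type, ... */
      PetscCall(KSPSolve(ksp, b, x));

      PetscCall(KSPDestroy(&ksp));
      PetscCall(VecDestroy(&x));
      PetscCall(VecDestroy(&b));
      PetscCall(MatDestroy(&A));
      PetscCall(PetscFinalize());        /* -log_view output is printed here */
      return 0;
    }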

Re: [petsc-users] Using PETSc GPU backend

2023-08-09 Thread Ng, Cho-Kuen via petsc-users
… at 9:30 AM, Mark Adams <mfad...@lbl.gov> wrote: PetscCall(PetscInitialize(&argc, &argv, NULL, help)); gives us the args and you run: a.out -mat_type aijcusparse -vec_type cuda -log_view -options_left Mark On Fri, Jun 30, 2023 at 6:16 AM Matthew Knepley <knep...@gmail.com> wrote: …

Re: [petsc-users] Using PETSc GPU backend

2023-07-16 Thread Ng, Cho-Kuen via petsc-users
… tell you at the end that it did not use the option, so you will know. On Jun 30, 2023, at 9:30 AM, Mark Adams <mfad...@lbl.gov> wrote: PetscCall(PetscInitialize(&argc, &argv, NULL, help)); gives us the args and you run: a.out -mat_type aijcusparse -vec_type cuda -log_view -options_left Mark

Re: [petsc-users] Using PETSc GPU backend

2023-07-14 Thread Ng, Cho-Kuen via petsc-users
… and you run: a.out -mat_type aijcusparse -vec_type cuda -log_view -options_left Mark On Fri, Jun 30, 2023 at 6:16 AM Matthew Knepley <knep...@gmail.com> wrote: On Fri, Jun 30, 2023 at 1:13 AM Ng, Cho-Kuen via petsc-users <petsc-users@mcs.anl.gov> wrote: Mark, The application …

Re: [petsc-users] Using PETSc GPU backend

2023-07-14 Thread Ng, Cho-Kuen via petsc-users
Mark On Fri, Jun 30, 2023 at 6:16 AM Matthew Knepley <knep...@gmail.com> wrote: On Fri, Jun 30, 2023 at 1:13 AM Ng, Cho-Kuen via petsc-users <petsc-users@mcs.anl.gov> wrote: Mark, The application code reads in parameters from an input file, where we can put the PETSc runtime options …
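
The snippets here do not show how the application's input file hands those options to PETSc; two standard PETSc mechanisms that fit this kind of setup are an options file or the PETSC_OPTIONS environment variable. A hedged sketch (file name, executable name, and option values are illustrative):

    # petsc_options.txt (hypothetical file) containing one option per line:
    #   -mat_type aijcusparse
    #   -vec_type cuda
    #   -log_view
    #   -options_left
    ./app -options_file petsc_options.txt

    # or, equivalently, through the environment:
    export PETSC_OPTIONS="-mat_type aijcusparse -vec_type cuda -log_view -options_left"
    ./app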

Re: [petsc-users] Using PETSc GPU backend

2023-06-30 Thread Ng, Cho-Kuen via petsc-users
Paul, Thank you for your suggestion. I will try different spack install specifications. Cho From: Grosse-Bley, Paul Leonard Sent: Friday, June 30, 2023 4:07 AM To: Ng, Cho-Kuen Cc: petsc-users@mcs.anl.gov Subject: Re: [petsc-users] Using PETSc GPU backend Hi …

Re: [petsc-users] Using PETSc GPU backend

2023-06-30 Thread Ng, Cho-Kuen via petsc-users
… -log_view -options_left Mark On Fri, Jun 30, 2023 at 6:16 AM Matthew Knepley <knep...@gmail.com> wrote: On Fri, Jun 30, 2023 at 1:13 AM Ng, Cho-Kuen via petsc-users <petsc-users@mcs.anl.gov> wrote: Mark, The application code reads in parameters from an input file, where we can put the PETSc runtime options …

Re: [petsc-users] Using PETSc GPU backend

2023-06-29 Thread Ng, Cho-Kuen via petsc-users
… of the performance data (from -log_view) will be the percent flops on the GPU. Check that that is > 0. The end of the output will list the options that were used and options that were _not_ used (if any). Check that there are no options left. Mark On Thu, Jun 29, 2023 at 7:50 PM Ng, Cho-Kuen via petsc-users …
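
In practical terms, Mark's check amounts to running with the flags quoted elsewhere in the thread and then inspecting two things in the output (the executable name below is a placeholder):

    ./app -mat_type aijcusparse -vec_type cuda -log_view -options_left
    # 1) in the -log_view event table, the GPU flop percentage should be > 0
    #    for the solve events
    # 2) the options report at the end should show no unused ("left") options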

Re: [petsc-users] Using PETSc GPU backend

2023-06-29 Thread Ng, Cho-Kuen via petsc-users
… Check that that is > 0. The end of the output will list the options that were used and options that were _not_ used (if any). Check that there are no options left. Mark On Thu, Jun 29, 2023 at 7:50 PM Ng, Cho-Kuen via petsc-users <petsc-users@mcs.anl.gov> wrote: I installed PETSc on Perlmutter …

[petsc-users] Using PETSc GPU backend

2023-06-29 Thread Ng, Cho-Kuen via petsc-users
I installed PETSc on Perlmutter using "spack install petsc+cuda+zoltan" and used it by "spack load petsc/fwge6pf". Then I compiled the application code (purely CPU code) linking to the petsc package, hoping that I can get a performance improvement using the petsc GPU backend. However, the timing …
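
Putting the install steps here together with the runtime flags suggested in the replies above, the workflow would look roughly like the following (the spack hash is the one quoted in the message; the launch line is illustrative, and a CPU-only code still needs the runtime type options for the solve to land on the GPU):

    spack install petsc+cuda+zoltan
    spack load petsc/fwge6pf
    # relink the application against this petsc, then select the GPU backend at runtime:
    ./app -mat_type aijcusparse -vec_type cuda -log_view -options_left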