Hi all,

I am trying to accelerate the linear solver with the PETSc GPU backend. For
testing I have a simple 1D heat diffusion solver; here are some
observations.
1. If I use -pc_type gamg, it throws the following error:
 ** On entry to cusparseCreateCsr() parameter number 5 (csrRowOffsets) had
an illegal value: NULL pointer

[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[0]PETSC ERROR: GPU error
[0]PETSC ERROR: cuSPARSE errorcode 3 (CUSPARSE_STATUS_INVALID_VALUE) :
invalid value
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.19.4-959-g92f1e92e88
 GIT Date: 2023-08-13 19:43:04 +0000
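
For context, the GPU runs use command-line options along these lines (the
executable name, problem size, and the choice of CG here are illustrative,
not my exact invocation):

  ./heat1d -n 100000 -ksp_type cg -pc_type gamg \
      -mat_type aijcusparse -vec_type cuda -ksp_monitor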

2. With the default PC (ILU), the solve takes about 1.2 seconds on a single
CPU but about 105.9 seconds on the GPU. I see similar behavior with
-pc_type asm. The GPU is an NVIDIA RTX A2000 8GB (laptop).
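
In case it helps, the test is essentially the standard tridiagonal 1D
Laplacian solved with KSP. A stripped-down sketch of the setup (not my
exact code; the -n option name is illustrative) looks like:

/* Assemble the 1D diffusion matrix once, then let the options database
   pick the backend via -mat_type / -vec_type / -ksp_type / -pc_type. */
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat      A;
  Vec      x, b;
  KSP      ksp;
  PetscInt i, n = 1000, Istart, Iend;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(PetscOptionsGetInt(NULL, NULL, "-n", &n, NULL));

  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n));
  PetscCall(MatSetFromOptions(A));        /* honors -mat_type aijcusparse */
  PetscCall(MatSetUp(A));
  PetscCall(MatGetOwnershipRange(A, &Istart, &Iend));
  for (i = Istart; i < Iend; i++) {       /* standard 1D Laplacian stencil */
    if (i > 0)     PetscCall(MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES));
    if (i < n - 1) PetscCall(MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES));
    PetscCall(MatSetValue(A, i, i, 2.0, INSERT_VALUES));
  }
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  PetscCall(MatCreateVecs(A, &x, &b));    /* vectors inherit A's (GPU) type */
  PetscCall(VecSet(b, 1.0));

  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetFromOptions(ksp));      /* honors -ksp_type / -pc_type */
  PetscCall(KSPSolve(ksp, b, x));

  PetscCall(KSPDestroy(&ksp));
  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&b));
  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}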

3. What could I be missing? Also, are there any general guidelines for
getting better GPU performance with PETSc?

Regards,
Maruthi
