Re: [petsc-users] Cannot eagerly initialize cuda, as doing so results in cuda error 35 (cudaErrorInsufficientDriver) : CUDA driver version is insufficient for CUDA runtime version

2022-01-19 Thread Jed Brown
Fande Kong writes:
> On Wed, Jan 19, 2022 at 11:39 AM Jacob Faibussowitsch wrote:
>> Are you running on login nodes or compute nodes (I can’t seem to tell from the configure.log)?
> I was compiling codes on login nodes, and running codes on compute nodes. Login nodes do not have

[petsc-users] Does mpiaijkok intend to support 64-bit integers?

2022-01-19 Thread Fande Kong
Hi All, It seems that mpiaijkok does not support 64-bit integers at this time. Do we have any motivation for this? Or is it just a bug? Thanks, Fande
petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx(306): error: a value of type "MatColumnIndexType *" cannot be assigned to an entity of
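One plausible reading of that compiler message, sketched here with hypothetical type definitions rather than PETSc's actual ones: with --with-64-bit-indices, PetscInt becomes 64-bit while a backend column-index type may remain 32-bit, so the two pointer types are incompatible and the index arrays must be converted rather than aliased.

```c
/* Hypothetical sketch, not PETSc source: incompatible index widths mean the
 * column indices have to be copied element by element, not assigned as a
 * pointer. */
#include <stdint.h>
#include <stdlib.h>

typedef int64_t PetscInt;            /* index type with --with-64-bit-indices */
typedef int32_t MatColumnIndexType;  /* assumed 32-bit backend index type     */

static MatColumnIndexType *CopyColumnIndices(const PetscInt *cols, size_t n)
{
  MatColumnIndexType *out = malloc(n * sizeof(*out));
  if (!out) return NULL;
  for (size_t i = 0; i < n; ++i) out[i] = (MatColumnIndexType)cols[i]; /* narrowing copy */
  return out;
}
```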

Re: [petsc-users] PETSc MUMPS interface

2022-01-19 Thread Zhang, Hong via petsc-users
Varun, This feature is merged to petsc main: https://gitlab.com/petsc/petsc/-/merge_requests/4727 Hong
From: petsc-users on behalf of Zhang, Hong via petsc-users
Sent: Wednesday, January 19, 2022 9:37 AM
To: Varun Hiremath
Cc: Peder Jørgensgaard Olesen via

Re: [petsc-users] Cannot eagerly initialize cuda, as doing so results in cuda error 35 (cudaErrorInsufficientDriver) : CUDA driver version is insufficient for CUDA runtime version

2022-01-19 Thread Fande Kong
On Wed, Jan 19, 2022 at 11:39 AM Jacob Faibussowitsch wrote:
> Are you running on login nodes or compute nodes (I can’t seem to tell from the configure.log)?
I was compiling codes on login nodes, and running codes on compute nodes. Login nodes do not have GPUs, but compute nodes do have

Re: [petsc-users] Cannot eagerly initialize cuda, as doing so results in cuda error 35 (cudaErrorInsufficientDriver) : CUDA driver version is insufficient for CUDA runtime version

2022-01-19 Thread Jacob Faibussowitsch
Are you running on login nodes or compute nodes (I can’t seem to tell from the configure.log)? If running from login nodes, do they support running with GPUs? Some clusters will install stub versions of the CUDA runtime on login nodes (such that configuration can find them), but that won’t
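A minimal sketch, not taken from the thread, of how one might check this directly on a node: cudaDriverGetVersion() reports the newest CUDA version the installed driver supports (0 if no driver is present), cudaRuntimeGetVersion() reports the runtime the code was built against, and cudaErrorInsufficientDriver (error 35) is what you get when the former is older than the latter.

```c
/* Quick check of driver vs. runtime CUDA versions on the current node. */
#include <cuda_runtime.h>
#include <stdio.h>

int main(void)
{
  int driver = 0, runtime = 0;
  cudaError_t err;

  err = cudaDriverGetVersion(&driver);   /* 0 means no CUDA driver is installed */
  if (err != cudaSuccess) printf("cudaDriverGetVersion: %s\n", cudaGetErrorString(err));
  err = cudaRuntimeGetVersion(&runtime);
  if (err != cudaSuccess) printf("cudaRuntimeGetVersion: %s\n", cudaGetErrorString(err));

  printf("driver %d, runtime %d -> %s\n", driver, runtime,
         driver >= runtime ? "driver should be sufficient" : "insufficient driver");
  return 0;
}
```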

Re: [petsc-users] Nullspaces

2022-01-19 Thread Mark Adams
On Wed, Jan 19, 2022 at 12:54 PM Marco Cisternino <marco.cistern...@optimad.it> wrote:
> Thank you, Matthew.
> I’m going to pay attention to our non-dimensionalization, avoiding division by cell volume helps a lot.
> Sorry, Mark, I cannot get your point: which 1D problem are you

Re: [petsc-users] Nullspaces

2022-01-19 Thread Marco Cisternino
Thank you, Matthew. I’m going to pay attention to our non-dimensionalization; avoiding division by cell volume helps a lot. Sorry, Mark, I cannot get your point: which 1D problem are you referring to? The case I’m talking about is based on a 3D octree mesh. Thank you all for your support.

Re: [petsc-users] Cannot eagerly initialize cuda, as doing so results in cuda error 35 (cudaErrorInsufficientDriver) : CUDA driver version is insufficient for CUDA runtime version

2022-01-19 Thread Jacob Faibussowitsch
Hi Fande, What machine are you running this on? Please attach configure.log so I can troubleshoot this. Best regards, Jacob Faibussowitsch (Jacob Fai - booss - oh - vitch)
> On Jan 19, 2022, at 10:04, Fande Kong wrote:
> Hi All,
> Upgraded PETSc from 3.16.1 to the current main branch.

[petsc-users] Cannot eagerly initialize cuda, as doing so results in cuda error 35 (cudaErrorInsufficientDriver) : CUDA driver version is insufficient for CUDA runtime version

2022-01-19 Thread Fande Kong
Hi All, Upgraded PETSc from 3.16.1 to the current main branch. I suddenly got the following error message:
2d_diffusion]$ ../../../moose_test-dbg -i 2d_diffusion_test.i -use_gpu_aware_mpi 0 -gpu_mat_type aijcusparse -gpu_vec_type cuda -log_view
[0]PETSC ERROR: - Error

Re: [petsc-users] PETSc MUMPS interface

2022-01-19 Thread Zhang, Hong via petsc-users
Varun, Good to know it works. The FactorSymbolic function is still being called twice, but the 2nd call is a no-op, thus it still appears in '-log_view'. I made the changes at the low level of the MUMPS routine, not within PCSetUp(), because I feel your use case is limited to MUMPS, not other matrix package
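A hedged sketch of the behaviour Hong describes, using illustrative names rather than the actual PETSc/MUMPS interface code: the symbolic-factorization routine keeps a flag so the second invocation returns immediately, yet the call itself is still counted by -log_view.

```c
/* Illustrative no-op guard; not PETSc source. */
typedef struct {
  int symbolic_done; /* set after the first symbolic factorization */
} FactorCtx;

static int FactorSymbolic(FactorCtx *ctx)
{
  if (ctx->symbolic_done) return 0; /* no-op, but still shows up as a call in -log_view */
  /* ... real symbolic analysis would happen here ... */
  ctx->symbolic_done = 1;
  return 0;
}
```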

Re: [petsc-users] Downloaded superlu_dist could not be used. Please check install in $PREFIX

2022-01-19 Thread Fande Kong
Thanks, Sherry, and Satish, I will try your suggestion, and report back to you as soon as possible. Thanks, Fande
On Tue, Jan 18, 2022 at 10:48 PM Satish Balay wrote:
> Sherry,
> This is with superlu-dist-7.1.1 [not master branch]
> Fande,
>> Executing: mpifort -o

Re: [petsc-users] Nullspaces

2022-01-19 Thread Mark Adams
ILU is LU for this 1D problem and it's singular. ILU might have some logic to deal with a singular system. Not sure. LU should fail. You might try -ksp_type preonly and -pc_type lu. And -pc_type jacobi should not have any numerical problems. Try that.
On Wed, Jan 19, 2022 at 7:19 AM Matthew Knepley
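For reference, a small sketch of setting the options Mark suggests programmatically instead of on the command line (standard PETSc calls; error handling abbreviated):

```c
#include <petscksp.h>

/* Equivalent of "-ksp_type preonly -pc_type lu" (or "-pc_type jacobi") set in code. */
static PetscErrorCode SetDirectSolve(KSP ksp)
{
  PC             pc;
  PetscErrorCode ierr;

  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr); /* apply the preconditioner exactly once */
  ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);         /* exact LU: should expose the singular system */
  /* alternatively: ierr = PCSetType(pc, PCJACOBI);CHKERRQ(ierr); */
  return 0;
}
```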

Re: [petsc-users] Nullspaces

2022-01-19 Thread Matthew Knepley
On Wed, Jan 19, 2022 at 4:52 AM Marco Cisternino <marco.cistern...@optimad.it> wrote:
> Thank you Matthew.
> But I cannot get the point. I got the point about the test but to try to explain my doubt I’m going to prepare another toy code.
> By words…
> I usually have a finite volume

Re: [petsc-users] Nullspaces

2022-01-19 Thread Marco Cisternino
Thank you Matthew. But I cannot get the point. I got the point about the test but to try to explain my doubt I’m going to prepare another toy code. By words… I usually have a finite volume discretization of the Laplace operator with homogeneous Neumann BC on an octree mesh and it reads Aij * xj
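For context, a minimal sketch (not Marco's code) of how the constant null space of a pure-Neumann Laplace operator is typically attached to the matrix in PETSc so the Krylov solver can handle the singularity:

```c
#include <petscksp.h>

static PetscErrorCode AttachConstantNullSpace(Mat A)
{
  MatNullSpace   nullsp;
  PetscErrorCode ierr;

  /* PETSC_TRUE: the null space contains the constant vector */
  ierr = MatNullSpaceCreate(PetscObjectComm((PetscObject)A), PETSC_TRUE, 0, NULL, &nullsp);CHKERRQ(ierr);
  ierr = MatSetNullSpace(A, nullsp);CHKERRQ(ierr);
  ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);
  return 0;
}
```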

Re: [petsc-users] PETSc MUMPS interface

2022-01-19 Thread Varun Hiremath
Hi Hong, Thanks, I tested your branch and I think it is working fine. I don't see any increase in runtime; however, with -log_view I see that the MatLUFactorSymbolic function is still being called twice, so is this expected? Is the second call a no-op?
$ ./ex52.o -use_mumps_lu -print_mumps_memory