Hi Jose,
I implemented the LU-factorized preconditioner and tested it using
PREONLY + LU, but that converges to the wrong eigenvalues compared
with using BICGS + BJACOBI, or with simply computing
EPS_SMALLEST_MAGNITUDE without any preconditioning. My preconditioning
matrix is only a
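For context, the two inner-solver configurations being compared might be set up as follows in slepc4py (a sketch only, not the poster's actual code; it assumes petsc4py/slepc4py are available and is import-guarded so it also loads without them):

```python
# Sketch of the two spectral-transform inner solvers discussed above:
# an exact solve (PREONLY + LU) versus BiCGStab with block Jacobi.
try:
    from petsc4py import PETSc
    from slepc4py import SLEPc
except ImportError:          # petsc4py/slepc4py not installed
    PETSc = SLEPc = None

def make_eps(A, exact_lu=True):
    """Create an EPS whose shift-and-invert transform uses either an
    exact LU solve (PREONLY + LU) or BCGS + BJACOBI as the inner KSP."""
    eps = SLEPc.EPS().create()
    eps.setOperators(A)
    st = eps.getST()
    st.setType(SLEPc.ST.Type.SINVERT)        # shift-and-invert transform
    ksp = st.getKSP()
    if exact_lu:
        ksp.setType(PETSc.KSP.Type.PREONLY)  # one application of the PC
        ksp.getPC().setType(PETSc.PC.Type.LU)
    else:
        ksp.setType(PETSc.KSP.Type.BCGS)     # BiCGStab
        ksp.getPC().setType(PETSc.PC.Type.BJACOBI)
    eps.setFromOptions()
    return eps
```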
Do you have PETSc built with superlu_dist?
Satish
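One way to answer Satish's question is to check the generated configuration header: when PETSc is configured with SuperLU_DIST, `petscconf.h` defines `PETSC_HAVE_SUPERLU_DIST`. A small helper (the path construction from PETSC_DIR/PETSC_ARCH is the conventional layout, assumed here):

```python
# Check whether a PETSc build was configured with SuperLU_DIST by
# scanning its petscconf.h for the PETSC_HAVE_SUPERLU_DIST define.
import os

def has_superlu_dist(conf_path):
    """Return True if the given petscconf.h defines PETSC_HAVE_SUPERLU_DIST."""
    try:
        with open(conf_path) as f:
            return any(line.lstrip().startswith("#define")
                       and "PETSC_HAVE_SUPERLU_DIST" in line
                       for line in f)
    except OSError:          # header not found / unreadable
        return False

if __name__ == "__main__":
    conf = os.path.join(os.environ.get("PETSC_DIR", "."),
                        os.environ.get("PETSC_ARCH", ""),
                        "include", "petscconf.h")
    print("superlu_dist:", has_superlu_dist(conf))
```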
On Mon, 27 Sep 2021, Yiyang Li wrote:
Hello,
I have CUDA-aware MPI, and I have upgraded from PETSc 3.12 to PETSc 3.15.4
and petsc4py 3.15.4.
Now, when I call
PETSc.KSP().solve(..., ...)
The GPU information is always printed to stdout by every MPI rank, like
CUDA version: v 11040
CUDA Devices:
0 : Quadro P4000 6 1
Global
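For reference, a minimal petsc4py solve along the lines of the call described above might look like this (a sketch under the assumption that petsc4py is installed; import-guarded so the snippet also loads without it, and not making any claim about when the CUDA banner is printed):

```python
# Minimal petsc4py linear solve, mirroring the PETSc.KSP().solve(...) call
# mentioned in the message above. Illustrative only.
try:
    from petsc4py import PETSc
except ImportError:          # petsc4py not installed
    PETSc = None

def solve_diagonal(n=4):
    """Assemble a small diagonal AIJ system and solve it with a default KSP."""
    A = PETSc.Mat().createAIJ([n, n])
    A.setUp()
    for i in range(n):
        A.setValue(i, i, 2.0)                # simple SPD diagonal
    A.assemble()
    b = A.createVecRight(); b.set(1.0)
    x = A.createVecLeft()
    ksp = PETSc.KSP().create()
    ksp.setOperators(A)
    ksp.setFromOptions()
    ksp.solve(b, x)                          # the call discussed above
    return x
```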
Nathan,
Yes, you can call PetscInitializeFortran() from your Fortran library.
Barry
> On Sep 27, 2021, at 11:59 AM, WUKIE, NATHAN A DR-02 USAF AFMC AFRL/RQVC via
> petsc-users wrote:
How should PETSc initialization be handled for a Python application utilizing
petsc4py and a Fortran library that also uses PETSc?
The PETSc documentation states that PetscInitializeFortran "should be called
soon AFTER the call to
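The ordering Barry describes can be sketched from the Python side as follows. This is a sketch only: the library name "libmysolver.so" and its entry point "mysolver_init_" are hypothetical placeholders (not from this thread), standing in for a Fortran routine that calls PetscInitializeFortran():

```python
# Sketch of the initialization order: petsc4py initializes PETSc first,
# then the Fortran library calls PetscInitializeFortran() before any
# PETSc use. Library and symbol names below are hypothetical.
import sys
import ctypes

def init_petsc_then_fortran(libpath="libmysolver.so"):
    # 1) petsc4py.init() calls PetscInitialize() once for the process.
    import petsc4py
    petsc4py.init(sys.argv)
    # 2) Soon after, the Fortran side should run PetscInitializeFortran();
    #    here we assume the library exposes an init routine that does so.
    lib = ctypes.CDLL(libpath)       # hypothetical Fortran library
    lib.mysolver_init_()             # hypothetical: wraps PetscInitializeFortran()
    return lib
```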