Hello, David,
It took longer than I expected to add the CUDA-aware MPI feature to
PETSc. It is now in PETSc 3.12, released last week. I have a small fix after
that release, so you'd better use the petsc master branch. Use the PETSc option
-use_gpu_aware_mpi to enable it. On Summit, you also need to launch with
jsrun --smpiargs="-gpu" so that Spectrum MPI enables its GPU support.
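As a minimal sketch of what this enables (assuming a PETSc >= 3.12 build configured with CUDA; the problem size and the AXPY below are only illustrative), the vectors live on the GPU via VECCUDA, and running with -use_gpu_aware_mpi lets PETSc hand device buffers directly to MPI:

/* Sketch: a CUDA vector AXPY; assumes PETSc was configured --with-cuda. */
#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec            x, y;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = VecCreate(PETSC_COMM_WORLD, &x); CHKERRQ(ierr);
  ierr = VecSetSizes(x, PETSC_DECIDE, 100); CHKERRQ(ierr);
  ierr = VecSetType(x, VECCUDA); CHKERRQ(ierr);  /* vector data lives on the GPU */
  ierr = VecDuplicate(x, &y); CHKERRQ(ierr);
  ierr = VecSet(x, 1.0); CHKERRQ(ierr);
  ierr = VecSet(y, 2.0); CHKERRQ(ierr);
  ierr = VecAXPY(y, 3.0, x); CHKERRQ(ierr);      /* y <- y + 3*x, computed on the device */
  ierr = VecDestroy(&x); CHKERRQ(ierr);
  ierr = VecDestroy(&y); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

On Summit the launch would then look something like jsrun -n 6 -g 1 --smpiargs="-gpu" ./app -use_gpu_aware_mpi (the resource-set counts here are just an example).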
From: Matthew Knepley
Date: Friday, October 4, 2019 at 3:19 AM
To: Lawrence Mitchell
Cc: "Salazar De Troya, Miguel" ,
"petsc-users@mcs.anl.gov"
Subject: Re: [petsc-users] Stokes-Brinkmann equation preconditioner
On Fri, Oct 4, 2019 at 6:04 AM Lawrence Mitchell <we...@gmx.li> wrote:
You are right. I need to look into iterative solvers. Memory usage of direct
solvers is a problem.
Thanks
Amir
On Oct 7, 2019, at 12:19 pm, Smith, Barry F. wrote:
>
> If you need to use a direct solver, then you need to start running in parallel
> and using MUMPS or SuperLU_DIST or Pastix as the solver.
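To make that concrete, here is a small self-contained sketch (not from the thread) of selecting MUMPS as the parallel direct solver through KSP/PC. It assumes a PETSc build configured with MUMPS (e.g. --download-mumps); the tridiagonal test matrix and right-hand side are only illustrative, and the same choice can be made purely at runtime with -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps.

/* Sketch: solve a toy tridiagonal system with a parallel LU factorization (MUMPS). */
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            x, b;
  KSP            ksp;
  PC             pc;
  PetscInt       i, n = 100, Istart, Iend;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  /* Assemble a 1D Laplacian-like tridiagonal matrix in parallel */
  ierr = MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, n, 3, NULL, 2, NULL, &A); CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(A, &Istart, &Iend); CHKERRQ(ierr);
  for (i = Istart; i < Iend; i++) {
    if (i > 0)   { ierr = MatSetValue(A, i, i-1, -1.0, INSERT_VALUES); CHKERRQ(ierr); }
    if (i < n-1) { ierr = MatSetValue(A, i, i+1, -1.0, INSERT_VALUES); CHKERRQ(ierr); }
    ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES); CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatCreateVecs(A, &x, &b); CHKERRQ(ierr);
  ierr = VecSet(b, 1.0); CHKERRQ(ierr);

  /* Direct solve: LU factorization delegated to MUMPS */
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp); CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A); CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPPREONLY); CHKERRQ(ierr);          /* apply the factorization only */
  ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
  ierr = PCSetType(pc, PCLU); CHKERRQ(ierr);
  ierr = PCFactorSetMatSolverType(pc, MATSOLVERMUMPS); CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp); CHKERRQ(ierr);
  ierr = MatDestroy(&A); CHKERRQ(ierr);
  ierr = VecDestroy(&x); CHKERRQ(ierr);
  ierr = VecDestroy(&b); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Swapping in SuperLU_DIST or Pastix would only change the MATSOLVERMUMPS argument (or the -pc_factor_mat_solver_type option value), provided PETSc was configured with that package.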