Re: [petsc-users] DMLocalToLocal with DMPlex in Fortran

2022-10-10 Thread Mike Michell
Hi, I was wondering if there are any comments on the example file that I can refer to. Thanks, Mike > Thank you for the reply. > Sure, a short example code is attached here with a square box mesh and a > run script. > Inside the source, you may find two versions of halo exchange; one is for >
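[Editorial note] For reference, a minimal hypothetical sketch in C of the usual DMPlex halo-exchange pattern via the global/local scatters; it is not the poster's attached Fortran example, and the P1 field, options, and names are assumptions made only so the DM can build a section and create vectors.

/* Hypothetical sketch, not the poster's attached example: the usual DMPlex
 * halo-exchange pattern via the global/local scatters, using only public
 * PETSc C API calls. */
#include <petscdmplex.h>
#include <petscfe.h>

int main(int argc, char **argv)
{
  DM        dm;
  PetscFE   fe;
  Vec       gvec, lvec;
  PetscInt  dim;
  PetscBool simplex;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* Build the mesh from the command line, e.g. -dm_plex_box_faces 8,8 */
  PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
  PetscCall(DMSetType(dm, DMPLEX));
  PetscCall(DMSetFromOptions(dm));
  PetscCall(DMGetDimension(dm, &dim));
  PetscCall(DMPlexIsSimplex(dm, &simplex));

  /* Attach a P1 Lagrange field so the DM can build a default section */
  PetscCall(PetscFECreateLagrange(PETSC_COMM_WORLD, dim, 1, simplex, 1, PETSC_DETERMINE, &fe));
  PetscCall(DMSetField(dm, 0, NULL, (PetscObject)fe));
  PetscCall(DMCreateDS(dm));
  PetscCall(PetscFEDestroy(&fe));

  PetscCall(DMCreateGlobalVector(dm, &gvec));
  PetscCall(DMCreateLocalVector(dm, &lvec));
  PetscCall(VecSet(gvec, 1.0)); /* stand-in for the owned solution values */

  /* Owned values -> local vector; this fills the halo (ghost) points */
  PetscCall(DMGlobalToLocalBegin(dm, gvec, INSERT_VALUES, lvec));
  PetscCall(DMGlobalToLocalEnd(dm, gvec, INSERT_VALUES, lvec));

  /* Reverse direction: accumulate halo contributions back onto the owners */
  PetscCall(DMLocalToGlobalBegin(dm, lvec, ADD_VALUES, gvec));
  PetscCall(DMLocalToGlobalEnd(dm, lvec, ADD_VALUES, gvec));

  PetscCall(VecDestroy(&lvec));
  PetscCall(VecDestroy(&gvec));
  PetscCall(DMDestroy(&dm));
  PetscCall(PetscFinalize());
  return 0;
}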

Re: [petsc-users] MSPIN

2022-10-10 Thread Matthew Knepley
On Mon, Oct 10, 2022 at 5:37 PM Alexander Lindsay wrote: > I know that PETSc has native support for ASPIN. Has anyone tried MSPIN? I > wouldn't be surprised if someone has implemented it in user code. Wondering > what the barriers would be to creating an option like `-snes_type mspin`? > David

[petsc-users] MSPIN

2022-10-10 Thread Alexander Lindsay
I know that PETSc has native support for ASPIN. Has anyone tried MSPIN? I wouldn't be surprised if someone has implemented it in user code. Wondering what the barriers would be to creating an option like `-snes_type mspin`?
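[Editorial note] As a point of reference, a hypothetical sketch of the user-side plumbing that SNESRegister() provides: MSPIN is not an existing PETSc SNES type, the routine names SNESSolve_MSPIN/SNESCreate_MSPIN are invented here, and the solve body is a stub; only the registration that makes `-snes_type mspin` selectable is shown.

/* Hypothetical sketch, not an existing PETSc solver: registering a user-side
 * SNES implementation so -snes_type mspin becomes selectable. Filling in
 * snes->ops requires the private header. */
#include <petsc/private/snesimpl.h>

static PetscErrorCode SNESSolve_MSPIN(SNES snes)
{
  PetscFunctionBegin;
  /* ... the multiplicative Schwarz preconditioned inexact Newton loop would go here ... */
  SETERRQ(PetscObjectComm((PetscObject)snes), PETSC_ERR_SUP, "MSPIN solve not implemented");
  PetscFunctionReturn(0);
}

static PetscErrorCode SNESCreate_MSPIN(SNES snes)
{
  PetscFunctionBegin;
  snes->ops->solve = SNESSolve_MSPIN; /* other ops (setup, destroy, view, ...) omitted */
  PetscFunctionReturn(0);
}

int main(int argc, char **argv)
{
  SNES snes;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(SNESRegister("mspin", SNESCreate_MSPIN)); /* register before SetFromOptions */
  PetscCall(SNESCreate(PETSC_COMM_WORLD, &snes));
  PetscCall(SNESSetFromOptions(snes)); /* -snes_type mspin now resolves to the stub */
  PetscCall(SNESDestroy(&snes));
  PetscCall(PetscFinalize());
  return 0;
}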

Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange

2022-10-10 Thread Matthew Knepley
On Mon, Oct 10, 2022 at 11:42 AM feng wang wrote: > Hi Mat, > > Thanks for your reply. It seems I have to use "VecSetValues" to assign the > values of my ghost vector "petsc_dcsv", and then call VecAssemblyBegin/End. > If I do it this way, the ghost cells are exchanged correctly. > This should

Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange

2022-10-10 Thread Jose E. Roman
> On 10 Oct 2022, at 17:42, feng wang wrote: > > Hi Mat, > > Thanks for your reply. It seems I have to use "VecSetValues" to assign the > values of my ghost vector "petsc_dcsv", and then call VecAssemblyBegin/End. > If I do it this way, the ghost cells are exchanged correctly. > >

Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange

2022-10-10 Thread feng wang
Hi Mat, Thanks for your reply. It seems I have to use "VecSetValues" to assign the values of my ghost vector "petsc_dcsv", and then call VecAssemblyBegin/End. If I do it this way, the ghost cells are exchanged correctly. Besides, I notice that when I run my code sequentially or with multiple
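[Editorial note] For context, a minimal hypothetical sketch of the two update paths discussed in this thread for a vector created with VecCreateGhost(): the owner-to-ghost scatter via VecGhostUpdateBegin/End, and VecSetValues followed by VecAssemblyBegin/End. The sizes, indices, and values are made up for illustration; the thread's actual "petsc_dcsv" vector and shell-matrix code are not reproduced.

/* Hypothetical sketch with made-up sizes and indices. */
#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec         v;
  PetscInt    nlocal = 4;    /* owned entries per rank (example value) */
  PetscInt    nghost = 0;
  PetscInt   *ghosts = NULL; /* global indices of the ghost entries */
  PetscInt    ghost_index;
  PetscMPIInt rank, size;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
  PetscCallMPI(MPI_Comm_size(PETSC_COMM_WORLD, &size));

  /* Each rank ghosts the first owned entry of the next rank (wrap-around) */
  ghost_index = ((rank + 1) % size) * nlocal;
  if (size > 1) { nghost = 1; ghosts = &ghost_index; }
  PetscCall(VecCreateGhost(PETSC_COMM_WORLD, nlocal, PETSC_DECIDE, nghost, ghosts, &v));

  /* Path 1: set owned values, then scatter owners -> ghosts */
  PetscCall(VecSet(v, (PetscScalar)rank));
  PetscCall(VecGhostUpdateBegin(v, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecGhostUpdateEnd(v, INSERT_VALUES, SCATTER_FORWARD));

  /* Path 2: VecSetValues by global index (possibly off-process), which is
     what makes VecAssemblyBegin/End necessary, followed by a ghost update */
  {
    PetscInt    row = rank * nlocal;
    PetscScalar val = 10.0 + rank;
    PetscCall(VecSetValues(v, 1, &row, &val, INSERT_VALUES));
  }
  PetscCall(VecAssemblyBegin(v));
  PetscCall(VecAssemblyEnd(v));
  PetscCall(VecGhostUpdateBegin(v, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecGhostUpdateEnd(v, INSERT_VALUES, SCATTER_FORWARD));

  PetscCall(VecDestroy(&v));
  PetscCall(PetscFinalize());
  return 0;
}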

Re: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check

2022-10-10 Thread Junchao Zhang
On Mon, Oct 10, 2022 at 8:13 AM Rob Kudyba wrote: > OK, let's walk back and not use -DCMAKE_C_COMPILER=/path/to/mpicc >> > Will do > > >> libompitrace.so.40.30.0 is not the OpenMP library; it is the tracing >> library for OpenMPI, https://github.com/open-mpi/ompi/issues/10036 >> > Does that

Re: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check

2022-10-10 Thread Rob Kudyba
OK, so I missed the OpenMP vs. OpenMPI distinction by incorrectly setting -DOpenMP_libomp_LIBRARY="/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib//libompitrace.so.40.30.0. So I changed it to point to /cm/local/apps/gcc/10.2.0/lib/libgomp.so.1.0.0 -- Found PETSc 3.18.0 CMake Error at

Re: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check

2022-10-10 Thread Rob Kudyba
> > OK, let's walk back and not use -DCMAKE_C_COMPILER=/path/to/mpicc > Will do > libompitrace.so.40.30.0 is not the OpenMP library; it is the tracing > library for OpenMPI, https://github.com/open-mpi/ompi/issues/10036 > Does that mean I should remove this option from the cmake command? > In