Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Xiaoye S. Li
All the OpenMP calls are surrounded by #ifdef _OPENMP ... #endif. You can disable OpenMP during CMake installation with the following: -Denable_openmp=FALSE (the default is true). (I think Satish knows how to do this with the PETSc installation.) --- The reason to use mixed MPI & OpenMP is ma
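For context, a minimal sketch of the configure step this refers to; only the -Denable_openmp flag is quoted from the message, while the source path and out-of-source layout are assumptions:

  cmake /path/to/superlu_dist -Denable_openmp=FALSE   # build SuperLU_dist with its OpenMP code paths disabled
  make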

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Mark Adams
On Thu, Jan 21, 2021 at 10:16 PM Barry Smith wrote: > On Jan 21, 2021, at 9:11 PM, Mark Adams wrote: > I have tried it and it hangs, but that is expected. This is not something she has prepared for. > I am working with Sherry on it. > And she is fine with just one thread and suggests

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Barry Smith
> On Jan 21, 2021, at 9:11 PM, Mark Adams wrote: > I have tried it and it hangs, but that is expected. This is not something she has prepared for. > I am working with Sherry on it. > And she is fine with just one thread and suggests it if she is in a thread. > Now that I think ab

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Mark Adams
I have tried it and it hangs, but that is expected. This is not something she has prepared for. I am working with Sherry on it. And she is fine with just one thread and suggests it if she is in a thread. Now that I think about it, I don't understand why she needs OpenMP if she can live with OMP_

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Barry Smith
> On Jan 21, 2021, at 5:37 PM, Mark Adams wrote: > This did not work. I verified that MPI_Init_thread is being called correctly and that MPI returns that it supports this highest level of thread safety. > I am going to ask ORNL. > And if I use: > -fieldsplit_i1_ksp_norm_type none

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Mark Adams
This did not work. I verified that MPI_Init_thread is being called correctly and that MPI returns that it supports this highest level of thread safety. I am going to ask ORNL. And if I use: -fieldsplit_i1_ksp_norm_type none -fieldsplit_i1_ksp_max_it 300 for all 9 "i" variables, I can run normal

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Mark Adams
OK, the problem is probably: PetscMPIInt PETSC_MPI_THREAD_REQUIRED = MPI_THREAD_FUNNELED; There is an example that sets: PETSC_MPI_THREAD_REQUIRED = MPI_THREAD_MULTIPLE; This is what I need. On Thu, Jan 21, 2021 at 2:26 PM Mark Adams wrote: > On Thu, Jan 21, 2021 at 2:11 PM Matthew Kn
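A minimal sketch of what using this might look like in a PETSc main program, assuming the global PETSC_MPI_THREAD_REQUIRED named in the message can be set before PetscInitialize() so that MPI_Init_thread() is asked for MPI_THREAD_MULTIPLE instead of the default MPI_THREAD_FUNNELED:

  #include <petscsys.h>

  int main(int argc, char **argv)
  {
    PetscErrorCode ierr;

    /* Request full thread support before PetscInitialize() calls MPI_Init_thread(),
       so OpenMP threads may make their own MPI calls (e.g. reductions). */
    PETSC_MPI_THREAD_REQUIRED = MPI_THREAD_MULTIPLE;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
    /* ... set up the fieldsplit solver and run the OpenMP-threaded sub-solves ... */
    ierr = PetscFinalize();
    return ierr;
  }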

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Mark Adams
On Thu, Jan 21, 2021 at 2:11 PM Matthew Knepley wrote: > On Thu, Jan 21, 2021 at 2:02 PM Mark Adams wrote: >> On Thu, Jan 21, 2021 at 1:44 PM Matthew Knepley wrote: >>> On Thu, Jan 21, 2021 at 11:16 AM Mark Adams wrote: >>> Yes, the problem is that each KSP solver is running in an

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Matthew Knepley
On Thu, Jan 21, 2021 at 2:02 PM Mark Adams wrote: > On Thu, Jan 21, 2021 at 1:44 PM Matthew Knepley wrote: >> On Thu, Jan 21, 2021 at 11:16 AM Mark Adams wrote: >>> Yes, the problem is that each KSP solver is running in an OMP thread (So at this point it only works for SELF and it's Lan

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Mark Adams
On Thu, Jan 21, 2021 at 1:44 PM Matthew Knepley wrote: > On Thu, Jan 21, 2021 at 11:16 AM Mark Adams wrote: >> Yes, the problem is that each KSP solver is running in an OMP thread (So at this point it only works for SELF and it's Landau, so it is all I need). >> It looks like MPI reductions c

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Matthew Knepley
On Thu, Jan 21, 2021 at 11:16 AM Mark Adams wrote: > Yes, the problem is that each KSP solver is running in an OMP thread (So at this point it only works for SELF and it's Landau, so it is all I need). > It looks like MPI reductions called with a comm_self are not thread safe (e.g., they could say

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Mark Adams
And I guess I am really doing two things here. 1) The solver that I am intending to use is SuperLU. I believe Barry got LU working in OMP threads a few years ago. My problems now are in Krylov. I could live with what I have now and just get Sherry to make SuperLU_dist not use MPI in serial. SuperL

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Mark Adams
On Thu, Jan 21, 2021 at 11:25 AM Jed Brown wrote: > Mark Adams writes: >> Yes, the problem is that each KSP solver is running in an OMP thread > There can be more or fewer splits than OMP_NUM_THREADS. Each thread is still calling blocking operations. > This is a concurrency problem, not

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Jed Brown
Mark Adams writes: > Yes, the problem is that each KSP solver is running in an OMP thread. There can be more or fewer splits than OMP_NUM_THREADS. Each thread is still calling blocking operations. This is a concurrency problem, not a parallel efficiency problem. It can be solved with async inte

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Mark Adams
Yes, the problem is that each KSP solver is running in an OMP thread (So at this point it only works for SELF and it's Landau, so it is all I need). It looks like MPI reductions called with a comm_self are not thread safe (e.g., they could say, this is one proc, thus, just copy send --> recv, but they d
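As a rough, self-contained illustration of the pattern being described (not the actual PETSc code): each OpenMP thread drives its own solve and ends up in a blocking MPI reduction on a self-sized communicator, which is only permitted when MPI was initialized with MPI_THREAD_MULTIPLE; the function and variable names below are made up for the sketch.

  #include <mpi.h>

  /* Each thread reduces its own local dot product on its own duplicate of
     MPI_COMM_SELF.  Every communicator has a single rank, but the MPI library
     is entered concurrently from several threads, so this requires
     MPI_THREAD_MULTIPLE; MPI_THREAD_FUNNELED is not enough. */
  static void threaded_reductions(int nsplits, const double *local, double *result)
  {
    #pragma omp parallel for
    for (int i = 0; i < nsplits; i++) {
      MPI_Comm comm;
      MPI_Comm_dup(MPI_COMM_SELF, &comm);               /* one comm per split */
      MPI_Allreduce(&local[i], &result[i], 1, MPI_DOUBLE, MPI_SUM, comm);
      MPI_Comm_free(&comm);
    }
  }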

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Matthew Knepley
On Thu, Jan 21, 2021 at 10:34 AM Mark Adams wrote: > It looks like PETSc is just too clever for me. I am trying to get a different MPI_Comm into each block, but PETSc is thwarting me: It looks like you are using SELF. Is that what you want? Do you want a bunch of comms with the same group, b

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Mark Adams
It looks like PETSc is just too clever for me. I am trying to get a different MPI_Comm into each block, but PETSc is thwarting me:
  if (jac->use_openmp) {
    ierr = KSPCreate(MPI_COMM_SELF,&ilink->ksp);CHKERRQ(ierr);
    PetscPrintf(PETSC_COMM_SELF,"In PCFieldSplitSetFields_FieldSplit with
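One hedged way around that, sketched below, is to give each split its own duplicate of MPI_COMM_SELF before KSPCreate(), so PETSc's communicator caching cannot hand every split the same inner comm; the jac and ilink names follow the PCFieldSplit fragment above, and whether this alone makes the threaded solves safe is exactly what the rest of the thread is debating.

  if (jac->use_openmp) {
    MPI_Comm selfcomm;
    /* MPI_COMM_SELF is one shared communicator; duplicating it gives this
       split a distinct communicator (and its own tag space).  The duplicate
       must eventually be released with MPI_Comm_free() after the KSP is
       destroyed. */
    MPI_Comm_dup(MPI_COMM_SELF, &selfcomm);
    ierr = KSPCreate(selfcomm, &ilink->ksp);CHKERRQ(ierr);
  }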

Re: [petsc-dev] Memory problem with OpenMP and Fieldsplit sub solvers

2021-01-21 Thread Mark Adams
On Wed, Jan 20, 2021 at 6:21 PM Barry Smith wrote: > On Jan 20, 2021, at 3:09 PM, Mark Adams wrote: > So I put in a temporary hack to get the first Fieldsplit apply to NOT use OMP and it sort of works. > Preonly/lu is fine. GMRES calls vector creates/dups in every solve so that is a
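For reference, a small sketch of the "preonly/lu" sub-solver configuration mentioned above, assuming ksp and ierr come from the split's setup code; a single direct solve avoids the per-solve vector creations that GMRES is blamed for here, and the last line only shows how a specific factorization package such as SuperLU_dist would be selected:

  PC pc;
  /* One application of the preconditioner per solve: no Krylov iterations,
     hence no work-vector duplication inside KSPSolve(). */
  ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
  ierr = PCFactorSetMatSolverType(pc, MATSOLVERSUPERLU_DIST);CHKERRQ(ierr);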

[petsc-dev] obscure changes in TSGetStages_Theta

2021-01-21 Thread Stefano Zampini
Hong, I do not understand why you changed the behavior of TSGetStages_Theta https://gitlab.com/petsc/petsc/-/merge_requests/3500/diffs#a582bbaec75f4ae14bbf97d1d0404073ca89ff09_1194_1209 with this MR https://gitlab.com/petsc/petsc/-/merge_requests/3500 Now, the non-endpoint variant no longer