[petsc-users] DMLabel to extract height-0 points by their DMPolytope value

2023-05-23 Thread Ferrand, Jesus A.
Dear PETSc team: I am trying to use DMPlex and DMLabel to develop an API to write plexes to .cgns format in parallel. To that end, I need a way to extract the height-0 points and sort them by topological type (i.e., a chunk of tetrahedra, followed by a chunk of pyramids, etc.). I figured I could us
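
A minimal sketch of one way to group the height-0 points by polytope type, using the "celltype" label that DMPlex maintains internally; the helper name and the CGNS-writing comment are illustrative assumptions, not code from the thread:

    #include <petscdmplex.h>

    /* Collect the local cells of one polytope type so they can be written as a chunk */
    static PetscErrorCode DumpCellsOfType(DM dm, DMPolytopeType ct)
    {
      DMLabel         ctLabel;
      IS              cellIS;
      const PetscInt *cells;
      PetscInt        n;

      PetscFunctionBeginUser;
      /* DMPlex records the polytope type of every point in the "celltype" label */
      PetscCall(DMPlexGetCellTypeLabel(dm, &ctLabel));
      /* All points of the requested type come back as one IS (NULL if there are none) */
      PetscCall(DMLabelGetStratumIS(ctLabel, ct, &cellIS));
      if (cellIS) {
        PetscCall(ISGetLocalSize(cellIS, &n));
        PetscCall(ISGetIndices(cellIS, &cells));
        /* ... write these n cells as one CGNS element section ... */
        PetscCall(ISRestoreIndices(cellIS, &cells));
        PetscCall(ISDestroy(&cellIS));
      }
      PetscFunctionReturn(PETSC_SUCCESS);
    }

Called once per type, e.g. with DM_POLYTOPE_TETRAHEDRON, then DM_POLYTOPE_PYRAMID, DM_POLYTOPE_TRI_PRISM, and DM_POLYTOPE_HEXAHEDRON, this yields the local cells already chunked by topological type.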

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-23 Thread Zongze Yang
On Tue, 23 May 2023 at 20:09, Yann Jobic wrote: > If I may, you can use the command line option "-mat_mumps_icntl_4 2". > MUMPS then gives information about the factorization step, such as the > estimated memory needed. Thank you for your suggestion! Best wishes, Zongze > Best regards, > Yann

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-23 Thread Yann Jobic
If I may, you can use the command line option "-mat_mumps_icntl_4 2". MUMPS then gives information about the factorization step, such as the estimated memory needed. Best regards, Yann On 5/23/2023 at 11:59 AM, Matthew Knepley wrote: On Mon, May 22, 2023 at 10:42 PM Zongze Yang
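
For completeness, a hedged sketch of the two usual ways to wire this up; the executable name below is a generic placeholder, not the thread's actual setup. On the command line:

    ./myapp -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps -mat_mumps_icntl_4 2

or programmatically, setting ICNTL(4)=2 on the factored matrix before the numerical factorization:

    #include <petscksp.h>

    /* Assumes ksp already has its operators set; selects MUMPS and enables its diagnostics */
    static PetscErrorCode UseMumpsWithDiagnostics(KSP ksp)
    {
      PC  pc;
      Mat F;

      PetscFunctionBeginUser;
      PetscCall(KSPGetPC(ksp, &pc));
      PetscCall(PCSetType(pc, PCLU));
      PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
      PetscCall(PCFactorSetUpMatSolverType(pc)); /* create the MUMPS factor so ICNTL can be set */
      PetscCall(PCFactorGetMatrix(pc, &F));
      PetscCall(MatMumpsSetIcntl(F, 4, 2));      /* ICNTL(4)=2: print stats, incl. estimated memory */
      PetscFunctionReturn(PETSC_SUCCESS);
    }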

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-23 Thread Zongze Yang
On Tue, 23 May 2023 at 19:51, Zongze Yang wrote: > Thank you for your suggestion. I solved the problem with SuperLU_DIST, and > it works well. This was solved with four nodes, each equipped with 500 GB of memory. Best wishes, Zongze > Best wishes, > Zongze > On Tue, 23 May 2023 at 18:00, Mat

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-23 Thread Zongze Yang
Thank you for your suggestion. I solved the problem with SuperLU_DIST, and it works well. Best wishes, Zongze On Tue, 23 May 2023 at 18:00, Matthew Knepley wrote: > On Mon, May 22, 2023 at 10:46 PM Zongze Yang wrote: > >> I have an additional question to ask: Is it possible for the SuperLU_DI
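
For readers hitting the same issue, switching the direct solver to SuperLU_DIST is normally just a runtime-option change; the executable name and process count below are placeholders, not Zongze's actual run:

    mpiexec -n 128 ./myapp -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type superlu_dist

(PETSc must have been configured with --download-superlu_dist or an external SuperLU_DIST installation for this option to be available.)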

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-23 Thread Matthew Knepley
On Mon, May 22, 2023 at 10:46 PM Zongze Yang wrote: > I have an additional question to ask: Is it possible for the SuperLU_DIST > library to encounter the same MPI problem (PMPI_Iprobe failed) as MUMPS? > I do not know if they use that function. But it is easy to try it out, so I would. Thank

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-23 Thread Matthew Knepley
On Mon, May 22, 2023 at 10:42 PM Zongze Yang wrote: > On Tue, 23 May 2023 at 05:31, Stefano Zampini > wrote: > >> If I may add to the discussion, it may be that you are going OOM since >> you are trying to factorize a 3-million-dof problem; this problem goes >> undetected and then fails at a la
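
For the OOM scenario Stefano describes, a hedged sketch of how the MUMPS memory estimate could be inspected after the symbolic factorization, assuming MUMPS is the selected factorization package and PCSetUp() has already run; the function name is illustrative, while INFOG(16)/INFOG(17) are the per-process maximum and total estimated memory in MB documented by MUMPS:

    #include <petscksp.h>

    /* Query MUMPS's estimated factorization memory from the factored matrix */
    static PetscErrorCode ReportMumpsMemoryEstimate(KSP ksp)
    {
      PC       pc;
      Mat      F;
      PetscInt maxMB, sumMB;

      PetscFunctionBeginUser;
      PetscCall(KSPGetPC(ksp, &pc));
      PetscCall(PCFactorGetMatrix(pc, &F));       /* the MUMPS factor matrix */
      PetscCall(MatMumpsGetInfog(F, 16, &maxMB)); /* INFOG(16): max estimated MB per process */
      PetscCall(MatMumpsGetInfog(F, 17, &sumMB)); /* INFOG(17): total estimated MB across processes */
      PetscCall(PetscPrintf(PETSC_COMM_WORLD, "MUMPS estimate: %" PetscInt_FMT " MB/process (max), %" PetscInt_FMT " MB total\n", maxMB, sumMB));
      PetscFunctionReturn(PETSC_SUCCESS);
    }

Comparing these numbers against the memory actually available per node makes an out-of-memory failure visible before the numerical factorization is attempted.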