Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-23 Thread Zongze Yang
On Tue, 23 May 2023 at 20:09, Yann Jobic wrote: > If I may, you can use the command line option "-mat_mumps_icntl_4 2". > MUMPS then gives information about the factorization step, such as the > estimated needed memory. > > Thank you for your suggestion! Best wishes, Zongze Best regards, > > Yan

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-23 Thread Yann Jobic
If I may, you can use the command line option "-mat_mumps_icntl_4 2". MUMPS then gives information about the factorization step, such as the estimated needed memory. Best regards, Yann On 5/23/2023 at 11:59 AM, Matthew Knepley wrote: On Mon, May 22, 2023 at 10:42 PM Zongze Yang
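For context, a minimal sketch of how such an option is typically passed on the command line; the executable name, process count, and the solver-selection options other than the ICNTL setting are hypothetical and not taken from this thread.

```bash
# Hypothetical PETSc-based run; only -mat_mumps_icntl_4 2 is the option
# suggested above. It makes MUMPS print errors, warnings, and main statistics,
# including the estimated working memory for the factorization.
mpiexec -n 64 ./app \
    -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps \
    -mat_mumps_icntl_4 2
```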

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-23 Thread Zongze Yang
On Tue, 23 May 2023 at 19:51, Zongze Yang wrote: > Thank you for your suggestion. I solved the problem with SuperLU_DIST, and > it works well. > It was solved on four nodes, each equipped with 500 GB of memory. Best wishes, Zongze Best wishes, > Zongze > > > On Tue, 23 May 2023 at 18:00, Mat

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-23 Thread Zongze Yang
Thank you for your suggestion. I solved the problem with SuperLU_DIST, and it works well. Best wishes, Zongze On Tue, 23 May 2023 at 18:00, Matthew Knepley wrote: > On Mon, May 22, 2023 at 10:46 PM Zongze Yang wrote: > >> I have an additional question to ask: Is it possible for the SuperLU_DI
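As a reference for the switch described above, a minimal sketch of selecting SuperLU_DIST instead of MUMPS through PETSc's runtime options; the executable name and process count are hypothetical.

```bash
# Hypothetical run: the only change from a MUMPS-based direct solve is the
# factorization package selected by -pc_factor_mat_solver_type.
mpiexec -n 64 ./app \
    -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type superlu_dist
```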

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-23 Thread Matthew Knepley
On Mon, May 22, 2023 at 10:46 PM Zongze Yang wrote: > I have an additional question to ask: Is it possible for the SuperLU_DIST > library to encounter the same MPI problem (PMPI_Iprobe failed) as MUMPS? > I do not know if they use that function. But it is easy to try it out, so I would. Thank

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-23 Thread Matthew Knepley
On Mon, May 22, 2023 at 10:42 PM Zongze Yang wrote: > On Tue, 23 May 2023 at 05:31, Stefano Zampini > wrote: > >> If I may add to the discussion, it may be that you are going OOM since >> you are trying to factorize a 3 million dofs problem, this problem goes >> undetected and then fails at a la

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-22 Thread Zongze Yang
I have an additional question to ask: Is it possible for the SuperLU_DIST library to encounter the same MPI problem (PMPI_Iprobe failed) as MUMPS? Best wishes, Zongze On Tue, 23 May 2023 at 10:41, Zongze Yang wrote: > On Tue, 23 May 2023 at 05:31, Stefano Zampini > wrote: > >> If I may add to

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-22 Thread Zongze Yang
On Tue, 23 May 2023 at 05:31, Stefano Zampini wrote: > If I may add to the discussion, it may be that you are going OOM since you > are trying to factorize a 3 million dofs problem, this problem goes > undetected and then fails at a later stage > Thank you for your comment. I ran the problem wit

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-22 Thread Stefano Zampini
If I may add to the discussion, it may be that you are going OOM, since you are trying to factorize a 3-million-dof problem; this goes undetected and then fails at a later stage. On Mon, 22 May 2023 at 20:03, Zongze Yang wrote: > Thanks! > > Zongze > > Matthew Knepley 于202
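A minimal sketch of MUMPS memory controls that are commonly used when a factorization runs out of memory; these are standard ICNTL options exposed by PETSc, not options Stefano names here, and the executable is hypothetical.

```bash
# Hypothetical run. ICNTL(4)=2 prints the memory estimates up front so an OOM
# can be anticipated; ICNTL(14) raises the allowed increase (in percent) over
# the estimated working space used during numerical factorization.
mpiexec -n 64 ./app \
    -pc_type lu -pc_factor_mat_solver_type mumps \
    -mat_mumps_icntl_4 2 -mat_mumps_icntl_14 50
```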

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-22 Thread Zongze Yang
Thanks! Zongze Matthew Knepley wrote on Tue, 23 May 2023 at 00:09: > On Mon, May 22, 2023 at 11:07 AM Zongze Yang wrote: > >> Hi, >> >> I hope this letter finds you well. I am writing to seek guidance >> regarding an error I encountered while solving a matrix using MUMPS on >> multiple nodes: >> > > Iprobe

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-22 Thread Matthew Knepley
On Mon, May 22, 2023 at 11:07 AM Zongze Yang wrote: > Hi, > > I hope this letter finds you well. I am writing to seek guidance regarding > an error I encountered while solving a matrix using MUMPS on multiple nodes: > Iprobe is buggy on several MPI implementations. PETSc has an option for shutti
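The option Matthew refers to is cut off in this preview. One documented PETSc option that controls whether PETSc's own rendezvous communication uses MPI_Iprobe is -build_twosided; the sketch below shows it, with the caveats that it may not be the option meant here and that it does not affect MUMPS's internal MPI_Iprobe calls. The executable name and process count are hypothetical.

```bash
# Sketch only: -build_twosided allreduce selects PETSc's allreduce-based
# two-sided setup instead of the ibarrier algorithm, which relies on MPI_Iprobe.
mpiexec -n 64 ./app -build_twosided allreduce \
    -pc_type lu -pc_factor_mat_solver_type mumps
```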

[petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-22 Thread Zongze Yang
Hi, I hope this letter finds you well. I am writing to seek guidance regarding an error I encountered while solving a matrix using MUMPS on multiple nodes:

```bash
Abort(1681039) on node 60 (rank 60 in comm 240): Fatal error in PMPI_Iprobe: Other MPI error, error stack:
PMPI_Iprobe(124)..
```
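The original message is truncated here. For orientation, a minimal sketch of the kind of PETSc command-line setup that selects MUMPS as the parallel direct solver across nodes; the executable, rank count, and hostfile are hypothetical and not taken from the message.

```bash
# Hypothetical multi-node run using MUMPS through PETSc's LU preconditioner.
mpiexec -n 240 -hostfile hosts ./app \
    -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps
```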