Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-22 Thread Zongze Yang
I have an additional question: is it possible for the SuperLU_DIST library to encounter the same MPI problem (the PMPI_Iprobe failure) as MUMPS? Best wishes, Zongze On Tue, 23 May 2023 at 10:41, Zongze Yang wrote: > On Tue, 23 May 2023 at 05:31, Stefano Zampini wrote: >> If I may add
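One way to answer that empirically is to switch the LU backend at runtime and see whether the abort persists. The PETSc options below are the standard ones for selecting the factorization package; the executable name, process count, and any problem-specific options are placeholders, so treat this as a sketch rather than the actual command line used in the thread.

```bash
# Re-run the same case with SuperLU_DIST instead of MUMPS as the LU backend.
# If the PMPI_Iprobe abort disappears, the failure is likely tied to how
# MUMPS uses MPI_Iprobe rather than to the matrix or the cluster setup.
mpiexec -n 64 ./myapp \
  -ksp_type preonly \
  -pc_type lu \
  -pc_factor_mat_solver_type superlu_dist   # use "mumps" to switch back
```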

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-22 Thread Zongze Yang
On Tue, 23 May 2023 at 05:31, Stefano Zampini wrote: > If I may add to the discussion, it may be that you are going OOM, since you are trying to factorize a 3 million dof problem; the out-of-memory condition goes undetected and the run then fails at a later stage. Thank you for your comment. I ran the problem

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-22 Thread Stefano Zampini
If I may add to the discussion, it may be that you are going OOM, since you are trying to factorize a 3 million dof problem; the out-of-memory condition goes undetected and the run then fails at a later stage. On Mon, 22 May 2023 at 20:03, Zongze Yang wrote: > Thanks! > Zongze > Matthew Knepley
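If the out-of-memory hypothesis is right, MUMPS's own controls (exposed by PETSc as runtime options) can make the failure visible or reduce the memory pressure. The option names below are the standard PETSc/MUMPS ones; the specific values and the launcher line are illustrative assumptions, not what was run in the thread.

```bash
# ICNTL(4)=2  : MUMPS print level: errors, warnings and main statistics
#               (look for negative INFOG(1)/INFOG(2) values on failure).
# ICNTL(14)=50: allow 50% extra working space over the analysis estimate.
# ICNTL(22)=1 : out-of-core factorization, trading disk I/O for memory.
mpiexec -n 64 ./myapp \
  -pc_type lu -pc_factor_mat_solver_type mumps \
  -mat_mumps_icntl_4 2 \
  -mat_mumps_icntl_14 50 \
  -mat_mumps_icntl_22 1
```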

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-22 Thread Zongze Yang
Thanks! Zongze. Matthew Knepley wrote on Tue, 23 May 2023 at 00:09: > On Mon, May 22, 2023 at 11:07 AM Zongze Yang wrote: >> Hi, I hope this letter finds you well. I am writing to seek guidance regarding an error I encountered while solving a matrix using MUMPS on multiple nodes: > Iprobe

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-22 Thread Matthew Knepley
On Mon, May 22, 2023 at 11:07 AM Zongze Yang wrote: > Hi, I hope this letter finds you well. I am writing to seek guidance regarding an error I encountered while solving a matrix using MUMPS on multiple nodes: Iprobe is buggy on several MPI implementations. PETSc has an option for
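Since the diagnosis points at Iprobe bugs in particular MPI implementations, a useful first step is to identify which MPI the code is built and run with; the error-stack format in the original report looks like MPICH-derived output. The commands below are a sketch, with the executable name and process count as placeholders.

```bash
# Report the MPI implementation and version used by the launcher.
mpiexec --version

# The footer of a -log_view report also shows PETSc's configure options,
# including how MPI was obtained (e.g. --download-mpich or --with-mpi-dir).
mpiexec -n 2 ./myapp -log_view
```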

[petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-22 Thread Zongze Yang
Hi, I hope this letter finds you well. I am writing to seek guidance regarding an error I encountered while solving a matrix using MUMPS on multiple nodes:

```bash
Abort(1681039) on node 60 (rank 60 in comm 240): Fatal error in PMPI_Iprobe: Other MPI error, error stack:
```
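When reporting a failure like this, it usually helps to attach solver and memory diagnostics from a configuration that does complete (for example, a smaller problem on the same nodes), since these summaries are printed at PetscFinalize. The options below are standard PETSc ones; the launcher line itself is a placeholder sketch.

```bash
# -ksp_view     : record exactly which solver / factorization setup was used
# -memory_view  : per-process memory usage summary at the end of the run
# -log_view     : timing and flop summary, showing how far the run progressed
mpiexec -n 64 ./myapp -ksp_view -memory_view -log_view
```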

Re: [petsc-users] DMGetCoordinatesLocal and DMPlexGetCellCoordinates in PETSc > 3.18

2023-05-22 Thread Matthew Knepley
On Mon, May 22, 2023 at 4:41 AM Berend van Wachem wrote: > Dear Matt, I'm really sorry for this stupid bug! No problem. You have really helped me get the bugs out of Plex. Thanks, Matt > I can confirm that setting the coordinates with both CellCoordinatesLocal and

Re: [petsc-users] DMGetCoordinatesLocal and DMPlexGetCellCoordinates in PETSc > 3.18

2023-05-22 Thread Berend van Wachem
Dear Matt, I'm really sorry for this stupid bug! I can confirm that setting the coordinates with both CellCoordinatesLocal and CoordinatesLocal works. Many thanks and best regards, Berend. On 5/17/23 23:04, Matthew Knepley wrote: On Wed, May 17, 2023 at 2:01 PM Berend