On Tue, 23 May 2023 at 20:09, Yann Jobic wrote:
> If I may, you can use the command line option "-mat_mumps_icntl_4 2".
> MUMPS then gives information about the factorization step, such as the
> estimated memory needed.

Thank you for your suggestion!

Best wishes,
Zongze
If I may, you can use the command line option "-mat_mumps_icntl_4 2".
MUMPS then gives information about the factorization step, such as the
estimated memory needed.
Best regards,
Yann
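Yann's suggestion can be sketched as a full command line. This is a hedged sketch only: the executable name `./my_solver` and the rank count are hypothetical placeholders, while the `-pc_type lu` and `-pc_factor_mat_solver_type mumps` options are the standard PETSc way of selecting MUMPS as the direct solver.

```bash
# Sketch only: "./my_solver" and "-n 64" are hypothetical placeholders.
# ICNTL(4)=2 raises MUMPS's output level so that it prints errors,
# warnings, and main statistics during analysis and factorization,
# including the estimated working memory needed per process.
mpiexec -n 64 ./my_solver \
    -pc_type lu \
    -pc_factor_mat_solver_type mumps \
    -mat_mumps_icntl_4 2
```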
On 5/23/2023 at 11:59 AM, Matthew Knepley wrote:
On Tue, 23 May 2023 at 19:51, Zongze Yang wrote:
> Thank you for your suggestion. I solved the problem with SuperLU_DIST, and
> it works well.

The problem was solved on four nodes, each equipped with 500 GB of memory.

Best wishes,
Zongze
Thank you for your suggestion. I solved the problem with SuperLU_DIST, and
it works well.
Best wishes,
Zongze
On Tue, 23 May 2023 at 18:00, Matthew Knepley wrote:
> On Mon, May 22, 2023 at 10:46 PM Zongze Yang wrote:
>
>> I have an additional question to ask: Is it possible for the SuperLU_DIST
>> library to encounter the same MPI problem (PMPI_Iprobe failed) as MUMPS?
On Mon, May 22, 2023 at 10:46 PM Zongze Yang wrote:
> I have an additional question to ask: Is it possible for the SuperLU_DIST
> library to encounter the same MPI problem (PMPI_Iprobe failed) as MUMPS?
>
I do not know if they use that function. But it is easy to try it out, so I
would.
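Trying it out, as suggested, needs no code change in PETSc: the factorization package can be swapped at runtime. A hedged sketch, with `./my_solver` and the rank count as placeholders:

```bash
# Sketch only: "./my_solver" and "-n 64" are placeholders.
# The same program can be re-run with SuperLU_DIST in place of MUMPS by
# changing only the solver-type option; no recompilation is needed.
mpiexec -n 64 ./my_solver \
    -pc_type lu \
    -pc_factor_mat_solver_type superlu_dist
```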
On Mon, May 22, 2023 at 10:42 PM Zongze Yang wrote:
> On Tue, 23 May 2023 at 05:31, Stefano Zampini wrote:
>
>> If I may add to the discussion, it may be that you are going OOM since
>> you are trying to factorize a 3 million dofs problem, this problem goes
>> undetected and then fails at a later stage.
I have an additional question to ask: Is it possible for the SuperLU_DIST
library to encounter the same MPI problem (PMPI_Iprobe failed) as MUMPS?
Best wishes,
Zongze
On Tue, 23 May 2023 at 10:41, Zongze Yang wrote:
On Tue, 23 May 2023 at 05:31, Stefano Zampini wrote:
> If I may add to the discussion, it may be that you are going OOM since you
> are trying to factorize a 3 million dofs problem, this problem goes
> undetected and then fails at a later stage
>
Thank you for your comment. I ran the problem
If I may add to the discussion, it may be that you are going OOM since you
are trying to factorize a 3 million dofs problem, this problem goes
undetected and then fails at a later stage.
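If an out-of-memory condition during factorization is indeed the culprit, one common workaround is to give MUMPS more headroom over its analysis-phase memory estimate via ICNTL(14), the percentage increase it allows over the estimated working space. A hedged sketch, with `./my_solver` and the rank count as placeholders:

```bash
# Sketch only: "./my_solver" and "-n 64" are placeholders.
# ICNTL(14) sets the percentage by which MUMPS may exceed its estimated
# working space; raising it above the default can avoid failures when the
# estimate is too low, at the cost of a larger allocation.
mpiexec -n 64 ./my_solver \
    -pc_type lu \
    -pc_factor_mat_solver_type mumps \
    -mat_mumps_icntl_14 80
```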
On Mon, 22 May 2023 at 20:03, Zongze Yang wrote:
> Thanks!
>
> Zongze

Thanks!

Zongze
On Tue, 23 May 2023 at 00:09, Matthew Knepley wrote:
> On Mon, May 22, 2023 at 11:07 AM Zongze Yang wrote:
>
>> Hi,
>>
>> I hope this letter finds you well. I am writing to seek guidance
>> regarding an error I encountered while solving a matrix using MUMPS on
>> multiple nodes:
>
> Iprobe is buggy on several MPI implementations. PETSc has an option for
On Mon, May 22, 2023 at 11:07 AM Zongze Yang wrote:
> Hi,
>
> I hope this letter finds you well. I am writing to seek guidance regarding
> an error I encountered while solving a matrix using MUMPS on multiple nodes:
>
Iprobe is buggy on several MPI implementations. PETSc has an option for
Hi,
I hope this letter finds you well. I am writing to seek guidance regarding
an error I encountered while solving a matrix using MUMPS on multiple nodes:
```bash
Abort(1681039) on node 60 (rank 60 in comm 240): Fatal error in
PMPI_Iprobe: Other MPI error, error stack:
```