[petsc-users] FEM Matrix Assembly of Overlapped Mesh

2023-07-23 Thread 袁煕
Hi, I used PETSc to assemble a FEM stiffness matrix on an overlapped (overlap=2) DMPlex and used the MUMPS solver to solve it. But I got different solutions with 1 CPU and with MPI parallel computation. I am wondering if I missed some necessary step or setting during my implementation. My calling p…
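For context, a minimal sketch (in C, assuming PETSc >= 3.18 for PetscCall) of the call sequence the question describes; the mesh options, discretization, and element loop are placeholder assumptions, not the poster's actual code:

  #include <petscdmplex.h>

  int main(int argc, char **argv)
  {
    DM  dm, dmDist;
    Mat A;

    PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
    /* Build a DMPlex from command-line options, e.g. -dm_plex_box_faces 8,8 */
    PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
    PetscCall(DMSetType(dm, DMPLEX));
    PetscCall(DMSetFromOptions(dm));
    /* Redistribute with two layers of overlap, as in the question */
    PetscCall(DMPlexDistribute(dm, 2, NULL, &dmDist));
    if (dmDist) { PetscCall(DMDestroy(&dm)); dm = dmDist; }
    /* ... set up a PetscFE/PetscSection discretization here ... */
    PetscCall(DMCreateMatrix(dm, &A));
    /* ... element loop adding local stiffness matrices with ADD_VALUES ... */
    PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
    PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));
    PetscCall(MatDestroy(&A));
    PetscCall(DMDestroy(&dm));
    PetscCall(PetscFinalize());
    return 0;
  }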

[petsc-users] Matrix assembly problem of overlapped DMPlex

2023-07-23 Thread 袁煕
Hi, I used PETSc to assemble a FEM stiffness matrix on an overlapped (overlap=2) DMPlex and used the MUMPS solver to solve it. But I got different solutions with 1 CPU and with MPI parallel computation. I am wondering if I missed some necessary step or setting during my implementation. …

Re: [petsc-users] FEM Matrix Assembly of Overlapped Mesh

2023-07-23 Thread Mark Adams
If you want a processor-independent solve with MUMPS, use '-pc_type lu'; if PETSc is configured with MUMPS, that gives you a parallel LU solve. And don't use any overlap in the DM. If you want a local LU with a global 'asm' or 'bjacobi', then you have an iterative solver and should use something like -ksp_type gmres…
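Concretely, the two configurations Mark describes would be run with options along these lines ('./myapp' is a placeholder executable; the solver options themselves are standard PETSc ones):

  # direct parallel LU through MUMPS, independent of process count
  mpiexec -n 4 ./myapp -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps

  # iterative alternative: GMRES with additive Schwarz and local LU subsolves
  mpiexec -n 4 ./myapp -ksp_type gmres -pc_type asm -sub_pc_type lu

With the iterative variant the result matches the serial solve only up to the KSP tolerance, so some dependence on the process count is expected.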

Re: [petsc-users] Matrix assembly problem of overlapped DMPlex

2023-07-23 Thread Matthew Knepley
On Sun, Jul 23, 2023 at 11:54 AM 袁煕 wrote: > Hi, I used PETSc to assemble a FEM stiffness matrix on an overlapped (overlap=2) DMPlex and used the MUMPS solver to solve it. But I got different solutions with 1 CPU and with MPI parallel computation. I am wondering if I missed some necessary step or setting during my implementation. …
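The reply itself is cut off in the archive. For what it is worth, the classic pitfall when assembling on an overlapped DMPlex is that cells in the overlap region are traversed by more than one rank, so their element contributions are added twice under ADD_VALUES. One standard remedy (an illustration, not necessarily what the truncated reply suggested) is to skip cells that arrive through the point SF, i.e. the ghost/overlap cells:

  /* Skip leaves of the point SF (ghost/overlap cells) so that each element
     contribution is added by exactly one rank. Assumes the leaf array is
     non-NULL and sorted, which holds for DMPlex point SFs. */
  PetscSF         sf;
  PetscInt        cStart, cEnd, c, nleaves, loc;
  const PetscInt *leaves;

  PetscCall(DMGetPointSF(dm, &sf));
  PetscCall(PetscSFGetGraph(sf, NULL, &nleaves, &leaves, NULL));
  PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd));
  for (c = cStart; c < cEnd; ++c) {
    PetscCall(PetscFindInt(c, nleaves, leaves, &loc));
    if (loc >= 0) continue; /* cell is owned elsewhere; that rank assembles it */
    /* ... compute elemMat for cell c, then e.g.
       DMPlexMatSetClosure(dm, NULL, NULL, A, c, elemMat, ADD_VALUES); ... */
  }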