Re: [petsc-users] parallel computing error

2023-05-03 Thread Barry Smith
You can configure PETSc with MUMPS: ./configure --download-mumps --download-scalapack --download-ptscotch --download-metis --download-parmetis. Then use MatMatSolve() as in src/mat/tests/ex125.c, with MUMPS as the solver for the parallel MatMatSolve(). Barry > On May 3, 2023, at 10:29 PM, Kwon
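A minimal sketch of the workflow Barry describes, loosely following src/mat/tests/ex125.c. The toy tridiagonal matrix, its size, and the number of right-hand sides are invented for illustration, and error handling is abbreviated:

    #include <petscmat.h>

    int main(int argc, char **argv)
    {
      Mat           A, F, B, X;
      MatFactorInfo info;
      PetscInt      n = 10, nrhs = 2, i, rstart, rend;

      PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

      /* Toy tridiagonal system distributed across all ranks */
      PetscCall(MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, n, 3, NULL, 2, NULL, &A));
      PetscCall(MatGetOwnershipRange(A, &rstart, &rend));
      for (i = rstart; i < rend; i++) {
        PetscScalar  v[3]    = {-1.0, 2.0, -1.0};
        PetscInt     cols[3] = {i - 1, i, i + 1};
        PetscInt     ncols   = 3, *cp = cols;
        PetscScalar *vp      = v;
        if (i == 0) { cp++; vp++; ncols--; }   /* drop the out-of-range left neighbor */
        if (i == n - 1) ncols--;               /* drop the out-of-range right neighbor */
        PetscCall(MatSetValues(A, 1, &i, ncols, cp, vp, INSERT_VALUES));
      }
      PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
      PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

      /* Dense right-hand sides and solution, with the same row layout as A */
      PetscCall(MatCreateDense(PETSC_COMM_WORLD, rend - rstart, PETSC_DECIDE, n, nrhs, NULL, &B));
      PetscCall(MatSetRandom(B, NULL));
      PetscCall(MatDuplicate(B, MAT_DO_NOT_COPY_VALUES, &X));

      /* LU factorization through MUMPS, then one solve for all right-hand sides */
      PetscCall(MatGetFactor(A, MATSOLVERMUMPS, MAT_FACTOR_LU, &F));
      PetscCall(MatFactorInfoInitialize(&info));
      PetscCall(MatLUFactorSymbolic(F, A, NULL, NULL, &info)); /* MUMPS picks its own ordering */
      PetscCall(MatLUFactorNumeric(F, A, &info));
      PetscCall(MatMatSolve(F, B, X));

      PetscCall(MatDestroy(&X));
      PetscCall(MatDestroy(&B));
      PetscCall(MatDestroy(&F));
      PetscCall(MatDestroy(&A));
      PetscCall(PetscFinalize());
      return 0;
    }

After configuring as above, this can be launched with mpirun -np 4 ./app (binary name as in the question); MUMPS then performs both the factorization and the multi-right-hand-side solve in parallel.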

Re: [petsc-users] sources of floating point randomness in JFNK in serial

2023-05-03 Thread Barry Smith
Do they start very similarly and then slowly drift further apart? That is, are the first couple of KSP iterations almost identical, with each subsequent iteration drifting a bit further? Similarly for the SNES iterations: starting close, and then with more iterations and more solves they start moving

Re: [petsc-users] parallel computing error

2023-05-03 Thread Seung Lee Kwon / Student / Department of Aerospace Engineering
Dear developers, Thank you for your explanation. But I have to use MatCreateSeqDense, because I want to use MatMatSolve, whose B matrix must be a SeqDense matrix. Using MatMatSolve is an unavoidable part of my code. Could you give me a comment on how to avoid this error? Best, Seung Lee Kwon 2023
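For reference, a fragment of the usual parallel-safe alternative, assuming A is an already assembled parallel matrix on PETSC_COMM_WORLD and nrhs is a placeholder for the number of right-hand sides. MatCreateDense() yields a dense matrix whose rows are distributed like those of A, which the parallel MatMatSolve() path (e.g. through MUMPS, as in Barry's reply) accepts:

    /* Sketch: dense RHS matrix distributed to match A, instead of MatCreateSeqDense().
       A is assumed assembled on PETSC_COMM_WORLD; nrhs is a placeholder. */
    Mat      B;
    PetscInt m, M, nrhs = 1;

    PetscCall(MatGetLocalSize(A, &m, NULL)); /* local row count of A on this rank */
    PetscCall(MatGetSize(A, &M, NULL));      /* global row count of A */
    PetscCall(MatCreateDense(PETSC_COMM_WORLD, m, PETSC_DECIDE, M, nrhs, NULL, &B));
    /* fill with MatSetValues(), then MatAssemblyBegin()/MatAssemblyEnd() */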

Re: [petsc-users] sources of floating point randomness in JFNK in serial

2023-05-03 Thread Mark Lohry
This is on a single MPI rank. I haven't checked the coloring; I was just guessing there. But the solutions/residuals are slightly different from run to run. Is it fair to say that for serial JFNK/asm ilu0/gmres we should expect bitwise identical results? On Wed, May 3, 2023, 8:50 PM Barry Smith wrote:

Re: [petsc-users] sources of floating point randomness in JFNK in serial

2023-05-03 Thread Barry Smith
No, the coloring should be identical every time. Do you see differences with 1 MPI rank? (Or much smaller ones?) > On May 3, 2023, at 8:42 PM, Mark Lohry wrote: > > I'm running multiple iterations of newtonls with an MFFD/JFNK nonlinear > solver where I give it the sparsity. PC asm, KSP

[petsc-users] sources of floating point randomness in JFNK in serial

2023-05-03 Thread Mark Lohry
I'm running multiple iterations of newtonls with an MFFD/JFNK nonlinear solver where I give it the sparsity. PC asm, KSP gmres, with SNESSetLagJacobian -2 (compute the Jacobian once, then keep it frozen). I'm seeing slight (<1%) but nonzero differences in residuals from run to run. I'm wondering where rand
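For context, a hedged sketch of the runtime options corresponding to the setup described above, written as they would appear in a PETSc options file; the option names are standard PETSc, and ILU(0) is simply the default level for the ILU subdomain solver:

    # JFNK: matrix-free Jacobian action, assembled matrix used only for the preconditioner
    -snes_type newtonls
    -snes_mf_operator
    # compute the Jacobian once at the first solve, then keep it frozen
    -snes_lag_jacobian -2
    -ksp_type gmres
    # additive Schwarz with ILU(0) subdomain solves
    -pc_type asm
    -sub_pc_type ilu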

Re: [petsc-users] parallel computing error

2023-05-03 Thread Matthew Knepley
On Wed, May 3, 2023 at 6:05 AM Seung Lee Kwon / Student / Department of Aerospace Engineering wrote: > Dear developers > > I'm trying to use parallel computing and I ran the command 'mpirun -np 4 > ./app' > > In this case, there are two problems. > > *First,* I encountered the error message /// [0]PETSC ERROR: [1]PETSC ERROR:

[petsc-users] parallel computing error

2023-05-03 Thread Seung Lee Kwon / Student / Department of Aerospace Engineering
Dear developers, I'm trying to use parallel computing and I ran the command 'mpirun -np 4 ./app'. In this case, there are two problems. *First,* I encountered the error message /// [0]PETSC ERROR: [1]PETSC ERROR: - Error Message -

Re: [petsc-users] Scalable Solver for Incompressible Flow

2023-05-03 Thread Sebastian Blauth
First of all, yes, you are correct that I am trying to solve the stationary incompressible Navier-Stokes equations. On 02.05.2023 21:33, Matthew Knepley wrote: On Tue, May 2, 2023 at 2:29 PM Jed Brown wrote: Sebastian Blauth sebastian.bla...@itwm.fraunhofe