Re: [petsc-users] [SLEPc] Performance of Krylov-Schur with MUMPS-based shift-and-invert

2018-02-28 Thread Jose E. Roman
Balancing may reduce the norm of the matrix or the condition number of some eigenvalues. It applies to the ST operator, (A-sigma*B)^{-1}*B in the case of shift-and-invert. The case where sigma is close to an eigenvalue is usually not a problem, provided that you use a robust direct solver (MUMPS). In
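For concreteness, here is a minimal C sketch of that setup, assuming A and B are already assembled Mats and sigma is the chosen shift. The function name is illustrative, and PCFactorSetMatSolverPackage is the PETSc 3.8-era call (renamed PCFactorSetMatSolverType in later releases):

#include <slepceps.h>

/* Sketch: Krylov-Schur with MUMPS-based shift-and-invert.
   The ST operator applied internally is (A-sigma*B)^{-1}*B. */
PetscErrorCode SolveWithShiftAndInvert(Mat A, Mat B, PetscScalar sigma)
{
  EPS            eps;
  ST             st;
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  ierr = EPSCreate(PETSC_COMM_WORLD,&eps);CHKERRQ(ierr);
  ierr = EPSSetOperators(eps,A,B);CHKERRQ(ierr);
  ierr = EPSSetProblemType(eps,EPS_GNHEP);CHKERRQ(ierr); /* use EPS_GHEP if symmetric */
  ierr = EPSSetType(eps,EPSKRYLOVSCHUR);CHKERRQ(ierr);
  ierr = EPSSetTarget(eps,sigma);CHKERRQ(ierr);
  ierr = EPSSetWhichEigenpairs(eps,EPS_TARGET_MAGNITUDE);CHKERRQ(ierr);

  /* Shift-and-invert spectral transformation */
  ierr = EPSGetST(eps,&st);CHKERRQ(ierr);
  ierr = STSetType(st,STSINVERT);CHKERRQ(ierr);

  /* Robust direct solver (MUMPS) for the shifted linear systems */
  ierr = STGetKSP(st,&ksp);CHKERRQ(ierr);
  ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCLU);CHKERRQ(ierr);
  ierr = PCFactorSetMatSolverPackage(pc,MATSOLVERMUMPS);CHKERRQ(ierr);

  ierr = EPSSolve(eps);CHKERRQ(ierr);
  ierr = EPSDestroy(&eps);CHKERRQ(ierr);
  return 0;
}

The same configuration can be selected at run time with -st_type sinvert -st_ksp_type preonly -st_pc_type lu -st_pc_factor_mat_solver_package mumps.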

[petsc-users] Multi-preconditioned Krylov

2018-02-28 Thread Karin&NiKo
Dear PETSc team, I would like to experiment with multi-preconditioned Krylov methods, as presented in the paper of Bridson and Greif ( https://www.cs.ubc.ca/~rbridson/mpcg/) and more specifically in a domain decomposition (DD) context as in the paper of Spillane ( https://hal.archives-ouvertes.fr/hal-01170059/document)

Re: [petsc-users] object name overwritten in VecView

2018-02-28 Thread Matthew Knepley
On Wed, Feb 28, 2018 at 12:39 AM, Smith, Barry F. wrote: > > Matt, > > I have confirmed this is reproducible and a bug. The problem arises > because > > frame #0: 0x00010140625a libpetsc.3.8.dylib` > PetscViewerVTKAddField_VTK(viewer=0x7fe66760c750, > dm=0x7fe668810820, > PetscVi

Re: [petsc-users] object name overwritten in VecView

2018-02-28 Thread Danyang Su
Hi Matt, Thanks for your suggestion and I will use xmf instead. Regards, Danyang On February 28, 2018 3:58:08 AM PST, Matthew Knepley wrote: >On Wed, Feb 28, 2018 at 12:39 AM, Smith, Barry F. >wrote: > >> >> Matt, >> >> I have confirmed this is reproducible and a bug. The problem arises

[petsc-users] Scaling problem when cores > 600

2018-02-28 Thread TAY wee-beng
Hi, I have a CFD code which uses PETSc and HYPRE. I found that for a certain case with a grid size of 192,570,048, I encounter a scaling problem when my cores > 600. At 600 cores, the code took 10min for 100 time steps. At 960, 1440 and 2880 cores, it still takes around 10min. At 360 cores, it to

Re: [petsc-users] Scaling problem when cores > 600

2018-02-28 Thread Matthew Knepley
On Wed, Feb 28, 2018 at 10:45 AM, TAY wee-beng wrote: > Hi, > > I have a CFD code which uses PETSc and HYPRE. I found that for a certain > case with grid size of 192,570,048, I encounter scaling problem when my > cores > 600. At 600 cores, the code took 10min for 100 time steps. At 960, > 1440 an

Re: [petsc-users] object name overwritten in VecView

2018-02-28 Thread Smith, Barry F.
It turns out the fix is really easy. Here is a patch. Apply it with "patch -p1 < barry-vtk.patch", then run "make gnumake all" in $PETSC_DIR > On Feb 28, 2018, at 9:07 AM, Danyang Su wrote: > > Hi Matt, > > Thanks for your suggestion and I will use xmf instead. > > Regards, > >

Re: [petsc-users] object name overwritten in VecView

2018-02-28 Thread Danyang Su
Hi Barry and Matt, Thanks for your quick response. Considering the output performance, as well as the long-term plan of PETSc development, which format would you suggest? I personally prefer the data format that can be post-processed by Paraview as our sequential code (written without PETSc) i

Re: [petsc-users] object name overwritten in VecView

2018-02-28 Thread Matthew Knepley
On Wed, Feb 28, 2018 at 11:59 AM, Danyang Su wrote: > Hi Barry and Matt, > > Thanks for your quick response. Considering the output performance, as > well as the long-term plan of PETSc development, which format would you > suggest? I personally prefer the data format that can be post-processed b

Re: [petsc-users] Malloc error with 'correct' preallocation?

2018-02-28 Thread Thibaut Appel
Good afternoon, It looks like, after further investigation, I wasn't filling the diagonal element on some rows and hence not allocating for it - which triggers an error, as you must leave room for and set the diagonal entry even if it is zero, according to the MatMPIAIJSetPreallocation documentati
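In code form, a minimal sketch of what the documentation asks for, with illustrative preallocation counts; the key point is that the diagonal slot is both preallocated and explicitly set, even with a zero value:

#include <petscmat.h>

/* Sketch: preallocate an MPIAIJ matrix and always set the diagonal
   entry, even when it is numerically zero, so the slot exists. */
PetscErrorCode AssembleWithDiagonal(MPI_Comm comm, PetscInt n)
{
  Mat            A;
  PetscInt       rstart,rend,i;
  PetscScalar    zero = 0.0;
  PetscErrorCode ierr;

  ierr = MatCreate(comm,&A);CHKERRQ(ierr);
  ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,n,n);CHKERRQ(ierr);
  ierr = MatSetType(A,MATMPIAIJ);CHKERRQ(ierr);
  /* Illustrative counts: 3 nonzeros per row in the diagonal block,
     2 in the off-diagonal block; use the true pattern of your operator
     and remember to count the diagonal entry itself. */
  ierr = MatMPIAIJSetPreallocation(A,3,NULL,2,NULL);CHKERRQ(ierr);

  ierr = MatGetOwnershipRange(A,&rstart,&rend);CHKERRQ(ierr);
  for (i=rstart; i<rend; i++) {
    /* ... insert the off-diagonal entries of row i here ... */
    /* always touch the diagonal, even with a zero value */
    ierr = MatSetValues(A,1,&i,1,&i,&zero,INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  return 0;
}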

Re: [petsc-users] Multi-preconditioned Krylov

2018-02-28 Thread Smith, Barry F.
Tried to read the papers, couldn't follow them, but my guess is you need to copy the PETSc CG KSP routines and rework them as a new KSP type for these algorithms. Barry > On Feb 28, 2018, at 3:39 AM, Karin&NiKo wrote: > > Dear PETSc team, > > I would like to experiment with multi-preco
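As a starting point, here is a heavily simplified sketch of what registering such a new KSP type looks like. All names here (mpcg, KSPSolve_MPCG, RegisterMPCG) are hypothetical, the solve body is a placeholder, and a real implementation would copy and rework the CG routines from src/ksp/ksp/impls/cg as suggested above:

#include <petsc/private/kspimpl.h>  /* private header, needed for ksp->ops */

/* Placeholder solve: the multi-preconditioned CG iteration would go here */
static PetscErrorCode KSPSolve_MPCG(KSP ksp)
{
  ksp->its    = 0;
  ksp->reason = KSP_CONVERGED_ITS; /* placeholder convergence reason */
  return 0;
}

static PetscErrorCode KSPSetUp_MPCG(KSP ksp)
{
  PetscErrorCode ierr;
  ierr = KSPSetWorkVecs(ksp,3);CHKERRQ(ierr); /* work vectors, as in the CG implementation */
  return 0;
}

static PetscErrorCode KSPCreate_MPCG(KSP ksp)
{
  ksp->ops->setup   = KSPSetUp_MPCG;
  ksp->ops->solve   = KSPSolve_MPCG;
  ksp->ops->destroy = KSPDestroyDefault;
  return 0;
}

/* Register once at startup; afterwards the type can be selected with
   KSPSetType(ksp,"mpcg") or -ksp_type mpcg */
PetscErrorCode RegisterMPCG(void)
{
  PetscErrorCode ierr;
  ierr = KSPRegister("mpcg",KSPCreate_MPCG);CHKERRQ(ierr);
  return 0;
}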

Re: [petsc-users] Multi-preconditioned Krylov

2018-02-28 Thread Jed Brown
Nicole has expressed interest in helping with a PETSc implementation; she just doesn't have much experience with PETSc yet, and I haven't had time to prioritize doing it myself. If you want to develop a PETSc implementation, I would suggest reaching out to her and Cc'ing me. "Smith, Barry F." w

Re: [petsc-users] Multi-preconditioned Krylov

2018-02-28 Thread Matthew Knepley
If you do this, please please please also update NGMRES, since it looks like you would do a similar linear least squares thing with the directions. It should be factored out. SLEPc has a nice TSQR that we could pull into PETSc for this. Matt On Feb 28, 2018 14:36, "Jed Brown" wrote: > Nicole

Re: [petsc-users] Scaling problem when cores > 600

2018-02-28 Thread TAY wee-beng
On 1/3/2018 12:10 AM, Matthew Knepley wrote: On Wed, Feb 28, 2018 at 10:45 AM, TAY wee-beng wrote: Hi, I have a CFD code which uses PETSc and HYPRE. I found that for a certain case with grid size of 192,570,048, I encounter scaling problem when my cor

Re: [petsc-users] Scaling problem when cores > 600

2018-02-28 Thread Matthew Knepley
On Wed, Feb 28, 2018 at 9:01 PM, TAY wee-beng wrote: > > On 1/3/2018 12:10 AM, Matthew Knepley wrote: > > On Wed, Feb 28, 2018 at 10:45 AM, TAY wee-beng wrote: > >> Hi, >> >> I have a CFD code which uses PETSc and HYPRE. I found that for a certain >> case with grid size of 192,570,048, I encount

Re: [petsc-users] Scaling problem when cores > 600

2018-02-28 Thread Mark Adams
> > > Or do I have to use KSPBCGS or KSPGMRES, which is directly from PETSc? > However, I ran KSPGMRES yesterday with the Poisson eqn and my answer didn't > converge. As Matt said, GMRES is not great for symmetric operators like Poisson, and you can use CG for the KSP method. HYPRE and GAMG are both
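In code, a minimal sketch of that recommendation, assuming A, b and x are already set up (SolvePoisson is an illustrative name):

#include <petscksp.h>

/* Sketch: CG with algebraic multigrid for a symmetric Poisson operator.
   Swap PCGAMG for PCHYPRE to compare GAMG with BoomerAMG. */
PetscErrorCode SolvePoisson(Mat A, Vec b, Vec x)
{
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
  ierr = KSPSetType(ksp,KSPCG);CHKERRQ(ierr);  /* CG suits the SPD Poisson system */
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCGAMG);CHKERRQ(ierr);   /* or PCHYPRE */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* allow -ksp_type/-pc_type overrides */
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  return 0;
}

The same can be selected at run time with -ksp_type cg -pc_type gamg (or -pc_type hypre).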

Re: [petsc-users] Scaling problem when cores > 600

2018-02-28 Thread Smith, Barry F.
> On Feb 28, 2018, at 8:01 PM, TAY wee-beng wrote: > > > On 1/3/2018 12:10 AM, Matthew Knepley wrote: >> On Wed, Feb 28, 2018 at 10:45 AM, TAY wee-beng wrote: >> Hi, >> >> I have a CFD code which uses PETSc and HYPRE. I found that for a certain >> case with grid size of 192,570,048, I encou

Re: [petsc-users] Scaling problem when cores > 600

2018-02-28 Thread TAY wee-beng
On 1/3/2018 10:07 AM, Matthew Knepley wrote: On Wed, Feb 28, 2018 at 9:01 PM, TAY wee-beng wrote: On 1/3/2018 12:10 AM, Matthew Knepley wrote: On Wed, Feb 28, 2018 at 10:45 AM, TAY wee-beng wrote: Hi, I have a CFD

Re: [petsc-users] Scaling problem when cores > 600

2018-02-28 Thread Smith, Barry F.
> On Feb 28, 2018, at 10:59 PM, TAY wee-beng wrote: > > > On 1/3/2018 10:07 AM, Matthew Knepley wrote: >> On Wed, Feb 28, 2018 at 9:01 PM, TAY wee-beng wrote: >> >> On 1/3/2018 12:10 AM, Matthew Knepley wrote: >>> On Wed, Feb 28, 2018 at 10:45 AM, TAY wee-beng wrote: >>> Hi, >>> >>> I have