Re: [petsc-users] Problem running ex54f with GAMG

2022-03-18 Thread Mark Adams
MG is not stable with richardson/jacobi, or at least it is almost not stable. Yeah, I would guess GMRES will go on forever because it does not care. I think CG exits when it gets a negative number for beta or whatever, and it was probably -eps. That is my guess.

On Fri, Mar 18, 2022 at 3:17 PM Barry
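For reference, the quantity Mark alludes to is the preconditioned inner product that CG's recurrences are built on. In standard PCG notation (a sketch, not taken from the thread):

```latex
z_k = M^{-1} r_k, \qquad
\alpha_k = \frac{r_k^{\top} z_k}{p_k^{\top} A p_k}, \qquad
\beta_k = \frac{r_{k+1}^{\top} z_{k+1}}{r_k^{\top} z_k}
```

CG needs r_k^T z_k > 0, which is guaranteed only when the preconditioner M is symmetric positive definite. If the GAMG V-cycle is indefinite, this inner product can turn negative and CG must stop (DIVERGED_INDEFINITE_PC); GMRES makes no such assumption, which is why it "does not care".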

Re: [petsc-users] Problem running ex54f with GAMG

2022-03-18 Thread Barry Smith
GAMG produced an indefinite operator. I don't know if there is a way to detect why this happened or how to stop it. You can try -ksp_type gmres and see how that goes, since GMRES can handle an indefinite preconditioner.

> On Mar 18, 2022, at 11:53 AM, Fabio Durastante wrote:
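A minimal sketch in C of the suggested switch (assumes an already assembled Mat A and Vecs b, x; PetscCall is the error-checking macro in recent PETSc, use ierr/CHKERRQ on older versions):

```c
#include <petscksp.h>

/* Solve A x = b with GMRES instead of CG, keeping GAMG as the
   preconditioner; GMRES tolerates an indefinite preconditioner. */
static PetscErrorCode SolveWithGMRES(Mat A, Vec b, Vec x)
{
  KSP ksp;
  PC  pc;

  PetscFunctionBeginUser;
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPGMRES)); /* same as -ksp_type gmres */
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCGAMG));     /* same as -pc_type gamg   */
  PetscCall(KSPSetFromOptions(ksp));    /* allow command-line overrides */
  PetscCall(KSPSolve(ksp, b, x));
  PetscCall(KSPDestroy(&ksp));
  PetscFunctionReturn(0);
}
```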

Re: [petsc-users] Fwd: Problem running ex54f with GAMG

2022-03-18 Thread Mark Adams
On Fri, Mar 18, 2022 at 11:44 AM Fabio Durastante wrote:

> Hi everybody,
>
> I'm trying to run the rotated anisotropy example ex54f using CG and GAMG
> as preconditioner. I run it with the command:
>
> mpirun -np 2 ./ex54f -ne 1011 \
>   -theta 18.0 \
>   -epsilon 100.0 \
>   -pc_type gamg \

Re: [petsc-users] Regarding the status of VecSetValues(Blocked) for GPU vectors

2022-03-18 Thread Matthew Knepley
On Fri, Mar 18, 2022 at 11:28 AM Sajid Ali Syed wrote:

> Hi Matt/Mark,
>
> I'm working on a Poisson solver for a distributed PIC code, where the
> particles are distributed over MPI ranks rather than the grid. Prior to
> the solve, all particles are deposited onto a (DMDA) grid.
>
> The current

Re: [petsc-users] Regarding the status of VecSetValues(Blocked) for GPU vectors

2022-03-18 Thread Junchao Zhang
On Fri, Mar 18, 2022 at 10:28 AM Sajid Ali Syed wrote:

> Hi Matt/Mark,
>
> I'm working on a Poisson solver for a distributed PIC code, where the
> particles are distributed over MPI ranks rather than the grid. Prior to
> the solve, all particles are deposited onto a (DMDA) grid.
>
> The current

Re: [petsc-users] Problem running ex54f with GAMG

2022-03-18 Thread Fabio Durastante
For the default case it reports:

Linear solve converged due to CONVERGED_ATOL iterations 14

For the other it tells:

Linear solve did not converge due to DIVERGED_INDEFINITE_PC iterations 2

That again seems a bit strange to me, since this should be a symmetric V-cycle built on smoothed aggregation that

Re: [petsc-users] Problem running ex54f with GAMG

2022-03-18 Thread Barry Smith
Run with -ksp_converged_reason to have it print why it stopped the iteration.

> On Mar 18, 2022, at 11:44 AM, Fabio Durastante wrote:
>
> Hi everybody,
>
> I'm trying to run the rotated anisotropy example ex54f using CG and GAMG
> as preconditioner. I run it with the command:
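The same information is also available programmatically after the solve; a minimal sketch (assumes an existing KSP ksp and Vecs b, x):

```c
/* Query why the iteration stopped; KSP_DIVERGED_INDEFINITE_PC is the
   enum value matching what -ksp_converged_reason prints. */
KSPConvergedReason reason;
PetscCall(KSPSolve(ksp, b, x));
PetscCall(KSPGetConvergedReason(ksp, &reason));
if (reason < 0) { /* negative reasons signal divergence */
  PetscCall(PetscPrintf(PETSC_COMM_WORLD, "Diverged: %s\n",
                        KSPConvergedReasons[reason]));
}
```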

[petsc-users] Fwd: Problem running ex54f with GAMG

2022-03-18 Thread Fabio Durastante
Hi everybody,

I'm trying to run the rotated anisotropy example ex54f using CG and GAMG as preconditioner. I run it with the command:

mpirun -np 2 ./ex54f -ne 1011 \
  -theta 18.0 \
  -epsilon 100.0 \
  -pc_type gamg \
  -pc_gamg_type agg \
  -log_view \
  -log_trace \
  -ksp_view \
  -ksp_monitor \
  -ksp_type

Re: [petsc-users] Regarding the status of VecSetValues(Blocked) for GPU vectors

2022-03-18 Thread Sajid Ali Syed
Hi Matt/Mark,

I'm working on a Poisson solver for a distributed PIC code, where the particles are distributed over MPI ranks rather than the grid. Prior to the solve, all particles are deposited onto a (DMDA) grid.

The current prototype I have is that each rank holds a full-size DMDA vector
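For context, one common pattern for this deposition step is VecSetValues with ADD_VALUES, which lets any rank contribute to any grid point and sums the off-process contributions during assembly. A 1D nearest-grid-point sketch (all names and the uniform spacing h are illustrative, not from the thread; in 1D the PETSc and natural orderings of a DMDA coincide):

```c
#include <petscdmda.h>

/* Sketch: deposit particles that may live on any rank into a DMDA
   global vector rho. VecAssemblyBegin/End communicates and sums the
   off-process contributions inserted with ADD_VALUES. */
static PetscErrorCode DepositParticles(Vec rho, PetscInt np,
                                       const PetscReal *px,  /* positions */
                                       const PetscReal *q,   /* charges   */
                                       PetscReal h)          /* spacing   */
{
  PetscFunctionBeginUser;
  PetscCall(VecZeroEntries(rho));
  for (PetscInt p = 0; p < np; ++p) {
    PetscInt    i = (PetscInt)(px[p] / h + 0.5); /* nearest grid index */
    PetscScalar v = q[p];
    PetscCall(VecSetValues(rho, 1, &i, &v, ADD_VALUES)); /* may be off-process */
  }
  PetscCall(VecAssemblyBegin(rho));
  PetscCall(VecAssemblyEnd(rho));
  PetscFunctionReturn(0);
}
```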

[petsc-users] DMSwarm

2022-03-18 Thread Joauma Marichal
Hello,

I am writing to you because I am trying to add a Lagrangian particle tracking method to my Eulerian solver, which relies on a 3D collocated DMDA. I have been using the examples to develop a first basic code. The latter creates particles on rank 0 with random coordinates over the whole domain
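For reference, DMSwarm's PIC mode can redistribute particles created on one rank to the ranks that own their cells. A hedged sketch (the "weight" field, local sizes, and buffer value are illustrative, not from the message):

```c
#include <petscdmswarm.h>

/* Sketch: a DMSwarm in PIC mode bound to an existing 3D DMDA (celldm).
   After coordinates are set (e.g. randomly on rank 0), DMSwarmMigrate
   ships each particle to the rank that owns its cell. */
static PetscErrorCode CreateSwarm(DM celldm, PetscInt npoints_local, DM *swarm)
{
  PetscFunctionBeginUser;
  PetscCall(DMCreate(PETSC_COMM_WORLD, swarm));
  PetscCall(DMSetType(*swarm, DMSWARM));
  PetscCall(DMSetDimension(*swarm, 3));
  PetscCall(DMSwarmSetType(*swarm, DMSWARM_PIC));
  PetscCall(DMSwarmSetCellDM(*swarm, celldm));
  /* one illustrative per-particle field besides the built-in coordinates */
  PetscCall(DMSwarmRegisterPetscDatatypeField(*swarm, "weight", 1, PETSC_REAL));
  PetscCall(DMSwarmFinalizeFieldRegister(*swarm));
  PetscCall(DMSwarmSetLocalSizes(*swarm, npoints_local, 4 /* resize buffer */));
  /* ... write coordinates into the DMSwarmPICField_coor field via
     DMSwarmGetField()/DMSwarmRestoreField() ... */
  PetscCall(DMSwarmMigrate(*swarm, PETSC_TRUE)); /* PETSC_TRUE: drop sent points */
  PetscFunctionReturn(0);
}
```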