[petsc-users] SOLVE + PC combination for 7 point stencil (unstructured) poisson solution

2023-06-26 Thread Vanella, Marcos (Fed) via petsc-users
Hi, I was wondering if anyone has experience on what combinations are more efficient to solve a Poisson problem derived from a 7 point stencil on a single mesh (serial). I've been doing some tests of multigrid and cholesky on a 50^3 mesh. -pc_type mg takes about 75% more time than -pc_type cholesky
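A minimal sketch of how such a comparison is typically driven from the command line (the executable name is a placeholder; the options themselves are standard PETSc options):

    # geometric multigrid
    ./poisson -ksp_type cg -pc_type mg -ksp_converged_reason -log_view

    # sparse Cholesky for comparison
    ./poisson -ksp_type preonly -pc_type cholesky -log_view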

[petsc-users] Using DMDA for a block-structured grid approach

2023-06-26 Thread Srikanth Sathyanarayana
Dear PETSc developers, I am currently working on a Gyrokinetic code where I essentially have to implement a block structured grid approach in one of the subdomains of the phase space coordinates. I have attached one such example in the x - v_parallel subdomains where I go from a full grid to
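For readers unfamiliar with the API under discussion, a minimal sketch of creating one structured 2D block with DMDA (the 64 x 32 sizes and single dof are made-up placeholders, not taken from the attached example):

    #include <petscdmda.h>

    int main(int argc, char **argv)
    {
      DM  da;
      Vec f;

      PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
      /* hypothetical 64 x 32 block in (x, v_parallel), 1 dof, stencil width 1 */
      PetscCall(DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                             DMDA_STENCIL_STAR, 64, 32, PETSC_DECIDE, PETSC_DECIDE,
                             1, 1, NULL, NULL, &da));
      PetscCall(DMSetFromOptions(da));
      PetscCall(DMSetUp(da));
      PetscCall(DMCreateGlobalVector(da, &f)); /* one global vector per block */
      PetscCall(VecDestroy(&f));
      PetscCall(DMDestroy(&da));
      PetscCall(PetscFinalize());
      return 0;
    }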

Re: [petsc-users] SOLVE + PC combination for 7 point stencil (unstructured) poisson solution

2023-06-26 Thread Matthew Knepley
On Mon, Jun 26, 2023 at 11:34 AM Vanella, Marcos (Fed) via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hi, I was wondering if anyone has experience on what combinations are more > efficient to solve a Poisson problem derived from a 7 point stencil on a > single mesh (serial). > I've been doing

Re: [petsc-users] Using DMDA for a block-structured grid approach

2023-06-26 Thread Matthew Knepley
On Mon, Jun 26, 2023 at 11:44 AM Srikanth Sathyanarayana wrote: > Dear PETSc developers, > > > I am currently working on a Gyrokinetic code where I essentially have to > implement a block structured grid approach in one of the subdomains of > the phase space coordinates. I have attached one such

Re: [petsc-users] SOLVE + PC combination for 7 point stencil (unstructured) poisson solution

2023-06-26 Thread Mark Adams
I'm not sure what MG is doing with an "unstructured" problem. I assume you are not using DMDA. -pc_type gamg should work. I would configure with hypre and try that also: -pc_type hypre. As Matt said, MG should be faster. How many iterations was it taking? Try a 100^3 and check that the iteration count
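A sketch of the option combinations being suggested (executable name is a placeholder; -ksp_converged_reason prints the iteration count, so mesh-independence can be checked when going from 50^3 to 100^3):

    ./poisson -ksp_type cg -pc_type gamg  -ksp_converged_reason -log_view
    ./poisson -ksp_type cg -pc_type hypre -ksp_converged_reason -log_view   # needs PETSc built with hypre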

Re: [petsc-users] Using DMDA for a block-structured grid approach

2023-06-26 Thread Barry Smith
> On Jun 26, 2023, at 11:44 AM, Srikanth Sathyanarayana > wrote: > > Dear PETSc developers, > > > I am currently working on a Gyrokinetic code where I essentially have to > implement a block structured grid approach in one of the subdomains of the > phase space coordinates. I have attached

Re: [petsc-users] SOLVE + PC combination for 7 point stencil (unstructured) poisson solution

2023-06-26 Thread Vanella, Marcos (Fed) via petsc-users
Thank you Matt and Mark, I'll try your suggestions. To configure with hypre can I just use the --download-hypre configure line? That is what I did with suitesparse, very nice. From: Mark Adams Sent: Monday, June 26, 2023 12:05 PM To: Vanella, Marcos (Fed) Cc: petsc-users

Re: [petsc-users] SOLVE + PC combination for 7 point stencil (unstructured) poisson solution

2023-06-26 Thread Matthew Knepley
On Mon, Jun 26, 2023 at 12:08 PM Vanella, Marcos (Fed) via petsc-users < petsc-users@mcs.anl.gov> wrote: > Thank you Matt and Mark, I'll try your suggestions. To configure with hypre > can I just use the --download-hypre configure line? > Yes, Thanks, Matt > That is what I did with suitesparse
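A sketch of the configure step being discussed (compilers, MPI, and other site-specific options omitted; both flags are standard PETSc configure options):

    ./configure --download-hypre --download-suitesparse
    make all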

Re: [petsc-users] Using DMDA for a block-structured grid approach

2023-06-26 Thread Mark Adams
Let me back up a bit. I think you have an application that has a Cartesian, or at least fine, grid and you "have to implement a block structured grid approach". Is this block structured solver well developed? We have support for block structured (quad-tree) grids you might want to use. This is a common
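For context, a minimal sketch of that quad-tree support via the p4est-backed DMFOREST (this assumes PETSc was configured with --download-p4est; the "brick" topology is just one common choice, not necessarily what fits the gyrokinetic grids here):

    #include <petscdmforest.h>

    int main(int argc, char **argv)
    {
      DM forest;

      PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
      PetscCall(DMCreate(PETSC_COMM_WORLD, &forest));
      PetscCall(DMSetType(forest, DMP4EST));           /* quad-tree in 2D; DMP8EST gives oct-trees in 3D */
      PetscCall(DMForestSetTopology(forest, "brick")); /* base mesh that the tree refines */
      PetscCall(DMSetFromOptions(forest));             /* pick up -dm_* options from the command line */
      PetscCall(DMSetUp(forest));
      /* ... adapt, create vectors, discretize ... */
      PetscCall(DMDestroy(&forest));
      PetscCall(PetscFinalize());
      return 0;
    }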

Re: [petsc-users] Using DMDA for a block-structured grid approach

2023-06-26 Thread Barry Smith
> On Jun 26, 2023, at 5:12 PM, Srikanth Sathyanarayana > wrote: > > Dear Barry and Mark, > > Thank you very much for your response. > >>>The allocation for what? > What I mean is that, we don’t want additional memory allocations through DMDA > Vectors. I am not sure if it is even possible
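One pattern that often comes up in this situation, shown here only as an illustrative sketch (not necessarily what Barry goes on to recommend): let the DMDA-created global vector point at application-owned storage with VecPlaceArray, so field data is not copied back and forth. Here `da` and `app_array` are assumed to already exist, and `app_array` must match the local size of the global vector.

    Vec u;
    PetscCall(DMCreateGlobalVector(da, &u));
    PetscCall(VecPlaceArray(u, app_array)); /* vector now aliases the application's array */
    /* ... use u in solves, DMDAVecGetArray, etc. ... */
    PetscCall(VecResetArray(u));            /* restore the vector's own array before destroying */
    PetscCall(VecDestroy(&u));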

Re: [petsc-users] Scalable Solver for Incompressible Flow

2023-06-26 Thread Alexander Lindsay
Returning to Sebastian's question about the correctness of the current LSC implementation: in the taxonomy paper that Jed linked to (which talks about SIMPLE, PCD, and LSC), equation 21 shows four applications of the inverse of the velocity mass matrix. In the PETSc implementation there are at most
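For readers following along, a sketch of the scaled LSC approximation from that literature (notation assumed here: A is the velocity block, B the divergence operator, Q the velocity mass matrix; the exact form of equation 21 may differ in signs and scaling), which indeed contains four applications of Q^{-1}:

    \hat{S}^{-1} \approx \left(B Q^{-1} B^{T}\right)^{-1} \left(B Q^{-1} A\, Q^{-1} B^{T}\right) \left(B Q^{-1} B^{T}\right)^{-1}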

Re: [petsc-users] Scalable Solver for Incompressible Flow

2023-06-26 Thread Alexander Lindsay
I guess that, similar to the discussions about selfp, the approximation of the velocity mass matrix by the diagonal of the velocity sub-matrix will improve when running a transient as opposed to a steady calculation, especially if the time derivative is lumped. Just thinking while typing. On Mon,
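A sketch of that reasoning in symbols (assuming backward Euler with step \Delta t, lumped mass M_L, and a stiffness/advection part K): the velocity diagonal then carries the full lumped-mass contribution,

    \operatorname{diag}(A_{uu}) \approx \frac{1}{\Delta t} M_{L} + \operatorname{diag}(K),

so as \Delta t shrinks it is dominated by the (scaled) lumped mass matrix and becomes a better stand-in for Q.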