Re: [petsc-users] Using DMDA for a block-structured grid approach

2023-06-26 Thread Barry Smith
> On Jun 26, 2023, at 5:12 PM, Srikanth Sathyanarayana wrote:
>
> Dear Barry and Mark,
>
> Thank you very much for your response.
>
>>> The allocation for what?
>
> What I mean is that we don’t want additional memory allocations through DMDA Vectors. I am not sure if it is even
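One way to read this exchange: the application already owns its field storage and wants DMDA-shaped Vecs without PETSc allocating, or copying into, a second buffer. Below is a minimal sketch of one such pattern using VecPlaceArray/VecResetArray; it is an illustration under that assumption, not code from the thread, and app_data is a hypothetical stand-in for the application's existing buffer.

#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM           da;
  Vec          v;
  PetscInt     n;
  PetscScalar *app_data; /* hypothetical: storage the application already owns */

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                         DMDA_STENCIL_BOX, 64, 32, PETSC_DECIDE, PETSC_DECIDE,
                         1 /* dof */, 1 /* stencil width */, NULL, NULL, &da));
  PetscCall(DMSetUp(da));

  /* DMCreateGlobalVector() still allocates one internal array, but after
     VecPlaceArray() all Vec operations run directly on app_data: no copies. */
  PetscCall(DMCreateGlobalVector(da, &v));
  PetscCall(VecGetLocalSize(v, &n));
  PetscCall(PetscMalloc1(n, &app_data)); /* stand-in for the code's own allocation */
  PetscCall(VecPlaceArray(v, app_data));

  /* ... use v with solvers, DMDAVecGetArray(), etc. ... */

  PetscCall(VecResetArray(v)); /* detach the user buffer before destroying the Vec */
  PetscCall(PetscFree(app_data));
  PetscCall(VecDestroy(&v));
  PetscCall(DMDestroy(&da));
  PetscCall(PetscFinalize());
  return 0;
}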

Re: [petsc-users] Using DMDA for a block-structured grid approach

2023-06-26 Thread Mark Adams
Let me back up a bit. I think you have an application that has a Cartesian, or at least a fine, grid and you "have to implement a block structured grid approach". Is this block structured solver well developed? We have support for block structured (quad-tree) grids you might want to use. This is a
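The quad-tree support Mark refers to is presumably DMFOREST, PETSc's wrapper around p4est (PETSc must be configured with --download-p4est). A minimal sketch of setting one up follows; the topology, refinement levels, and options here are illustrative assumptions, not settings from the thread.

#include <petscdmforest.h>

int main(int argc, char **argv)
{
  DM forest;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(DMCreate(PETSC_COMM_WORLD, &forest));
  PetscCall(DMSetType(forest, DMP4EST));           /* 2D quad-tree; DMP8EST for a 3D oct-tree */
  PetscCall(DMForestSetTopology(forest, "brick")); /* Cartesian base mesh */
  PetscCall(DMForestSetInitialRefinement(forest, 2)); /* illustrative levels */
  PetscCall(DMForestSetMaximumRefinement(forest, 6));
  PetscCall(DMSetFromOptions(forest)); /* e.g. -dm_forest_initial_refinement */
  PetscCall(DMSetUp(forest));
  PetscCall(DMViewFromOptions(forest, NULL, "-dm_view"));
  PetscCall(DMDestroy(&forest));
  PetscCall(PetscFinalize());
  return 0;
}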

Re: [petsc-users] Using DMDA for a block-structured grid approach

2023-06-26 Thread Barry Smith
> On Jun 26, 2023, at 11:44 AM, Srikanth Sathyanarayana wrote:
>
> Dear PETSc developers,
>
> I am currently working on a Gyrokinetic code where I essentially have to implement a block structured grid approach in one of the subdomains of the phase space coordinates. I have

Re: [petsc-users] Using DMDA for a block-structured grid approach

2023-06-26 Thread Matthew Knepley
On Mon, Jun 26, 2023 at 11:44 AM Srikanth Sathyanarayana wrote:

> Dear PETSc developers,
>
> I am currently working on a Gyrokinetic code where I essentially have to implement a block structured grid approach in one of the subdomains of the phase space coordinates. I have attached one such

[petsc-users] Using DMDA for a block-structured grid approach

2023-06-26 Thread Srikanth Sathyanarayana
Dear PETSc developers,

I am currently working on a Gyrokinetic code where I essentially have to implement a block structured grid approach in one of the subdomains of the phase space coordinates. I have attached one such example in the x - v_parallel subdomains where I go from a full grid to
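If each block of the x - v_parallel plane is itself logically Cartesian, one conceivable starting point is a separate 2D DMDA per block, each with its own resolution. This is a sketch under that assumption (the attachment mentioned in the thread is not available here); the block count and sizes are made up, and the inter-block coupling would remain the application's job.

#include <petscdmda.h>

int main(int argc, char **argv)
{
  const PetscInt nblocks = 3;
  const PetscInt nx[3] = {64, 32, 16};  /* hypothetical x resolution per block */
  const PetscInt nv[3] = {128, 64, 32}; /* hypothetical v_parallel resolution per block */
  DM             da[3];

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  for (PetscInt b = 0; b < nblocks; b++) {
    PetscCall(DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                           DMDA_STENCIL_BOX, nx[b], nv[b],
                           PETSC_DECIDE, PETSC_DECIDE, 1 /* dof */,
                           1 /* stencil width */, NULL, NULL, &da[b]));
    PetscCall(DMSetUp(da[b]));
  }
  /* Each da[b] now provides vectors, ghost exchange, and matrices for its own
     block; transferring data across block boundaries is handled separately. */
  for (PetscInt b = 0; b < nblocks; b++) PetscCall(DMDestroy(&da[b]));
  PetscCall(PetscFinalize());
  return 0;
}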