Re: [petsc-users] Understanding index sets for PCGASM

2023-05-17 Thread Barry Smith
Yikes. Such huge numbers usually come from integer overflow or memory corruption. The code that decides how much memory needs allocating is straightforward: PetscErrorCode MatCreateSubMatrices_MPIAIJ(Mat C, PetscInt ismax, const IS isrow[], const IS iscol[], MatReuse scall, Mat *submat[]) {
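A minimal sketch (not from the thread, and assuming a recent PETSc built with the default 32-bit PetscInt) of the kind of overflow Barry describes: widening to 64 bits before multiplying makes the bad size visible instead of letting it silently wrap. The function and variable names are illustrative only.

    #include <petscsys.h>

    /* Sketch: with 32-bit PetscInt, the product of two moderately large counts can
       wrap around and turn into a nonsensical allocation size. Widen first, then check. */
    static PetscErrorCode CheckedProduct(PetscInt nrows, PetscInt ncols, PetscInt *len)
    {
      PetscInt64 prod;

      PetscFunctionBeginUser;
      prod = (PetscInt64)nrows * (PetscInt64)ncols; /* 64-bit intermediate */
      PetscCheck(prod <= PETSC_MAX_INT, PETSC_COMM_SELF, PETSC_ERR_ARG_OUTOFRANGE,
                 "Requested size %" PetscInt64_FMT " overflows 32-bit PetscInt", prod);
      *len = (PetscInt)prod;
      PetscFunctionReturn(PETSC_SUCCESS);
    }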

Re: [petsc-users] Understanding index sets for PCGASM

2023-05-17 Thread Leonardo Mutti
Thanks for the reply. Even without Valgrind (which I can't use since I'm on Windows), by further simplifying the example, I was able to have PETSc display a more informative message. What I am doing wrong, and what should be done differently, is still unclear to me. The simplified code runs

Re: [petsc-users] Using dmplexdistribute to do parallel FEM code.

2023-05-17 Thread Matthew Knepley
On Wed, May 17, 2023 at 6:58 PM neil liu wrote: > Dear Petsc developers, > > I am writing my own code to calculate the FEM matrix. The following is my > general framework, > > DMPlexCreateGmsh(); > MPI_Comm_rank (PETSC_COMM_WORLD, ); > DMPlexDistribute (.., .., ); > > dm = dmDist; > //This can

[petsc-users] Using dmplexdistribute to do parallel FEM code.

2023-05-17 Thread neil liu
Dear Petsc developers, I am writing my own code to calculate the FEM matrix. The following is my general framework, DMPlexCreateGmsh(); MPI_Comm_rank (PETSC_COMM_WORLD, ); DMPlexDistribute (.., .., ); dm = dmDist; //This can create separate DMs for different processors. (reordering.)
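A minimal sketch of the read-then-distribute pattern outlined above, assuming a recent PETSc; the mesh file name and the zero overlap are placeholders, not details from the thread.

    #include <petscdmplex.h>

    int main(int argc, char **argv)
    {
      DM          dm, dmDist = NULL;
      PetscMPIInt rank;

      PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
      PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
      /* Read the Gmsh mesh; interpolate so faces/edges are available for FEM assembly */
      PetscCall(DMPlexCreateGmshFromFile(PETSC_COMM_WORLD, "mesh.msh", PETSC_TRUE, &dm));
      /* Distribute with 0 levels of overlap; each rank then holds only its own cells */
      PetscCall(DMPlexDistribute(dm, 0, NULL, &dmDist));
      if (dmDist) {
        PetscCall(DMDestroy(&dm));
        dm = dmDist;
      }
      PetscCall(PetscSynchronizedPrintf(PETSC_COMM_WORLD, "[%d] received local mesh\n", rank));
      PetscCall(PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT));
      /* ... assemble the local part of the FEM matrix on dm ... */
      PetscCall(DMDestroy(&dm));
      PetscCall(PetscFinalize());
      return 0;
    }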

Re: [petsc-users] DMGetCoordinatesLocal and DMPlexGetCellCoordinates in PETSc > 3.18

2023-05-17 Thread Matthew Knepley
On Wed, May 17, 2023 at 2:01 PM Berend van Wachem wrote: > Dear Matt, > > I tried it, but it doesn't seem to work. > Attached is a very small working example illustrating the problem. > I create a DMPlex box mesh, periodic in the Y direction. I then scale the Y > coordinates with a factor 10, and

Re: [petsc-users] Understanding index sets for PCGASM

2023-05-17 Thread Barry Smith
> On May 17, 2023, at 11:10 AM, Leonardo Mutti > wrote: > > Dear developers, let me kindly ask for your help again. > In the following snippet, a bi-diagonal matrix A is set up. It consists of 8x8 > blocks, each block being 2x2 elements. I would like to create the correct IS > objects for

Re: [petsc-users] Nested field split

2023-05-17 Thread Matthew Knepley
On Wed, May 17, 2023 at 3:23 PM Matthew Knepley wrote: > On Wed, May 17, 2023 at 2:59 PM Barry Smith wrote: > >> >> Absolutely, that is fundamental to the design. >> >> In the simple case where all the degrees of freedom exist at the same >> grid points, hence storage is like u,v,t,p in

Re: [petsc-users] Nested field split

2023-05-17 Thread Matthew Knepley
On Wed, May 17, 2023 at 2:59 PM Barry Smith wrote: > > Absolutely, that is fundamental to the design. > > In the simple case where all the degrees of freedom exist at the same > grid points, hence storage is like u,v,t,p in the vector, the nesting is > trivial. You indicate the fields

Re: [petsc-users] Nested field split

2023-05-17 Thread Alexander Lindsay
Awesome, thanks Barry! On Wed, May 17, 2023 at 11:59 AM Barry Smith wrote: > > Absolutely, that is fundamental to the design. > > In the simple case where all the degrees of freedom exist at the same > grid points, hence storage is like u,v,t,p in the vector, the nesting is > trivial. You

Re: [petsc-users] Nested field split

2023-05-17 Thread Barry Smith
Absolutely, that is fundamental to the design. In the simple case where all the degrees of freedom exist at the same grid points, hence storage is like u,v,t,p in the vector, the nesting is trivial. You indicate the fields without using IS (don't even need to change any code)
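A minimal sketch of what "indicate the fields without using IS" can look like through the API, assuming an interlaced u,v,t,p layout (matrix block size 4) and a recent PETSc; the grouping chosen below and the split names are illustrative only. The equivalent can be done purely from the options database with -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3.

    #include <petscksp.h>

    /* ksp already has its operators set; fields are numbered 0=u, 1=v, 2=t, 3=p per point */
    static PetscErrorCode SetupSchurSplit(KSP ksp)
    {
      PC       pc;
      PetscInt f0[] = {0, 1, 2}; /* velocity components and temperature in the first split */
      PetscInt f1[] = {3};       /* pressure in the second split */

      PetscFunctionBeginUser;
      PetscCall(KSPGetPC(ksp, &pc));
      PetscCall(PCSetType(pc, PCFIELDSPLIT));
      PetscCall(PCFieldSplitSetType(pc, PC_COMPOSITE_SCHUR));
      PetscCall(PCFieldSplitSetFields(pc, "0", 3, f0, f0));
      PetscCall(PCFieldSplitSetFields(pc, "1", 1, f1, f1));
      /* The first split can itself be split again, e.g. from the command line with
         -fieldsplit_0_pc_type fieldsplit, which is where the nesting comes in */
      PetscCall(PCSetFromOptions(pc));
      PetscFunctionReturn(PETSC_SUCCESS);
    }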

Re: [petsc-users] DMGetCoordinatesLocal and DMPlexGetCellCoordinates in PETSc > 3.18

2023-05-17 Thread Berend van Wachem
Dear Matt, I tried it, but it doesn't seem to work. Attached is a very small working example illustrating the problem. I create a DMPlex box mesh, periodic in the Y direction. I then scale the Y coordinates with a factor 10, and add 1.0 to it. Both DMGetCoordinatesLocal and

[petsc-users] Nested field split

2023-05-17 Thread Alexander Lindsay
I've seen threads in the archives about nested field split but I'm not sure they match what I'm asking about. I'm doing a Schur field split for a porous version of incompressible Navier-Stokes. In addition to pressure and velocity fields, we have fluid and solid temperature fields. I plan to put

Re: [petsc-users] DMGetCoordinatesLocal and DMPlexGetCellCoordinates in PETSc > 3.18

2023-05-17 Thread Matthew Knepley
On Wed, May 17, 2023 at 11:20 AM Berend van Wachem wrote: > Dear Matt, > > Is there a way to 'redo' the DMLocalizeCoordinates() ? Or to undo it? > Alternatively, can we make the calling of DMLocalizeCoordinates() in the > DMPlexCreate...() routines optional? > > Otherwise, we would have to copy

Re: [petsc-users] DMGetCoordinatesLocal and DMPlexGetCellCoordinates in PETSc > 3.18

2023-05-17 Thread Berend van Wachem
Dear Matt, Is there a way to 'redo' the DMLocalizeCoordinates()? Or to undo it? Alternatively, can we make the calling of DMLocalizeCoordinates() in the DMPlexCreate...() routines optional? Otherwise, we would have to copy all arrays of coordinates from DMGetCoordinatesLocal() and

Re: [petsc-users] Large MATMPIAIJ - 32bit integer overflow in nz value

2023-05-17 Thread Barry Smith
Yeah, this is silly. The check is just a "sanity-check" on the data in the file. We store redundant information in the matrix header in the file; header[3] is the total number of nonzeros in the matrix. When nz is too large, the correct value cannot fit in the header. Changing the file
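For reference, a minimal sketch around the header being discussed, assuming the standard PETSc binary matrix format with the default 32-bit integers; the helper below only reports the problem, it does not work around it.

    #include <petscmat.h>

    /* PETSc binary matrix file header (32-bit PetscInt build):
         header[0] = MAT_FILE_CLASSID
         header[1] = number of rows
         header[2] = number of columns
         header[3] = total number of nonzeros   <- cannot hold values above 2^31 - 1
       A build with --with-64-bit-indices writes 64-bit header entries instead. */
    static PetscErrorCode ReportNonzeroOverflow(Mat A)
    {
      MatInfo info;

      PetscFunctionBeginUser;
      PetscCall(MatGetInfo(A, MAT_GLOBAL_SUM, &info));
      if (info.nz_used > 2147483647.0)
        PetscCall(PetscPrintf(PETSC_COMM_WORLD,
                  "nz = %.0f will not fit in a 32-bit header entry\n", (double)info.nz_used));
      PetscFunctionReturn(PETSC_SUCCESS);
    }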

Re: [petsc-users] Understanding index sets for PCGASM

2023-05-17 Thread Leonardo Mutti
Dear developers, let me kindly ask for your help again. In the following snippet, a bi-diagonal matrix A is set up. It consists of 8x8 blocks, each block being 2x2 elements. I would like to create the correct IS objects for PCGASM. The non-overlapping IS should be: [0,1], [2,3], [4,5], ...,
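A minimal sketch (in C, on a single rank, and not the snippet from the thread) of building one stride IS per 2x2 diagonal block of the 16x16 matrix and handing the non-overlapping sets to PCGASM; the choice of communicator and the assumption that NULL outer sets mean "no overlap" should be checked against the PCGASMSetSubdomains man page.

    #include <petscksp.h>

    /* One subdomain per 2x2 diagonal block: index sets [0,1], [2,3], ..., [14,15] */
    static PetscErrorCode SetupGASMSubdomains(PC pc)
    {
      const PetscInt nsub = 8, bs = 2;
      IS             iis[8];

      PetscFunctionBeginUser;
      for (PetscInt i = 0; i < nsub; i++)
        PetscCall(ISCreateStride(PETSC_COMM_SELF, bs, i * bs, 1, &iis[i]));
      PetscCall(PCSetType(pc, PCGASM));
      /* NULL for the outer (overlapping) sets: outer subdomains equal the inner ones */
      PetscCall(PCGASMSetSubdomains(pc, nsub, iis, NULL));
      /* destroy the iis[] entries with ISDestroy() once the PC has been set up */
      PetscFunctionReturn(PETSC_SUCCESS);
    }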

Re: [petsc-users] DMGetCoordinatesLocal and DMPlexGetCellCoordinates in PETSc > 3.18

2023-05-17 Thread Matthew Knepley
On Wed, May 17, 2023 at 10:21 AM Berend van Wachem wrote: > Dear Matt, > > Thanks for getting back to me so quickly. > > If I scale each of the coordinates of the mesh (say, I want to cube each > co-ordinate), and I do this for both: > > DMGetCoordinatesLocal(); > DMGetCellCoordinatesLocal(); >

Re: [petsc-users] DMGetCoordinatesLocal and DMPlexGetCellCoordinates in PETSc > 3.18

2023-05-17 Thread Berend van Wachem
Dear Matt, Thanks for getting back to me so quickly. If I scale each of the coordinates of the mesh (say, I want to cube each co-ordinate), and I do this for both: DMGetCoordinatesLocal(); DMGetCellCoordinatesLocal(); How do I know I am not cubing one coordinate multiple times? Thanks,

Re: [petsc-users] DMGetCoordinatesLocal and DMPlexGetCellCoordinates in PETSc > 3.18

2023-05-17 Thread Matthew Knepley
On Wed, May 17, 2023 at 10:02 AM Berend van Wachem wrote: > Dear PETSc Team, > > We are using DMPlex, and we create a mesh using > > DMPlexCreateBoxMesh ( ); > > and get a uniform mesh. The mesh is periodic. > > We typically want to "scale" the coordinates (vertices) of the mesh, and > to

[petsc-users] DMGetCoordinatesLocal and DMPlexGetCellCoordinates in PETSc > 3.18

2023-05-17 Thread Berend van Wachem
Dear PETSc Team, We are using DMPlex, and we create a mesh using DMPlexCreateBoxMesh ( ); and get a uniform mesh. The mesh is periodic. We typically want to "scale" the coordinates (vertices) of the mesh, and to achieve this, we call DMGetCoordinatesLocal(dm, ); and scale the entries
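A minimal sketch of scaling both coordinate vectors of a periodic DMPlex, assuming PETSc 3.18 or later and a simple multiplicative scaling; it is illustrative, not the code from the thread.

    #include <petscdmplex.h>

    static PetscErrorCode ScaleMesh(DM dm, PetscScalar factor)
    {
      Vec xl, xcl;

      PetscFunctionBeginUser;
      /* Vertex-based coordinates: the Vec is borrowed from the DM, so scaling it in
         place updates the mesh directly */
      PetscCall(DMGetCoordinatesLocal(dm, &xl));
      PetscCall(VecScale(xl, factor));
      /* Periodic (localized) meshes also carry a second, cell-wise coordinate vector,
         which this sketch scales as well */
      PetscCall(DMGetCellCoordinatesLocal(dm, &xcl));
      if (xcl) PetscCall(VecScale(xcl, factor));
      PetscFunctionReturn(PETSC_SUCCESS);
    }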

Re: [petsc-users] Large MATMPIAIJ - 32bit integer overflow in nz value

2023-05-17 Thread Matthew Knepley
On Wed, May 17, 2023 at 9:02 AM Fleischli Benno HSLU T <benno.fleisc...@hslu.ch> wrote: > Dear PETSc developers > > I am creating a very large parallel sparse matrix (MATMPIAIJ) with PETSc. > I write this matrix to disk. > The number of non-zeros exceeds the maximum number a 32-bit integer can >

[petsc-users] Large MATMPIAIJ - 32bit integer overflow in nz value

2023-05-17 Thread Fleischli Benno HSLU T
Dear PETSc developers, I am creating a very large parallel sparse matrix (MATMPIAIJ) with PETSc. I write this matrix to disk. The number of non-zeros exceeds the maximum number a 32-bit integer can hold. When I read the matrix from disk I get an error because there was an overflow in the nz
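A minimal sketch of the write-then-reload cycle described above, assuming a recent PETSc; the file name is a placeholder. With the default 32-bit PetscInt, the on-disk nz field overflows once the matrix exceeds 2^31 - 1 nonzeros, and configuring PETSc with --with-64-bit-indices is the usual way around that.

    #include <petscmat.h>

    static PetscErrorCode WriteAndReload(Mat A, Mat *B)
    {
      PetscViewer viewer;

      PetscFunctionBeginUser;
      /* Write the assembled matrix in PETSc binary format */
      PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.dat", FILE_MODE_WRITE, &viewer));
      PetscCall(MatView(A, viewer));
      PetscCall(PetscViewerDestroy(&viewer));

      /* Read it back; this is where the overflowed nz header entry is detected */
      PetscCall(MatCreate(PETSC_COMM_WORLD, B));
      PetscCall(MatSetType(*B, MATMPIAIJ));
      PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.dat", FILE_MODE_READ, &viewer));
      PetscCall(MatLoad(*B, viewer));
      PetscCall(PetscViewerDestroy(&viewer));
      PetscFunctionReturn(PETSC_SUCCESS);
    }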