Re: [petsc-users] Question about DMDAGetElements

2018-11-19 Thread Mark Adams via petsc-users
The local indices of the local mesh and local vectors, which include ghost vertices. There are global-to-local methods to fill in ghost values and local-to-global methods to create global vectors that you can use for computation.
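A minimal sketch of that workflow, assuming an existing DMDA named da and a global vector g (the variable names are illustrative, not from the thread):

  /* Sketch only: assumes an existing DMDA "da" and global vector "g".
     Fill the ghosted local vector from the global one, compute on it,
     then scatter contributions back to the global vector. */
  Vec            l;
  PetscErrorCode ierr;

  ierr = DMGetLocalVector(da, &l);CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(da, g, INSERT_VALUES, l);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(da, g, INSERT_VALUES, l);CHKERRQ(ierr);
  /* ... work on l using local (ghosted) indices ... */
  ierr = DMLocalToGlobalBegin(da, l, ADD_VALUES, g);CHKERRQ(ierr);
  ierr = DMLocalToGlobalEnd(da, l, ADD_VALUES, g);CHKERRQ(ierr);
  ierr = DMRestoreLocalVector(da, &l);CHKERRQ(ierr);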

Re: [petsc-users] Question about DMDAGetElements

2018-11-19 Thread Sajid Ali via petsc-users
Bingo! So, DMDAGetElements gives the indices of the mesh, right? Thank you!
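For illustration, a sketch of how those indices are typically used, assuming a DMDA da with one degree of freedom per vertex and a ghosted local vector xlocal (names are illustrative):

  /* Sketch: the entries of e[] are LOCAL vertex indices (dof = 1 assumed),
     so they index directly into the array of a local, ghosted vector. */
  PetscInt           nel, nen, i, j;
  const PetscInt    *e;
  const PetscScalar *xa;
  PetscScalar        sum = 0.0;
  PetscErrorCode     ierr;

  ierr = DMDAGetElements(da, &nel, &nen, &e);CHKERRQ(ierr);
  ierr = VecGetArrayRead(xlocal, &xa);CHKERRQ(ierr);
  for (i = 0; i < nel; ++i) {
    for (j = 0; j < nen; ++j) sum += xa[e[i*nen + j]]; /* vertex j of element i */
  }
  ierr = VecRestoreArrayRead(xlocal, &xa);CHKERRQ(ierr);
  ierr = DMDARestoreElements(da, &nel, &nen, &e);CHKERRQ(ierr);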

Re: [petsc-users] Question about DMDAGetElements

2018-11-19 Thread Mark Adams via petsc-users
You seem to be confusing the dimension of the mesh with the "dimension" of the matrix and vector. A matrix is always N x M (2D if you like), a vector is always N (or 1 x N, or 1D if you like). The mesh in a DM or DA can be 1, 2 or 3D.
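A small sketch of that distinction, using the 12-vertex 1D DMDA discussed elsewhere in the thread (the sizes in the comments are what one would expect, not output from the thread):

  /* Sketch: a 1D mesh with 12 vertices still yields a length-12 vector
     and a 12 x 12 matrix; the "1D" refers to the mesh, not the objects. */
  DM             da;
  Vec            x;
  Mat            A;
  PetscInt       n, M, N;
  PetscErrorCode ierr;

  ierr = DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, 12, 1, 1, NULL, &da);CHKERRQ(ierr);
  ierr = DMSetUp(da);CHKERRQ(ierr);
  ierr = DMCreateGlobalVector(da, &x);CHKERRQ(ierr);
  ierr = DMCreateMatrix(da, &A);CHKERRQ(ierr);
  ierr = VecGetSize(x, &n);CHKERRQ(ierr);        /* n == 12 */
  ierr = MatGetSize(A, &M, &N);CHKERRQ(ierr);    /* 12 x 12 */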

Re: [petsc-users] Question about DMDAGetElements

2018-11-19 Thread Sajid Ali via petsc-users
So, DMDA is used for sparse matrices arising from FD/FE discretizations, and MatCreateMPIAIJ can be used for dense matrices (though that is strongly discouraged). My confusion stemmed from DMDAGetElements giving the element indices for the 1D mesh/vector of size N (when DMDACreate1d is used). But these indices
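For a matrix that does not come from a DM, a hedged sketch of creating it directly with preallocation (assuming a recent PETSc, where MatCreateAIJ plays the role of the older MatCreateMPIAIJ; the sizes and per-row nonzero estimates are placeholders):

  /* Sketch: an MPI AIJ (sparse) matrix created without a DM.
     The global size 12 x 12 and the nonzero estimates (3 diagonal,
     2 off-diagonal per row) are illustrative only. */
  Mat            A;
  PetscErrorCode ierr;

  ierr = MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, 12, 12,
                      3, NULL, 2, NULL, &A);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  /* ... MatSetValues(...), MatAssemblyBegin/End, then hand A to a solver ... */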

Re: [petsc-users] Question about DMDAGetElements

2018-11-19 Thread Smith, Barry F. via petsc-users
> On Nov 19, 2018, at 3:30 PM, Sajid Ali wrote: > I think what confused me was the fact that using DMCreateMatrix(da, ...) created a 12x12 matrix with 144 entries, but summing up nel*nen from each rank gives only 2*2+3*2+3*2+3*2 = 22 element indices. So this means that DMDAGetElements

Re: [petsc-users] Question about DMDAGetElements

2018-11-19 Thread Sajid Ali via petsc-users
I think what confused me was the fact that using DMCreateMatrix(da, ...) created a 12x12 matrix with 144 entries, but summing up nel*nen from each rank gives only 2*2+3*2+3*2+3*2 = 22 element indices. So this means that DMDAGetElements returns the elements for the vector created on the mesh and this happens to
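A self-contained sketch that reproduces those counts, assuming 4 MPI ranks: on a 12-vertex 1D DMDA the per-rank nel values quoted above (2, 3, 3, 3) with nen = 2 give 22 index entries in total, while the matrix from DMCreateMatrix is still 12 x 12.

  #include <petscdmda.h>

  int main(int argc, char **argv)
  {
    DM              da;
    PetscInt        nel, nen;
    const PetscInt *e;
    PetscErrorCode  ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
    /* 12 vertices, 1 dof per vertex, stencil width 1 */
    ierr = DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, 12, 1, 1, NULL, &da);CHKERRQ(ierr);
    ierr = DMSetUp(da);CHKERRQ(ierr);
    ierr = DMDAGetElements(da, &nel, &nen, &e);CHKERRQ(ierr);
    ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "nel = %D, nen = %D\n", nel, nen);CHKERRQ(ierr);
    ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);
    ierr = DMDARestoreElements(da, &nel, &nen, &e);CHKERRQ(ierr);
    ierr = DMDestroy(&da);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }

Compiled against PETSc and run with mpiexec -n 4, the printed nel/nen per rank can be compared with the 12 x 12 size reported by DMCreateMatrix.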

Re: [petsc-users] On unknown ordering

2018-11-19 Thread Appel, Thibaut via petsc-users
Hi Barry, > On 15 Nov 2018, at 18:16, Smith, Barry F. wrote: >> On Nov 15, 2018, at 4:48 AM, Appel, Thibaut via petsc-users wrote: >> Good morning, I would like to ask about the importance of the initial choice of ordering the unknowns when feeding a matrix to
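One way to experiment with that choice after assembly is to compute and apply a reordering explicitly; a sketch under the assumption of an already assembled Mat A (RCM is just one example ordering):

  /* Sketch: compute a reverse Cuthill-McKee ordering of an assembled
     matrix A and build the permuted matrix Aperm to compare solvers on. */
  IS             rperm, cperm;
  Mat            Aperm;
  PetscErrorCode ierr;

  ierr = MatGetOrdering(A, MATORDERINGRCM, &rperm, &cperm);CHKERRQ(ierr);
  ierr = MatPermute(A, rperm, cperm, &Aperm);CHKERRQ(ierr);
  /* ... set up the solver with Aperm instead of A ... */
  ierr = ISDestroy(&rperm);CHKERRQ(ierr);
  ierr = ISDestroy(&cperm);CHKERRQ(ierr);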

Re: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue

2018-11-19 Thread Mark Adams via petsc-users
> Mark would have better comments on the scalability of the setup stage. The first thing to verify is that the algorithm is scaling. If you coarsen too slowly then the coarse grids get large, with many non-zeros per row, and the cost of the matrix triple product can explode. You can
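A sketch of one way to check the coarsening rate, assuming a KSP named ksp configured with -pc_type gamg; since GAMG is built on top of PCMG, the PCMG query routines can report the operator size on each level:

  /* Sketch: after KSPSetUp(), walk the multigrid hierarchy that GAMG
     built and print the operator size on each level. */
  PC             pc;
  PetscInt       nlevels, l;
  PetscErrorCode ierr;

  ierr = KSPSetUp(ksp);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCMGGetLevels(pc, &nlevels);CHKERRQ(ierr);
  for (l = 0; l < nlevels; ++l) {
    KSP      smoother;
    Mat      Al;
    PetscInt M, N;
    ierr = PCMGGetSmoother(pc, l, &smoother);CHKERRQ(ierr);
    ierr = KSPGetOperators(smoother, &Al, NULL);CHKERRQ(ierr);
    ierr = MatGetSize(Al, &M, &N);CHKERRQ(ierr);
    ierr = PetscPrintf(PETSC_COMM_WORLD, "level %D: %D x %D\n", l, M, N);CHKERRQ(ierr);
  }

Running with -ksp_view prints similar hierarchy information without any code changes.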

Re: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue

2018-11-19 Thread Alberto F. Martín
Dear Matthew, using either "-build_twosided allreduce" or "-build_twosided redscatter" works around the Intel MPI internal error! Also, with Intel MPI, the weak scaling figures look like: **preconditioner set up** [0.8865120411, 2.598261833, 3.780511856] **PCG stage** [0.5701429844,