Hi,
We want to incorporate PETSc into our in-house FEM package. We found two
ways to do the domain decomposition.
The first one is to read the mesh with partitioning, where the partitioning is
done with gmsh. For this one, we need to do the index mapping (renumbering). We
found that the mes
Barry
> On Jul 3, 2019, at 12:55 PM, Matthew Knepley via petsc-users wrote:
>
> On Wed, Jul 3, 2019 at 9:07 AM Dongyu Liu - CITG via petsc-users wrote:
> Thank you, Mark. My question is if I have some node indices like [0, 1, 3, 8,
> 9] handled by the current
>
Use
MatGetOwnershipRange(Mat mat, PetscInt *m, PetscInt *n)
to see what your global indices are (rows m to n-1 are on this process, so
the local indices are 0 to n-m-1).
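The ownership-range arithmetic above can be sketched in plain Python. This mirrors PETSc's default contiguous row layout (each rank gets N//P rows, with the first N % P ranks getting one extra); the helper name is mine, not a PETSc API:

```python
# Sketch of PETSc's default contiguous row layout.
# For N global rows on P ranks, rank r owns N//P rows,
# plus one extra row if r < N % P.
def ownership_range(N, P, rank):
    base, extra = divmod(N, P)
    # ranks [0, extra) each own base+1 rows; the rest own base rows
    start = rank * base + min(rank, extra)
    end = start + base + (1 if rank < extra else 0)
    return start, end  # rows [start, end) live on this rank

if __name__ == "__main__":
    # 10 rows over 3 ranks
    for r in range(3):
        print(ownership_range(10, 3, r))
```

On each rank, global row g in [start, end) then corresponds to local index g - start, matching the "0 to n-m-1" range described above.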
Mark
On Wed, Jul 3, 2019 at 9:11 AM Dongyu Liu - CITG via petsc-users wrote:
Hi,
I am running an FEM program using PETSc. In the beginning, the mesh is
partitioned in gmsh, and we read the partitioned mesh using our own reader.
Now my question is: how can I get a global-to-local index mapping? And do I
need to renumber the indices after reading the partitioned mesh?
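One common way to hold such a mapping (a plain-Python sketch, not the petsc4py API, though PETSc's ISLocalToGlobalMapping object plays this role) is to keep the list of global node indices owned by this rank, which is itself the local-to-global map, and invert it for global-to-local lookups. The indices below are the hypothetical ones from the question:

```python
# Hypothetical example: this rank owns global node indices [0, 1, 3, 8, 9].
# The list itself is the local-to-global map; inverting it gives
# a global-to-local dictionary.
local_to_global = [0, 1, 3, 8, 9]
global_to_local = {g: l for l, g in enumerate(local_to_global)}

print(global_to_local[8])        # global node 8 is local node 3
print(5 in global_to_local)      # off-process nodes are simply absent
```

With this scheme no renumbering of the mesh file itself is required: assembly loops work in local indices, and the map translates to the global numbering when filling PETSc objects.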
Hi,
we are using the Viewer class in petsc4py to read a gmsh file, but after we
call createASCII with the mode "READ", the gmsh file is emptied. Do you
have any clue why this happens?
Best,
Dongyu
Hi,
I am using PETSc to run FEM with domain decomposition. When I assemble the
matrix, I need to preallocate first to get better performance. Can you
recommend how to preallocate in this case?
Right now we set large values for the number of nonzeros to preallocate memory,
but when we do so, the
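Rather than one large overestimate, the usual FEM approach is to count, per owned row, how many columns fall in the diagonal block (owned column range) and how many in the off-diagonal block, and pass those arrays to MatMPIAIJSetPreallocation (d_nnz/o_nnz). A plain-Python sketch under the assumptions of one DOF per node and a hypothetical element connectivity list:

```python
# Sketch: per-row nonzero counts for d_nnz/o_nnz-style preallocation,
# assuming one DOF per node. Rows [row_start, row_end) are owned by
# this rank; columns inside that range count toward the diagonal block
# (d_nnz), the rest toward the off-diagonal block (o_nnz).
def count_nnz(elements, row_start, row_end):
    n_local = row_end - row_start
    coupled = [set() for _ in range(n_local)]   # column set per owned row
    for elem in elements:                       # elem = global node indices
        for i in elem:
            if row_start <= i < row_end:
                coupled[i - row_start].update(elem)  # all nodes of the
                                                     # element couple to i
    d_nnz = [sum(row_start <= j < row_end for j in cols) for cols in coupled]
    o_nnz = [len(cols) - d for cols, d in zip(coupled, d_nnz)]
    return d_nnz, o_nnz

if __name__ == "__main__":
    # two triangles sharing an edge; this rank owns rows 0..2
    elems = [(0, 1, 2), (1, 2, 3)]
    print(count_nnz(elems, 0, 3))
```

Exact counts cost one extra pass over the connectivity but avoid both the wasted memory of a blanket overestimate and the mallocs triggered by underestimating.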