Re: [petsc-users] Parallel DMPlex

2023-10-16 Thread erdemguer via petsc-users
> Number of 3-cells per rank: 13 14
> Labels:
>   depth: 4 strata with value/size (0 (40), 1 (83), 2 (57), 3 (13))
>   marker: 1 strata with value/size (1 (109))
>   Face Set…

Re: [petsc-users] Parallel DMPlex

2023-10-16 Thread erdemguer via petsc-users
> …strata with value/size (1 (109))
>   Face Sets: 5 strata with value/size (1 (6), 2 (1), 3 (7), 5 (5), 6 (4))
>   celltype: 4 strata with value/size (0 (40), 1 (83), 4 (57), 7 (13))
> Field Field_0: adjacency FEM
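For reference, the label strata quoted above can also be queried programmatically. A minimal sketch, assuming a mesh that carries a "Face Sets" label; the helper name ReportFaceSets is made up:

  #include <petscdmplex.h>

  /* Sketch: print the size of each stratum of the "Face Sets" label on this
     rank. Only the PETSc calls are real; the helper name is an assumption. */
  static PetscErrorCode ReportFaceSets(DM dm)
  {
    DMLabel         label;
    IS              valueIS;
    const PetscInt *values;
    PetscInt        numValues, v;

    PetscFunctionBeginUser;
    PetscCall(DMGetLabel(dm, "Face Sets", &label));
    if (!label) PetscFunctionReturn(PETSC_SUCCESS);
    PetscCall(DMLabelGetNumValues(label, &numValues));
    PetscCall(DMLabelGetValueIS(label, &valueIS));
    PetscCall(ISGetIndices(valueIS, &values));
    for (v = 0; v < numValues; ++v) {
      PetscInt size;
      PetscCall(DMLabelGetStratumSize(label, values[v], &size));
      PetscCall(PetscPrintf(PETSC_COMM_SELF, "Face Set %" PetscInt_FMT ": %" PetscInt_FMT " points\n", values[v], size));
    }
    PetscCall(ISRestoreIndices(valueIS, &values));
    PetscCall(ISDestroy(&valueIS));
    PetscFunctionReturn(PETSC_SUCCESS);
  }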

Re: [petsc-users] Parallel DMPlex

2023-10-16 Thread erdemguer via petsc-users
> …dimensions:
>   Number of 0-cells per rank: 64 60
>   Number of 1-cells per rank: 144 133
>   Number of 2-cells per rank: 108 98
>   Number of 3-cells per rank: 27 24
> Labels: …
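The "Number of 3-cells per rank" line can also be reproduced by hand, which is useful when checking a distribution. A small sketch; the helper name is assumed:

  #include <petscdmplex.h>

  /* Sketch: print the local cell count on every rank, one line per rank. */
  static PetscErrorCode PrintLocalCellCounts(DM dm)
  {
    MPI_Comm    comm;
    PetscMPIInt rank;
    PetscInt    cStart, cEnd;

    PetscFunctionBeginUser;
    PetscCall(PetscObjectGetComm((PetscObject)dm, &comm));
    PetscCallMPI(MPI_Comm_rank(comm, &rank));
    /* Height 0 points are the top-dimensional cells of a DMPlex */
    PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd));
    PetscCall(PetscSynchronizedPrintf(comm, "[%d] local cells: %" PetscInt_FMT "\n", rank, cEnd - cStart));
    PetscCall(PetscSynchronizedFlush(comm, PETSC_STDOUT));
    PetscFunctionReturn(PETSC_SUCCESS);
  }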

Re: [petsc-users] Parallel DMPlex

2023-10-13 Thread erdemguer via petsc-users
> …DMPlexCreateBoxMesh
> - DMSetFromOptions
> - PetscSectionCreate
> - PetscSectionSetNumFields
> - PetscSectionSetFieldDof
> - PetscSectionSetDof
> - Pet…
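A minimal sketch of that call sequence, assuming a single cell-centered field with one dof per cell (the actual dof layout is not visible in the excerpt):

  #include <petscdmplex.h>

  /* Sketch: build a PetscSection with one field and one dof on every cell,
     then attach it to the DM as its local section. */
  static PetscErrorCode SetupSection(DM dm)
  {
    PetscSection s;
    PetscInt     pStart, pEnd, cStart, cEnd, c;

    PetscFunctionBeginUser;
    PetscCall(PetscSectionCreate(PetscObjectComm((PetscObject)dm), &s));
    PetscCall(PetscSectionSetNumFields(s, 1));
    PetscCall(DMPlexGetChart(dm, &pStart, &pEnd));
    PetscCall(PetscSectionSetChart(s, pStart, pEnd));
    PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd));
    for (c = cStart; c < cEnd; ++c) {
      PetscCall(PetscSectionSetFieldDof(s, c, 0, 1));
      PetscCall(PetscSectionSetDof(s, c, 1));
    }
    PetscCall(PetscSectionSetUp(s));
    PetscCall(DMSetLocalSection(dm, s));
    PetscCall(PetscSectionDestroy(&s));
    PetscFunctionReturn(PETSC_SUCCESS);
  }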

Re: [petsc-users] Parallel DMPlex

2023-10-11 Thread erdemguer via petsc-users
> …DMPlexView() for each incarnation of the mesh. What I do is put
>
>   DMViewFromOptions(dm, NULL, "-dm1_view")
>
> with a different string after each call.
>> But I couldn't figure out how to decide where the ghost/pr…
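In context, that advice amounts to a pattern like the one below; the distribution step, the overlap of 1, and the option names are illustrative assumptions, not part of the thread:

  /* After creating the initial mesh (e.g. with DMPlexCreateBoxMesh): */
  PetscCall(DMViewFromOptions(dm, NULL, "-dm1_view"));

  /* After distributing with one level of overlap: */
  PetscCall(DMPlexDistribute(dm, 1, NULL, &dmDist));
  if (dmDist) {
    PetscCall(DMDestroy(&dm));
    dm = dmDist;
  }
  PetscCall(DMViewFromOptions(dm, NULL, "-dm2_view"));

Running with -dm1_view -dm2_view then prints the serial and distributed incarnations of the mesh separately.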

Re: [petsc-users] Parallel DMPlex

2023-10-11 Thread erdemguer via petsc-users
> …DMPlexComputeCellTypes before DMPlexGetCellTypeStratum, but nothing changed.
> I think I can calculate the ghost cell indices using cStart/cEnd before &
> after distribution, but I think there is a better way I'm currently missing.
>
> Thanks again,
> Guer.
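One alternative (not necessarily what the thread settled on) is to consult the point SF: any local cell that appears as a leaf of the SF is received from another rank and is therefore a ghost/overlap cell. A sketch with assumed names:

  #include <petscdmplex.h>

  /* Sketch: walk the leaves of the point SF and pick out those that fall in
     the cell stratum; such cells are owned by another rank. */
  static PetscErrorCode FlagOverlapCells(DM dm)
  {
    PetscSF         sf;
    const PetscInt *leaves;
    PetscInt        nleaves, cStart, cEnd, l;

    PetscFunctionBeginUser;
    PetscCall(DMGetPointSF(dm, &sf));
    PetscCall(PetscSFGetGraph(sf, NULL, &nleaves, &leaves, NULL));
    PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd));
    for (l = 0; l < nleaves; ++l) {
      const PetscInt p = leaves ? leaves[l] : l; /* NULL means leaves are contiguous */
      if (p >= cStart && p < cEnd) {
        /* cell p is a ghost/overlap cell on this rank */
      }
    }
    PetscFunctionReturn(PETSC_SUCCESS);
  }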

Re: [petsc-users] Parallel DMPlex

2023-10-10 Thread erdemguer via petsc-users
…Thanks again, Guer.

--- Original Message ---
On Thursday, September 28th, 2023 at 10:42 PM, Matthew Knepley wrote:

> On Thu, Sep 28, 2023 at 3:38 PM erdemguer via petsc-users wrote:
>> Hi,
>> I am currently using DMPlex in my code. It runs serially at the…

[petsc-users] Parallel DMPlex

2023-09-28 Thread erdemguer via petsc-users
Hi, I am currently using DMPlex in my code. It runs serially at the moment, but I'm interested in adding parallel options. Here is my workflow (a sketch follows below):
- Create a DMPlex mesh from GMSH.
- Reorder it with DMPlexPermute.
- Create necessary pre-processing arrays related to the mesh/problem.
- Create field(s) with…
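For what it is worth, a minimal sketch of that workflow; the file name "mesh.msh", the RCM ordering, and the overlap of 1 are illustrative assumptions:

  #include <petscdmplex.h>

  int main(int argc, char **argv)
  {
    DM dm, dmDist;
    IS perm;

    PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

    /* 1. Create a DMPlex mesh from a GMSH file */
    PetscCall(DMPlexCreateFromFile(PETSC_COMM_WORLD, "mesh.msh", "mesh", PETSC_TRUE, &dm));

    /* 2. Reorder it with DMPlexPermute */
    PetscCall(DMPlexGetOrdering(dm, MATORDERINGRCM, NULL, &perm));
    {
      DM pdm;
      PetscCall(DMPlexPermute(dm, perm, &pdm));
      PetscCall(DMDestroy(&dm));
      dm = pdm;
    }
    PetscCall(ISDestroy(&perm));

    /* 3. Distribute with one level of overlap so each rank sees ghost cells */
    PetscCall(DMPlexDistribute(dm, 1, NULL, &dmDist));
    if (dmDist) {
      PetscCall(DMDestroy(&dm));
      dm = dmDist;
    }

    /* 4. Create sections/fields on the distributed mesh ... */

    PetscCall(DMDestroy(&dm));
    PetscCall(PetscFinalize());
    return 0;
  }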