On 10/6/25 20:10, Matthew Knepley wrote:
On Mon, Oct 6, 2025 at 1:11 PM Aldo Bonfiglioli
<[email protected]> wrote:
Dear all,
what is the best approach for defining vectors that "sit" on the
vertices and/or faces of a given stratum of the "Face Sets" label of a
DMPlex?
DM Object: 3D plex 1 MPI process
type: plex
3D plex in 3 dimensions:
Number of 0-cells per rank: 9261
Number of 1-cells per rank: 59660
Number of 2-cells per rank: 98400
Number of 3-cells per rank: 48000
Labels:
marker: 1 strata with value/size (1 (14402))
celltype: 4 strata with value/size (0 (9261), 1 (59660), 3 (98400), 6 (48000))
depth: 4 strata with value/size (0 (9261), 1 (59660), 2 (98400), 3 (48000))
Face Sets: 6 strata with value/size (1 (800), 2 (800), 3 (800), 4 (800), 5 (800), 6 (800))
These vectors are going to be used (for example) to store stresses
and heat flux on solid surfaces.
To be more specific: suppose stratum 3 of the "Face Sets" is a
solid wall.
I want to create a vector that stores quantities computed on the (800)
faces of that wall OR on the vertices of that wall.
It should be simple to just create such vectors. You request a submesh
using that label
DM subdm;
DMLabel label, sublabel;
DMGetLabel(dm, "Face Sets", &label);
DMLabelDuplicate(label, &sublabel);
DMPlexLabelComplete(dm, sublabel);
DMPlexCreateSubmesh(dm, sublabel, 3, PETSC_TRUE, &subdm);
DMLabelDestroy(&sublabel);
Now you can define a PetscFE over this submesh in the same way as any
other mesh. Moreover, the subdm contains a mapping back to the
original DM, from which you can create a mapping of dofs, so that you
can inject the subvector into a larger field if you wish.
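For example, here is a minimal sketch (error checking omitted) of querying
that mapping: DMPlexGetSubpointIS() gives, for each point of the submesh,
the corresponding point number in the original dm.

IS subpointIS;
const PetscInt *subpoint;
PetscInt n, p;

DMPlexGetSubpointIS(subdm, &subpointIS); /* borrowed from subdm, do not destroy */
ISGetLocalSize(subpointIS, &n);
ISGetIndices(subpointIS, &subpoint);
for (p = 0; p < n; ++p) {
  /* match the dofs of point p in the submesh section with those of
     point subpoint[p] in the original mesh section */
}
ISRestoreIndices(subpointIS, &subpoint);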
If you want to use fields on submeshes inside a PetscDS, so that the
Plex manages the solve, the procedure is slightly different, but I can
detail it if you want.
Thanks,
Matt
Thanks,
Aldo
--
Dr. Aldo Bonfiglioli
Associate professor of Fluid Mechanics
Dipartimento di Ingegneria
Universita' della Basilicata
V.le dell'Ateneo Lucano, 10 85100 Potenza ITALY
tel:+39.0971.205203 fax:+39.0971.205215
web: http://docenti.unibas.it/site/home/docente.html?m=002423
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/
Matthew,
I followed your suggestions, but I hit a deadlock when the enclosed
demonstrator is run in parallel (with the attached options file, the
deadlock occurs when nproc = 3).
I suspect this is because, in a parallel run, the various strata of the
"Face Sets" label are not necessarily present on every process.
Therefore, not all processes end up calling DMPlexCreateSubmesh (which
is collective) with a given "value".
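Would something along these lines be the right way to make every rank
loop over the same set of values? (A rough, untested sketch, written in
C for brevity; it builds the global union of the label values so that
each rank takes part in every collective DMPlexCreateSubmesh call, even
for values that have no points on that rank.)

DMLabel label;
IS localIS, distIS, unionIS;
const PetscInt *lvals, *vals;
PetscInt nloc, nvals, v;

DMGetLabel(dm, "Face Sets", &label);
DMLabelGetValueIS(label, &localIS);  /* only the values present on this rank */
ISGetLocalSize(localIS, &nloc);
ISGetIndices(localIS, &lvals);
ISCreateGeneral(PetscObjectComm((PetscObject)dm), nloc, lvals, PETSC_COPY_VALUES, &distIS);
ISRestoreIndices(localIS, &lvals);
ISAllGather(distIS, &unionIS);       /* concatenation of all ranks' values, available on every rank */
ISSortRemoveDups(unionIS);           /* sorted union, identical on every rank */
ISGetLocalSize(unionIS, &nvals);
ISGetIndices(unionIS, &vals);
for (v = 0; v < nvals; ++v) {
  /* duplicate/complete the label and call
     DMPlexCreateSubmesh(dm, sublabel, vals[v], PETSC_TRUE, &subdm[v]) here;
     on ranks where vals[v] has no local points the submesh would simply be empty */
}
ISRestoreIndices(unionIS, &vals);
ISDestroy(&unionIS);
ISDestroy(&distIS);
ISDestroy(&localIS);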
The attached piece of code links with petsc-3.24.0.
Thanks,
Aldo
--
Dr. Aldo Bonfiglioli
Associate professor of Fluid Mechanics
Dipartimento di Ingegneria
Universita' della Basilicata
V.le dell'Ateneo Lucano, 10 85100 Potenza ITALY
tel:+39.0971.205203 fax:+39.0971.205215
web: http://docenti.unibas.it/site/home/docente.html?m=002423
program read_gmsh_dmplex
#include "petsc/finclude/petscdmplex.h"
#include "petsc/finclude/petscdmlabel.h"
#include "petsc/finclude/petscds.h"
use petscdm
implicit none
type(tDM) :: dm
type(tDM), allocatable :: subdm(:)
DMLabel :: label, sublabel
IS :: values
integer :: ierr
Vec :: u
character(len=256) :: filename
character(len=8) :: stringa = "patch_nn"
character(len=7) :: objname
PetscBool :: markedFaces
integer :: i, j
integer :: my_pe, numprocs
PetscInt :: ival, nstrata
PetscReal :: r
PetscInt, pointer :: idx(:)
PetscInt :: ndim
PetscViewer :: viewer
character(len=PETSC_MAX_PATH_LEN) :: IOBuffer
! Initialize PETSc
call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
if (ierr /= 0) stop 'Error initializing PETSc'
call MPI_Comm_rank(PETSC_COMM_WORLD, my_pe, ierr)
call MPI_Comm_size(PETSC_COMM_WORLD, numprocs, ierr)
!
!
! Create DM from command-line options
!
PetscCall(DMPlexCreate(PETSC_COMM_WORLD, dm, ierr))
PetscCall(DMSetFromOptions(dm, ierr))
PetscCall(DMGetDimension(dm, ndim, ierr))
objname = "0D plex"
write (objname(1:1), FMT="(I1.1)") ndim
PetscCall(PetscObjectSetName(dm, trim(objname), ierr))
! PetscCall(DMViewFromOptions(dm, PETSC_NULL_VEC, '-dm_view', ierr))
PetscCall(DMSetUp(dm, ierr))
! Distribute the mesh for parallel processing
! if (numprocs .gt. 1) then
! call DMPlexDistribute(dm, 0, PETSC_NULL_SF, dm, ierr)
! if (ierr /= 0) stop 'Error distributing mesh'
! end if
! View the mesh (optional)
call DMView(dm, PETSC_VIEWER_STDOUT_WORLD, ierr)
PetscCallA(DMGetLabel(dm, "Face Sets", label, ierr))
if (PetscObjectIsNull(label)) then
write (IOBuffer, *) "Face Sets returned a NULL label\n"
PetscCallA(PetscPrintf(PETSC_COMM_SELF, IOBuffer, ierr))
stop
else
write (IOBuffer, *) "Face Sets returned a GOOD label\n"
end if
PetscCallA(PetscPrintf(PETSC_COMM_WORLD, IOBuffer, ierr))
!
!
! PetscCallA(DMLabelView(label, PETSC_VIEWER_STDOUT_SELF, ierr))
!
PetscCallA(DMLabelGetValueIS(label, values, ierr))
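! note: the label is a local object, so this IS contains only the values present on this rank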
! PetscCallA(ISView(values, PETSC_VIEWER_STDOUT_SELF, ierr))
PetscCallA(ISGetSize(values, nstrata, ierr))
markedFaces = PETSC_TRUE
allocate (subdm(nstrata))
PetscCallA(ISGetIndices(values, idx, ierr))
write (IOBuffer, FMT=*) "[", my_pe, "] found these ", nstrata, " strata in Face Sets: ", (idx(j), j=1, nstrata), "\n\n\n"
PetscCall(PetscSynchronizedPrintf(PETSC_COMM_WORLD, IOBuffer, ierr))
PetscCall(PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT, ierr))
do i = 1, nstrata ! this is proc-dependent in the parallel case
ival = idx(i)
write (stringa(7:8), fmt="(I2.2)") ival
write (IOBuffer, FMT=*) "[", my_pe, "] now processing ", stringa, "\n"
PetscCallA(PetscPrintf(PETSC_COMM_SELF, IOBuffer, ierr))
PetscCallA(DMLabelDuplicate(label, sublabel, ierr))
PetscCallA(DMPlexLabelComplete(dm, sublabel, ierr))
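! note: DMPlexCreateSubmesh() is collective on dm, so every rank must reach this call for the same value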
PetscCallA(DMPlexCreateSubmesh(dm, sublabel, ival, markedFaces, subdm(i), ierr))
PetscCallA(DMLabelDestroy(sublabel, ierr))
!
call createsectionalternate(subdm(i))
call DMView(subdm(i), PETSC_VIEWER_STDOUT_WORLD, ierr)
!
end do
#if 0
do i = 1, nstrata
PetscCallA(DMCreateGlobalVector(subdm(i), u, ierr))
r = real(ival)
PetscCallA(VecSet(u, 0.d0, ierr))
filename = stringa//".vtu"
write (IOBuffer, FMT=*) "Backing up the solution to ", trim(filename), "\n"
PetscCallA(PetscPrintf(PETSC_COMM_WORLD, IOBuffer, ierr))
PetscCallA(PetscViewerCreate(PETSC_COMM_WORLD, viewer, ierr))
PetscCallA(PetscViewerSetType(viewer, PETSCVIEWERVTK, ierr))
PetscCallA(PetscViewerFileSetName(viewer, trim(filename), ierr))
PetscCallA(VecView(u, viewer, ierr))
PetscCallA(PetscViewerDestroy(viewer, ierr))
end do
#endif
PetscCallA(ISRestoreIndices(values, idx, ierr))
! Destroy DMPlex object
call DMDestroy(dm, ierr)
if (ierr /= 0) stop 'Error destroying DMPlex'
do i = 1, nstrata
PetscCallA(DMDestroy(subdm(i), ierr))
end do
deallocate (subdm)
! Finalize PETSc
call PetscFinalize(ierr)
if (ierr /= 0) stop 'Error finalizing PETSc'
contains
!
subroutine CreateSectionAlternate(dm)
! this subroutine is taken from https://petsc.org/release/src/dm/impls/plex/tutorials/ex1f90.F90.html
#include "petsc/finclude/petscdmplex.h"
#include "petsc/finclude/petscdmlabel.h"
use petscdmplex
use petscsys
implicit none
DM :: dm
PetscSection :: section
PetscInt :: ndim, numFields, numBC
PetscInt :: i
PetscInt, target, dimension(3) :: numComp
PetscInt, pointer :: pNumComp(:)
PetscInt, target, dimension(12) :: numDof
PetscInt, pointer :: pNumDof(:)
PetscInt, parameter :: zero = 0, one = 1, two = 2
PetscInt :: f, d
PetscErrorCode :: ierr
character(len=PETSC_MAX_PATH_LEN) :: IOBuffer
PetscMPIInt :: my_pe
logical, parameter :: verbose = .false.
call MPI_Comm_rank(PETSC_COMM_WORLD, my_pe, ierr)
PetscCall(DMGetDimension(dm, ndim, ierr))
! Create a scalar field u, a vector field v, and a surface vector field w
numFields = 1
numComp(1) = 1
! numComp(2) = ndim
! numComp(3) = ndim-1
pNumComp => numComp
do i = 1, numFields*(ndim + 1)
numDof(i) = 0
end do
! numDof[f*(dim+1)+d] gives the number of dof for field f on points of dimension d. For instance, numDof[1] is the number of dof for field 0 on each edge.
! a point of dimension d=0 is a vertex
! a point of dimension d=1 is an edge
! points of dimension d=ndim-1 are the faces; in 2D they coincide with the edges
! points of dimension d=ndim are the cells; in 2D they coincide with the faces
!
!    1D (dim=1) | 2D (dim=2)  | 3D (dim=3)  |
! --------------+-------------+-------------+
!    vertex     |  vertex     |  vertex     | d=0
! --------------+-------------+-------------+
!    edge       |  edge       |  edge       | d=1
! --------------+-------------+-------------+
!    vertex     |  edge       |  triangle   | d=dim-1
! --------------+-------------+-------------+
!    edge       |  triangle   |  tetra      | d=dim
! --------------+-------------+-------------+
!
!
f = numFields - 1 ! field #
d = 0 ! vertices
!
! numDof[f*(dim+1)+d] gives the number of dof for field f on points of dimension d. For instance, numDof[1] is the number of dof for field 0 on each edge.
!
! Let u be defined on vertices
numDof(f*(ndim + 1) + d + 1) = 1
! ^
! |
! |
! fortran 1-based
!
! Let v be defined on cells
! numDof(1*(ndim+1)+ndim+1) = ndim
! Let v be defined on faces
! numDof(2*(ndim+1)+ndim) = ndim-1
do i = 1, numFields*(ndim + 1)
write (IOBuffer, *) "numDof(", i, ")= ", numDof(i), "\n"
PetscCall(PetscPrintf(PETSC_COMM_WORLD, IOBuffer, ierr))
end do
pNumDof => numDof
!
! which stratum should I select?
!
! Get the nof points in depth=1 that are marked; should be 2 in 1D
!
! numBC - The number of boundary conditions
!
numBC = 0 ! Bcs are handled by zeroing selected rows
!
! bcField - An array of size numBC giving the field number for each boundary condition
!
! bcComps - [Optional] An array of size numBC giving an IS holding the field components to which each boundary condition applies
!
! bcPoints - An array of size numBC giving an IS holding the DMPLEX points to which each boundary condition applies
!
! Create a PetscSection with this data layout
PetscCall(DMSetNumFields(dm, numFields, ierr))
! PetscErrorCode DMPlexCreateSection(DM dm, DMLabel label[], const PetscInt numComp[], const PetscInt numDof[], PetscInt numBC, const PetscInt bcField[], const IS bcComps[], const IS bcPoints[], IS perm, PetscSection *section)
! PetscCall(DMPlexCreateSection(dm,nolabel,pNumComp,pNumDof,numBC,pBcField,pBcCompIS,pBcPointIS,PETSC_NULL_IS,section,ierr))
PetscCall(DMPlexCreateSection(dm,PETSC_NULL_DMLABEL_ARRAY,numComp,numDof,numBC,PETSC_NULL_INTEGER_ARRAY,PETSC_NULL_IS_ARRAY,PETSC_NULL_IS_ARRAY,PETSC_NULL_IS,section, ierr))
! PetscCall(DMPlexCreateSection(dm,nolabel,pNumComp,pNumDof,numBC,pBCField,PETSC_NULL_IS,PETSC_NULL_IS,PETSC_NULL_IS,section,ierr))
! Name the Field variables
PetscCall(PetscSectionSetFieldName(section, zero, "Roe's parameter vector", ierr))
if (verbose) PetscCall(PetscSectionView(section, PETSC_VIEWER_STDOUT_WORLD, ierr))
! Tell the DM to use this data layout
PetscCall(DMSetLocalSection(dm, section, ierr))
! Cleanup
PetscCall(PetscSectionDestroy(section, ierr))
return
end subroutine CreateSectionAlternate
end program read_gmsh_dmplex
-dm_plex_dim 3
-dm_plex_shape box
-dm_plex_box_faces 20,20,20
-dm_plex_box_lower 0.,0.,0.
-dm_plex_box_upper 1.,1.,1.
##-dm_plex_filename cube6.msh
-dm_plex_simplex true
-dm_plex_interpolate
-dm_plex_check_all
#-dm_view
##-dm_plex_view_labels "marker"
##-dm_plex_view_labels "Face Sets"
-petscpartitioner_view
####-dm_petscsection_view
-options_left