> On 13 Apr 2020, at 4:42 PM, Matthew Knepley <knep...@gmail.com> wrote:
>
> On Mon, Apr 13, 2020 at 6:14 AM Pierre Jolivet <pierre.joli...@enseeiht.fr> wrote:
> Hello,
> I’m trying to figure out why the result of DMGetNeighbors can be nonsymmetric when a DMPlex is distributed with no overlap.
>
> Right now, the Plex function only calls PetscSFGetRootRanks(). It should probably call PetscSFGetLeafRanks() as well and merge the lists.
> Have time to make the fix? You could commit this nice test :)
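For reference, a minimal sketch of the merge being suggested: take the union of the point SF's root ranks (the processes owning the roots of our leaves, i.e. the ones we receive from) and its leaf ranks (the processes whose leaves reference our roots, i.e. the ones we send to). This is an untested illustration, the helper name is invented, and the caller is assumed to free the result with PetscFree():

#include <petscsf.h>

static PetscErrorCode SFGetSymmetricNeighbors(PetscSF sf, PetscInt *n, PetscMPIInt **neighbors)
{
  PetscInt           nr, nl, i;
  const PetscMPIInt *rranks, *lranks;
  PetscMPIInt       *merged;
  PetscErrorCode     ierr;

  PetscFunctionBegin;
  ierr = PetscSFSetUp(sf);CHKERRQ(ierr);
  /* ranks owning the roots of our leaves (we receive from them) */
  ierr = PetscSFGetRootRanks(sf, &nr, &rranks, NULL, NULL, NULL);CHKERRQ(ierr);
  /* ranks whose leaves reference our roots (we send to them) */
  ierr = PetscSFGetLeafRanks(sf, &nl, &lranks, NULL, NULL);CHKERRQ(ierr);
  ierr = PetscMalloc1(nr + nl, &merged);CHKERRQ(ierr);
  for (i = 0; i < nr; ++i) merged[i]      = rranks[i];
  for (i = 0; i < nl; ++i) merged[nr + i] = lranks[i];
  *n = nr + nl;
  /* a rank may appear in both lists, so sort and remove duplicates */
  ierr = PetscSortRemoveDupsMPIInt(n, merged);CHKERRQ(ierr);
  *neighbors = merged;
  PetscFunctionReturn(0);
}

With such a union inside DMGetNeighbors(), both ranks in the two-process run quoted below should report one neighbor, matching the overlapping case.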
Sure, I’ll try to give it a go.
Thanks,
Pierre

> Thanks,
>
>   Matt
>
> Is this intended? If so, is there some other easy way to get the equivalent list as in the overlapping case (I’m interested in the list of ranks sharing at least one point)?
> Here is a MWE.
> $ patch -p1 < patch.txt
> $ cd src/dm/impls/plex/tests/
> $ make ex1
> $ mpirun -n 2 ./ex1 -dim 2 -cell_simplex 0 -interpolate -domain_shape box -domain_box_sizes 3,3 -test_redistribute
> 1 <— how come rank 0 has 1 neighbor
> 0 <— but rank 1 has 0 neighbors?
> $ mpirun -n 2 ./ex1 -dim 2 -cell_simplex 0 -interpolate -domain_shape box -domain_box_sizes 3,3 -overlap 1
> 1
> 1
> Overlap: 1
>
> Thanks,
> Pierre
>
> diff --git a/src/dm/impls/plex/tests/ex1.c b/src/dm/impls/plex/tests/ex1.c
> index 4a5d051a66..ac2ce8eb20 100644
> --- a/src/dm/impls/plex/tests/ex1.c
> +++ b/src/dm/impls/plex/tests/ex1.c
> @@ -3,0 +4 @@ static char help[] = "Tests various DMPlex routines to construct, refine and dis
> +#include <petscsf.h>
> @@ -420,0 +422,8 @@ PetscErrorCode CreateMesh(MPI_Comm comm, AppCtx *user, DM *dm)
> +  PetscInt           nranks;
> +  const PetscMPIInt *ranks;
> +  PetscSF            sf;
> +  ierr = DMGetPointSF(distributedMesh, &sf);CHKERRQ(ierr);
> +  ierr = PetscSFSetUp(sf);CHKERRQ(ierr);
> +  ierr = DMGetNeighbors(distributedMesh, &nranks, &ranks);CHKERRQ(ierr);
> +  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "%D\n", nranks);CHKERRQ(ierr);
> +  ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);
> @@ -434,0 +444,8 @@ PetscErrorCode CreateMesh(MPI_Comm comm, AppCtx *user, DM *dm)
> +  PetscInt           nranks;
> +  const PetscMPIInt *ranks;
> +  PetscSF            sf;
> +  ierr = DMGetPointSF(overlapMesh, &sf);CHKERRQ(ierr);
> +  ierr = PetscSFSetUp(sf);CHKERRQ(ierr);
> +  ierr = DMGetNeighbors(overlapMesh, &nranks, &ranks);CHKERRQ(ierr);
> +  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "%D\n", nranks);CHKERRQ(ierr);
> +  ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/