On Mon, 26 Aug 2019 at 21:26, Smith, Barry F. <bsm...@mcs.anl.gov> wrote:
> > On Aug 26, 2019, at 10:11 AM, Lisandro Dalcin <dalc...@gmail.com> wrote:
> >
> > On Sun, 25 Aug 2019 at 18:37, Smith, Barry F. via petsc-dev <petsc-dev@mcs.anl.gov> wrote:
> > >
> > >    Metis is installed.
> > >
> > >    config/PETSc/Configure.py:  self.addDefine('HAVE_'+i.PACKAGE.replace('-','_'), 1)  # ONLY list package if it is used directly by PETSc (and not only by another package)
> > >
> > >    Since metis is not used by PETSc we don't set the PETSC_HAVE_METIS flag because PETSc doesn't need it.
> >
> > This is not true: in DMPlex we have an explicit call to METIS_PartGraphKway(). Of course, this code is protected with just HAVE_PARMETIS, because if you have parmetis, then you have metis.
>
> Hmm, I think it should be protected with HAVE_METIS. There is nothing that says PETSc can only be built with metis plus parmetis; it could be built with just metis.

Sorry, maybe I was not clear enough. The DMPlex code I'm talking about uses both ParMETIS and METIS, depending on whether the input graph is sequential or parallel.

https://gitlab.com/petsc/petsc/blob/master/src/dm/impls/plex/plexpartition.c#L1799

So this code requires ParMETIS, and if you have ParMETIS, then you have METIS, so a HAVE_METIS check seems a bit redundant to me. Do you still think we should change it? Or maybe we should use `#if defined(PETSC_HAVE_METIS) && defined(PETSC_HAVE_PARMETIS)`?

Anything to add, Matthew?

PS: In any case, I think configure should emit PETSC_HAVE_METIS; we are really using it.

-- 
Lisandro Dalcin
============
Research Scientist
Extreme Computing Research Center (ECRC)
King Abdullah University of Science and Technology (KAUST)
http://ecrc.kaust.edu.sa/