> On 22 Aug 2019, at 7:42 AM, Balay, Satish <ba...@mcs.anl.gov> wrote:
>
> On Thu, 22 Aug 2019, Pierre Jolivet via petsc-dev wrote:
>
>> Hello,
>> PETSc is linking “sequential” libraries with MPI libraries.
>> $ otool -L libmetis.dylib
>> /usr/local/opt/mpich/lib/libmpi.12.dylib (compatibility version 14.0.0, current version 14.7.0)
>> $ otool -L libfftw3.dylib
>> /usr/local/opt/mpich/lib/libmpi.12.dylib (compatibility version 14.0.0, current version 14.7.0)
>
> this will occur if one uses mpi compilers to build PETSc.

Why, though? If MPICXX_SHOW != “Unavailable”, is it mandatory to force CXX=MPICXX in the Metis CMake? Wouldn’t it be possible to just extract the underlying compiler binary name and use that as CXX? I understand you don’t want to overcomplicate things or fix something that is not broken (for you), so I’m just making sure that it would be OK if I patch this locally.

>> Is there any way to avoid this, by using a “sequential” compiler and/or linker?
>
> Yes - you can build these (sequential) packages/petsc with --with-mpi=0 [and without mpi compilers]

>> I’m asking because we use PETSc libraries to compile both parallel and sequential wrappers.
>> Our Metis wrapper is marked as a sequential one, but since you are linking libmetis with MPI, this is problematic for some configurations.
>
> Not sure what you mean by 'wrappers' here - esp. 'Metis wrapper'. It's just a library.

It’s just another dynamic library, compiled on top of libmetis, that is then dynamically loaded by a DSL which may or may not be launched with mpirun.

> If you are using petsc build tools to install these packages for a different use [other than the petsc usage specified by configure] - use different petsc builds as indicated above for different packages - as needed.

Having to configure and build PETSc with both real and complex numbers is already long enough. That would mean a third build, but why not.
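To make the suggestion above concrete, here is a minimal sketch of the kind of extraction I have in mind. It assumes an MPICH-style wrapper that supports `-show` (Open MPI uses `--showme:command` instead); the `mpicxx` name and the `c++` fallback are only examples, not what PETSc configure actually does:

```shell
# Sketch: derive a "sequential" CXX from an MPI wrapper compiler.
# If `mpicxx -show` works, its first token is the underlying compiler
# binary; otherwise fall back to the system C++ compiler.
MPICXX_SHOW=$(mpicxx -show 2>/dev/null || echo "Unavailable")
if [ "$MPICXX_SHOW" != "Unavailable" ]; then
    CXX=$(printf '%s\n' "$MPICXX_SHOW" | awk '{print $1}')
else
    CXX=c++   # hypothetical fallback when no MPI wrapper is found
fi
echo "Would pass CXX=$CXX to the Metis CMake"
```

That extracted compiler could then be handed to the Metis build (e.g. `-DCMAKE_CXX_COMPILER="$CXX"`) instead of the MPI wrapper itself.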
Are there any guarantees that CXX with --with-mpi=0 will be the same as the underlying compiler of MPICXX? (I’m thinking of an incompatible libc++ that would make it impossible to link, in the same library, Metis and a PETSc compiled later with --with-mpi=1.)

Thanks,
Pierre

> BTW: Current petsc configure/builder builds only parallel fftw. [it does not support building sequential fftw. But I guess this could be added]
>
> Satish
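P.S. For anyone wanting to reproduce the check that started this thread, a small sketch of how I spot the unwanted MPI dependency; `otool -L` is macOS-specific (on Linux, `ldd` or `readelf -d` would be the equivalent), and the library path is just an example:

```shell
# Sketch: report whether a shared library (perhaps unintentionally)
# links against an MPI runtime library.
check_mpi_linkage() {
    lib="$1"
    # otool -L lists the dynamic libraries a Mach-O binary depends on.
    if otool -L "$lib" 2>/dev/null | grep -q 'libmpi'; then
        echo "$lib links MPI"
    else
        echo "$lib looks sequential (or could not be inspected)"
    fi
}
check_mpi_linkage libmetis.dylib
```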
Re: [petsc-dev] Sequential external packages and MPI
Pierre Jolivet via petsc-dev Wed, 21 Aug 2019 23:04:11 -0700