[petsc-dev] Structure of Elemental changed?

2019-04-10 Thread Victor Eijkhout via petsc-dev
This rpm was made with 3.10.3:

root@build-BLDCHROOT:SPECS # rpm -qlp ../RPMS/x86_64/tacc-petsc-intel18-impi18_0-package-3.10-4.el7.x86_64.rpm | grep libElSuite
/home1/apps/intel18/impi18_0/petsc/3.10/skylake-debug/lib/libElSuiteSparse.so
/home1/apps/intel18/impi18_0/petsc/3.10/skylake-debug/lib/libElSuiteSparse.so.0
/home1/apps/intel18/impi18_0/petsc/3.10/skylake-debug/lib/libElSuiteSparse.so.87

This one was made with (as best I can tell) 3.10.4:

root@build-BLDCHROOT:SPECS # rpm -qlp ../RPMS/x86_64/tacc-petsc-intel18-impi18_0-package-3.10-5.el7.x86_64.rpm | grep libElSuite
/home1/apps/intel18/impi18_0/petsc/3.10/skylake-debug/lib/libElSuiteSparse.so

Did something change in Elemental so that it now generates only one library?

I think I now need to rebuild libmesh, because it seems to “know” that there is 
an .so.0 file.
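For anyone hitting the same thing, a quick way to compare the two sides is to look at the SONAME the new library advertises versus the one an existing libmesh build was linked against (paths and library names below are illustrative, not taken from the rpm listings above):

  # SONAME recorded in the freshly built library, if any
  readelf -d /path/to/petsc/3.10/lib/libElSuiteSparse.so | grep SONAME
  # the SONAME an already-built libmesh still expects
  ldd /path/to/libmesh/lib/libmesh.so | grep -i elsuitesparse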

Can I suggest that you limit such library changes to minor version updates, rather than point updates? I’m working on the assumption that I can install something as “petsc/3.10” for our users and that it keeps working for them when I make a point update to the installation.

Victor.





Re: [petsc-dev] Elemental & 64-bit int

2019-04-02 Thread Victor Eijkhout via petsc-dev
That seems to have fixed it. Thanks.

Victor.


> On Apr 1, 2019, at 8:02 PM, Balay, Satish wrote:
> 
> On Tue, 2 Apr 2019, Victor Eijkhout via petsc-dev wrote:
> 
>> Configuring with elemental & 64-bit integers:
>> 
>> Cannot use elemental with 64 bit BLAS/Lapack indices
>> 
>> This used to work in 3.10. What changed?
> 
> with MKL?
> 
> Try:
> 
> https://bitbucket.org/petsc/petsc/pull-requests/1489/revert-commit-69ab9685009e-so-that-blas/diff#Lconfig/BuildSystem/config/packages/BlasLapack.pyF215T215
> 
> Satish



Re: [petsc-dev] Elemental & 64-bit int

2019-04-01 Thread Victor Eijkhout via petsc-dev


> On Apr 1, 2019, at 8:02 PM, Balay, Satish wrote:
> 
> On Tue, 2 Apr 2019, Victor Eijkhout via petsc-dev wrote:
> 
>> Configuring with elemental & 64-bit integers:
>> 
>> Cannot use elemental with 64 bit BLAS/Lapack indices
>> 
>> This used to work in 3.10. What changed?
> 
> with MKL?

Yes.

> 
> Try:
> 
> https://bitbucket.org/petsc/petsc/pull-requests/1489/revert-commit-69ab9685009e-so-that-blas/diff#Lconfig/BuildSystem/config/packages/BlasLapack.pyF215T215
> 

Sorry, I’m not that much into git. Can you spell out how to apply this, and what to apply it to?

Victor.

> Satish
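A generic recipe for trying such a pull request against a petsc source tree (the branch name and diff file name below are placeholders, not details taken from this thread):

  # in a git clone: fetch and check out the PR's source branch
  git fetch origin BRANCH-NAME && git checkout FETCH_HEAD
  # or, in an unpacked release tarball: download the PR's diff and apply it
  patch -p1 < pr-1489.diff
  # then re-run ./configure and make as usual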



[petsc-dev] Elemental & 64-bit int

2019-04-01 Thread Victor Eijkhout via petsc-dev
Configuring with elemental & 64-bit integers:

Cannot use elemental with 64 bit BLAS/Lapack indices

This used to work in 3.10. What changed?
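For context, the failing combination is roughly the following (a sketch; the full option list lives in my build script and is not reproduced here):

  ./configure --with-64-bit-indices=1 \
      --with-blas-lapack-dir=${MKLROOT} \
      --download-elemental=1 ...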

Victor.



Re: [petsc-dev] [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-28 Thread Victor Eijkhout via petsc-dev


On Mar 27, 2019, at 8:30 AM, Matthew Knepley <knep...@gmail.com> wrote:

I think Satish now prefers

  --with-cc=${MPICH_HOME}/mpicc --with-cxx=${MPICH_HOME}/mpicxx 
--with-fc=${MPICH_HOME}/mpif90


That still requires --with-mpi:

***
 UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for details):
---
Did not find package MPI needed by PTScotch.
Enable the package using --with-mpi
***

Makes sense, actually, but I ran into it.
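Adding the flag the error message asks for, the invocation looks roughly like this (a sketch; whether it is sufficient depends on the rest of the option set):

  ./configure --with-mpi=1 \
      --with-cc=${MPICH_HOME}/mpicc --with-cxx=${MPICH_HOME}/mpicxx \
      --with-fc=${MPICH_HOME}/mpif90 \
      --download-ptscotch=1 ...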

Victor.



Re: [petsc-dev] [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-27 Thread Victor Eijkhout via petsc-dev
module load mkl

use petsc options:

--with-blas-lapack-dir=${MKLROOT}

--with-cc=${MPICH_HOME}/mpicc --with-cxx=${MPICH_HOME}/mpicxx 
--with-fc=${MPICH_HOME}/mpif90

Sorry about the extraneous crap. I was cutting/pasting from my own much longer 
script.
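Put together, the relevant part of the configure call is roughly (a sketch; the --download options for the other packages stay as they were):

  module load mkl
  ./configure \
      --with-blas-lapack-dir=${MKLROOT} \
      --with-cc=${MPICH_HOME}/mpicc --with-cxx=${MPICH_HOME}/mpicxx \
      --with-fc=${MPICH_HOME}/mpif90 \
      ...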

V.
On Mar 27, 2019, at 8:47 AM, Mark Adams <mfad...@lbl.gov> wrote:

So are these the instructions that I should give him? This grad student is a quick study but he has no computing background. We don't care what we use, we just want it to work (easily).

Thanks

Do not use "--download-fblaslapack=1". Set it to 0. Same for 
"--download-mpich=1".

Now do:

> module load mkl

> export BLAS_LAPACK_LOAD=--with-blas-lapack-dir=${MKLROOT}

>  export PETSC_MPICH_HOME="${MPICH_HOME}"

And use

--with-cc=${MPICH_HOME}/mpicc --with-cxx=${MPICH_HOME}/mpicxx 
--with-fc=${MPICH_HOME}/mpif90

instead of clang++

On Wed, Mar 27, 2019 at 9:30 AM Matthew Knepley <knep...@gmail.com> wrote:
On Wed, Mar 27, 2019 at 8:55 AM Victor Eijkhout via petsc-dev <petsc-dev@mcs.anl.gov> wrote:
On Mar 27, 2019, at 7:29 AM, Mark Adams <mfad...@lbl.gov> wrote:

How should he configure for this? Remove "--download-fblaslapack=1" and add:

1. If using gcc

module load mkl

with either compiler:

export BLAS_LAPACK_LOAD=--with-blas-lapack-dir=${MKLROOT}

2.  We define MPICH_HOME for you.

With Intel MPI:

  export PETSC_MPICH_HOME="${MPICH_HOME}/intel64"
  export mpi="--with-mpi-compilers=1 --with-mpi-include=${TACC_IMPI_INC} --with-mpi-lib=${TACC_IMPI_LIB}/release_mt/libmpi.so"

with mvapich:

  export PETSC_MPICH_HOME="${MPICH_HOME}"
  export mpi="--with-mpi-compilers=1 --with-mpi-dir=${PETSC_MPICH_HOME}"

(looks like a little redundancy in my script)

I think Satish now prefers

  --with-cc=${MPICH_HOME}/mpicc --with-cxx=${MPICH_HOME}/mpicxx 
--with-fc=${MPICH_HOME}/mpif90

  Thanks,

Matt

Victor.



--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/



Re: [petsc-dev] [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-27 Thread Victor Eijkhout via petsc-dev


On Mar 27, 2019, at 7:29 AM, Mark Adams <mfad...@lbl.gov> wrote:

How should he configure for this? Remove "--download-fblaslapack=1" and add:

1. If using gcc

module load mkl

with either compiler:

export BLAS_LAPACK_LOAD=--with-blas-lapack-dir=${MKLROOT}

2.  We define MPICH_HOME for you.

With Intel MPI:

  export PETSC_MPICH_HOME="${MPICH_HOME}/intel64"
  export mpi="--with-mpi-compilers=1 --with-mpi-include=${TACC_IMPI_INC} --with-mpi-lib=${TACC_IMPI_LIB}/release_mt/libmpi.so"

with mvapich:

  export PETSC_MPICH_HOME="${MPICH_HOME}"
  export mpi="--with-mpi-compilers=1 --with-mpi-dir=${PETSC_MPICH_HOME}"

(looks like a little redundancy in my script)
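These exports are then meant to be expanded on the configure line, roughly like this (a sketch, not the literal line from my script):

  ./configure ${BLAS_LAPACK_LOAD} ${mpi} ...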

Victor.



Re: [petsc-dev] [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-26 Thread Victor Eijkhout via petsc-dev


On Mar 26, 2019, at 6:25 PM, Mark Adams via petsc-dev <petsc-dev@mcs.anl.gov> wrote:

/home1/04906/bonnheim/olympus-keaveny/Olympus/olympus.petsc-3.9.3.skx-cxx-O on a skx-cxx-O named c478-062.stampede2.tacc.utexas.edu with 4800 processors, by bonnheim Fri Mar 15 04:48:27 2019

I see you’re still using a petsc build that uses the reference blas/lapack, and ethernet instead of Intel OPA:

Configure Options: --configModules=PETSc.Configure 
--optionsModule=config.compilerOptions --with-cc++=clang++ COPTFLAGS="-g 
-mavx2" CXXOPTFLAGS="-g -mavx2" FOPTFLAGS="-g -mavx2" --download-mpich=1 
--download-hypre=1 --download-metis=1 --download-parmetis=1 --download-c2html=1 
--download-ctetgen --download-p4est=1 --download-superlu_dist 
--download-superlu --download-triangle=1 --download-hdf5=1 
--download-fblaslapack=1 --download-zlib --with-x=0 --with-debugging=0 
PETSC_ARCH=skx-cxx-O --download-chaco --with-viewfromoptions=1
Working directory: /home1/04906/bonnheim/petsc-3.9.3

I alerted you guys about this months ago.
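Concretely, and condensing the advice from elsewhere in this thread into a sketch: drop --download-fblaslapack=1 and --download-mpich=1 from the options above, and point configure at MKL and the system MPI wrappers instead,

  module load mkl
  ./configure --with-blas-lapack-dir=${MKLROOT} \
      --with-cc=${MPICH_HOME}/mpicc --with-cxx=${MPICH_HOME}/mpicxx \
      --with-fc=${MPICH_HOME}/mpif90 \
      ...   # remaining --download options unchanged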

Victor.



Re: [petsc-dev] HYPRE_LinSysCore.h

2019-01-29 Thread Victor Eijkhout via petsc-dev


On Jan 29, 2019, at 3:58 PM, Balay, Satish <ba...@mcs.anl.gov> wrote:

-args.append('--without-fei')

The late-1990s Finite Element Interface?

I’ll enable it and see if anyone complains about it breaking whatever.
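A sketch of one way to pick that change up locally, assuming the flag lives in config/BuildSystem/config/packages/hypre.py as Satish’s diff suggests:

  # remove the '--without-fei' flag that petsc passes to hypre's configure,
  # then re-run petsc's configure with --download-hypre so hypre is rebuilt
  sed -i '/--without-fei/d' config/BuildSystem/config/packages/hypre.py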

Victor.


[petsc-dev] HYPRE_LinSysCore.h

2019-01-29 Thread Victor Eijkhout via petsc-dev
I’ve been happily freeloading on the petsc installation, in the sense that I “install” things like hypre on our clusters simply by pointing into the petsc installation.

Until of course someone needs a bit that does not get installed by petsc.

In this case: HYPRE_LinSysCore.h

Does the petsc hypre installation pick and choose what parts of hypre to 
install? Can I exert some influence on this?
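For reference, a quick way to see what the petsc-driven hypre build actually installed, and where its configure flags come from (a sketch; assumes the usual PETSC_DIR/PETSC_ARCH layout):

  # headers installed by the --download-hypre build
  ls ${PETSC_DIR}/${PETSC_ARCH}/include | grep -i hypre
  # flags petsc passes to hypre's configure
  grep 'args.append' ${PETSC_DIR}/config/BuildSystem/config/packages/hypre.py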

Victor.



[petsc-dev] SuperLU_dist 6.0.0 ?

2019-01-09 Thread Victor Eijkhout via petsc-dev
I cannot find my previous correspondence with you guys about this topic.

Has SLU_DIST 6.0.0 been incorporated in the latest petsc?

%%
There is a new release of the SuperLU_DIST package, version 6.0.0 (released on Sept 23rd), which improves strong scaling in the triangular solve stage.
%%

Last time we talked there was some incompatibility that Sherry needed to 
address.
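For reference, one way to check which SuperLU_DIST a given petsc tarball pulls in (a sketch; assumes the usual BuildSystem layout):

  grep -i -e gitcommit -e download \
      config/BuildSystem/config/packages/SuperLU_DIST.py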

Victor.



[petsc-dev] Counter examples in project management

2018-11-19 Thread Victor Eijkhout via petsc-dev
[image attachment]
Sorry. I have to vent to *someone*.

V.