Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is undefined in 3.10.4

2019-03-26 Thread Kun Jiao via petsc-users
Hi Richard,

Understood! Thanks very much for your advice.

Regards,
Kun




Schlumberger-Private
From: Mills, Richard Tran 
Sent: Tuesday, March 26, 2019 8:11 PM
To: petsc-users@mcs.anl.gov
Cc: Kun Jiao 
Subject: Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is 
undefined in 3.10.4

Hi Kun,

I'm the author of most of the AIJMKL stuff in PETSc. My apologies for having 
inadvertently omitted these function prototypes for these interfaces; I'm glad 
that Satish's patch has fixed this.

I want to point out that -- though I can envision some scenarios in which one 
would want to use the MatCreateXXXAIJMKL interfaces -- most of the time I would 
recommend against using these directly. Instead, I would recommend simply 
creating AIJ matrices and then setting them to the AIJMKL sub-types via the 
PETSc options database. (Via the command line, this could be done by specifying 
something like "-mat_seqaij_type seqaijmkl" to indicate that all of the 
"sequential" AIJ matrices that make up an "MPI" AIJ matrix should be of type 
SEQAIJMKL.) Because this is how I usually do things, my testing had not 
uncovered the missing function prototypes.

Best regards,
Richard


Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is undefined in 3.10.4

2019-03-26 Thread Mills, Richard Tran via petsc-users
Hi Kun,

I'm the author of most of the AIJMKL stuff in PETSc. My apologies for having 
inadvertently omitted these function prototypes for these interfaces; I'm glad 
that Satish's patch has fixed this.

I want to point out that -- though I can envision some scenarios in which one 
would want to use the MatCreateXXXAIJMKL interfaces -- most of the time I would 
recommend against using these directly. Instead, I would recommend simply 
creating AIJ matrices and then setting them to the AIJMKL sub-types via the 
PETSc options database. (Via the command line, this could be done by specifying 
something like "-mat_seqaij_type seqaijmkl" to indicate that all of the 
"sequential" AIJ matrices that make up an "MPI" AIJ matrix should be of type 
SEQAIJMKL.) Because this is how I usually do things, my testing had not 
uncovered the missing function prototypes.
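
As a rough sketch of that recommended workflow (the sizes and assembly below are placeholders, not taken from any particular code in this thread):

#include <petscmat.h>

int main(int argc,char **argv)
{
  Mat            A;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,100,100);CHKERRQ(ierr);  /* placeholder sizes */
  ierr = MatSetType(A,MATAIJ);CHKERRQ(ierr);        /* plain AIJ; any MKL sub-type is chosen at runtime */
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);        /* reads the options database, e.g. -mat_seqaij_type */
  ierr = MatSetUp(A);CHKERRQ(ierr);
  /* ... MatSetValues()/MatAssemblyBegin()/MatAssemblyEnd() as usual ... */
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Running the resulting executable with "-mat_seqaij_type seqaijmkl" then switches the sequential blocks to SEQAIJMKL without touching the source.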

Best regards,
Richard



On 3/26/19 2:37 PM, Kun Jiao via petsc-users wrote:

And yes, by applying the patch in the petscmat.h, everything works.
Thanks for the help.

Regards,
Kun



Schlumberger-Private

-Original Message-
From: Balay, Satish 
Sent: Tuesday, March 26, 2019 3:42 PM
To: Kun Jiao 
Cc: petsc-users 
Subject: Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is 
undefined in 3.10.4

Please apply the patch I sent earlier and retry.

Satish

On Tue, 26 Mar 2019, Kun Jiao via petsc-users wrote:



Strange things, when I compile my code in the test dir in PETSC, it works. 
After I "make install" PETSC, and try to compile my code against the installed 
PETSC, it doesn't work any more.

I guess this is what you means.

Is there any way to reenable MatCreateMPIAIJMKL public interface?

And, I am using intel MKL, here is my configure option:

Configure Options: --configModules=PETSc.Configure
--optionsModule=config.compilerOptions PETSC_ARCH=linux-gnu-intel
--with-precision=single --with-cc=mpiicc --with-cxx=mpiicc
--with-fc=mpiifort
--with-mpi-include=/wgdisk/hy3300/source_code_dev/imaging/kjiao/softwa
re/intel/compilers_and_libraries_2019.2.187/linux/mpi/intel64/include
--with-mpi-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/softwar
e/intel//compilers_and_libraries_2019.2.187/linux/mpi/intel64/lib
-lmpifort -lmpi_ilp64"
--with-blaslapack-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/
software/intel/compilers_and_libraries_2019.2.187/linux/mkl/lib/intel6
4 -Wl, --no-as-needed -lmkl_intel_lp64 -lmkl_sequential -lmkl_core
-lpthread -lm -ldl"
--with-scalapack-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/s
oftware/intel/compilers_and_libraries_2019.2.187/linux/mkl/lib/intel64
-lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64"
--with-scalapack-include=/wgdisk/hy3300/source_code_dev/imaging/kjiao/
software/intel/compilers_and_libraries_2019.2.187/linux/mkl/include
--with-mkl_pardiso-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/so
ftware/intel/compilers_and_libraries_2019.2.187/linux/mkl
--with-mkl_sparse=1
--with-mkl_sparse-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/sof
tware/intel/compilers_and_libraries_2019.2.187/linux/mkl
--with-mkl_cpardiso=1
--with-mkl_cpardiso-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/s
oftware/intel/compilers_and_libraries_2019.2.187/linux/mkl
--with-mkl_sparse_optimize=1
--with-mkl_sparse_optimize-dir=/wgdisk/hy3300/source_code_dev/imaging/
kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
--with-mkl_sparse_sp2m=1
--with-mkl_sparse_sp2m-dir=/wgdisk/hy3300/source_code_dev/imaging/kjia
o/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
--with-cmake=1
--prefix=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/petsc_3
.9.4 --known-endian=big --with-debugging=0 --COPTFLAGS=" -Ofast
-xHost" --CXXOPTFLAGS=" -Ofast -xHost" --FOPTFLAGS=" -Ofast -xHost"
--with-x=0 Working directory:
/wgdisk/hy3300/source_code_dev/imaging/kjiao/petsc-3.10.4



Schlumberger-Private

-Original Message-
From: Balay, Satish 
Sent: Tuesday, March 26, 2019 10:19 AM
To: Kun Jiao 
Cc: Mark Adams ; 
petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] [Ext] Re: error: identifier
"MatCreateMPIAIJMKL" is undefined in 3.10.4






balay@sb /home/balay/petsc (maint=)
$ git grep MatCreateMPIAIJMKL maint-3.8
maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:   MatCreateMPIAIJMKL - 
Creates a sparse parallel matrix whose local
maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:PetscErrorCode
MatCreateMPIAIJMKL(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt
M,PetscInt N,PetscInt d_nz,const PetscInt d_nnz[],PetscInt o_nz,const
PetscInt o_nnz[],Mat *A)
maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:.seealso:
MatCreateMPIAIJMKL(), MATSEQAIJMKL, MATMPIAIJMKL
maint-3.8:src/mat/impls/aij/seq/aijmkl/aijmkl.c:.seealso: MatCreate(), 
MatCreateMPIAIJMKL(), MatSetValues() balay@sb /home/balay/petsc (maint=) $ 

Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is undefined in 3.10.4

2019-03-26 Thread Kun Jiao via petsc-users
And yes, by applying the patch in the petscmat.h, everything works.
Thanks for the help.

Regards,
Kun



Schlumberger-Private

-Original Message-
From: Balay, Satish  
Sent: Tuesday, March 26, 2019 3:42 PM
To: Kun Jiao 
Cc: petsc-users 
Subject: Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is 
undefined in 3.10.4

Please apply the patch I sent earlier and retry.

Satish

On Tue, 26 Mar 2019, Kun Jiao via petsc-users wrote:

> Strange things, when I compile my code in the test dir in PETSC, it works. 
> After I "make install" PETSC, and try to compile my code against the 
> installed PETSC, it doesn't work any more.
> 
> I guess this is what you means. 
> 
> Is there any way to reenable MatCreateMPIAIJMKL public interface?
> 
> And, I am using intel MKL, here is my configure option:
> 
> Configure Options: --configModules=PETSc.Configure 
> --optionsModule=config.compilerOptions PETSC_ARCH=linux-gnu-intel 
> --with-precision=single --with-cc=mpiicc --with-cxx=mpiicc 
> --with-fc=mpiifort 
> --with-mpi-include=/wgdisk/hy3300/source_code_dev/imaging/kjiao/softwa
> re/intel/compilers_and_libraries_2019.2.187/linux/mpi/intel64/include 
> --with-mpi-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/softwar
> e/intel//compilers_and_libraries_2019.2.187/linux/mpi/intel64/lib 
> -lmpifort -lmpi_ilp64" 
> --with-blaslapack-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/
> software/intel/compilers_and_libraries_2019.2.187/linux/mkl/lib/intel6
> 4 -Wl, --no-as-needed -lmkl_intel_lp64 -lmkl_sequential -lmkl_core 
> -lpthread -lm -ldl" 
> --with-scalapack-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/s
> oftware/intel/compilers_and_libraries_2019.2.187/linux/mkl/lib/intel64 
> -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64" 
> --with-scalapack-include=/wgdisk/hy3300/source_code_dev/imaging/kjiao/
> software/intel/compilers_and_libraries_2019.2.187/linux/mkl/include 
> --with-mkl_pardiso-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/so
> ftware/intel/compilers_and_libraries_2019.2.187/linux/mkl 
> --with-mkl_sparse=1 
> --with-mkl_sparse-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/sof
> tware/intel/compilers_and_libraries_2019.2.187/linux/mkl 
> --with-mkl_cpardiso=1 
> --with-mkl_cpardiso-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/s
> oftware/intel/compilers_and_libraries_2019.2.187/linux/mkl 
> --with-mkl_sparse_optimize=1 
> --with-mkl_sparse_optimize-dir=/wgdisk/hy3300/source_code_dev/imaging/
> kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl 
> --with-mkl_sparse_sp2m=1 
> --with-mkl_sparse_sp2m-dir=/wgdisk/hy3300/source_code_dev/imaging/kjia
> o/software/intel/compilers_and_libraries_2019.2.187/linux/mkl 
> --with-cmake=1 
> --prefix=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/petsc_3
> .9.4 --known-endian=big --with-debugging=0 --COPTFLAGS=" -Ofast 
> -xHost" --CXXOPTFLAGS=" -Ofast -xHost" --FOPTFLAGS=" -Ofast -xHost" 
> --with-x=0 Working directory: 
> /wgdisk/hy3300/source_code_dev/imaging/kjiao/petsc-3.10.4
> 
> 
> 
> Schlumberger-Private
> 
> -Original Message-
> From: Balay, Satish 
> Sent: Tuesday, March 26, 2019 10:19 AM
> To: Kun Jiao 
> Cc: Mark Adams ; petsc-users@mcs.anl.gov
> Subject: Re: [petsc-users] [Ext] Re: error: identifier 
> "MatCreateMPIAIJMKL" is undefined in 3.10.4
> 
> >>>
> balay@sb /home/balay/petsc (maint=)
> $ git grep MatCreateMPIAIJMKL maint-3.8
> maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:   MatCreateMPIAIJMKL - 
> Creates a sparse parallel matrix whose local
> maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:PetscErrorCode  
> MatCreateMPIAIJMKL(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt 
> M,PetscInt N,PetscInt d_nz,const PetscInt d_nnz[],PetscInt o_nz,const 
> PetscInt o_nnz[],Mat *A)
> maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:.seealso: 
> MatCreateMPIAIJMKL(), MATSEQAIJMKL, MATMPIAIJMKL
> maint-3.8:src/mat/impls/aij/seq/aijmkl/aijmkl.c:.seealso: MatCreate(), 
> MatCreateMPIAIJMKL(), MatSetValues() balay@sb /home/balay/petsc (maint=) $ 
> git grep MatCreateMPIAIJMKL maint
> maint:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:   MatCreateMPIAIJMKL - 
> Creates a sparse parallel matrix whose local
> maint:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:PetscErrorCode  
> MatCreateMPIAIJMKL(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt 
> M,PetscInt N,PetscInt d_nz,const PetscInt d_nnz[],PetscInt o_nz,const 
> PetscInt o_nnz[],Mat *A)
> maint:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:.seealso: 
> MatCreateMPIAIJMKL(), MATSEQAIJMKL, MATMPIAIJMKL
> maint:src/mat/impls/aij/seq/aijmkl/aijmkl.c:.seealso: MatCreate(), 
> MatCreateMPIAIJMKL(), MatSetValues() balay@sb /home/balay/petsc 
> (maint=) $ <<<
> 
> MatCreateMPIAIJMKL() exists in both petsc-3.8 and petsc-3.10. However 
> the public interface is missing from both of these versions. So I'm 
> surprised you don't get the same error with petsc-3.8
> 
> Can you try the following change?
> 
> diff --git a/include/petscmat.h 

Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is undefined in 3.10.4

2019-03-26 Thread Kun Jiao via petsc-users
One strange thing I just found out.

Compiling a *.c file makes it work.
mpiicc -o ex5.o -c -fPIC -wd1572 -Ofast -xHost -I 
/NFS/home/home3/kjiao/software/petsc_3.10.4/include/ 
-I/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl/include
 
-I/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mpi/intel64/include
 lsqr.c

Compiling a *.cpp file does not work, even though the .cpp file is exactly the same as the .c file.
mpiicc -o ex5.o -c -Ofast -xHost -I 
/NFS/home/home3/kjiao/software/petsc_3.10.4/include/ 
-I/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl/include
 
-I/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mpi/intel64/include
 lsqr.cpp
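
(A plausible reason for the difference: a C compiler accepts a call to an undeclared function as an implicit declaration, whereas C++ requires a prototype, so only the .cpp build trips over the missing declaration in petscmat.h. Until the header patch is applied, one untested stopgap would be to repeat the prototype from Satish's patch locally in the .cpp file:)

#include <petscmat.h>
/* Stopgap only, until include/petscmat.h carries this prototype itself; the
   routine is present in the library when PETSc was configured with Intel MKL. */
PETSC_EXTERN PetscErrorCode MatCreateMPIAIJMKL(MPI_Comm,PetscInt,PetscInt,PetscInt,PetscInt,PetscInt,const PetscInt[],PetscInt,const PetscInt[],Mat*);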

From: Mark Adams 
Sent: Tuesday, March 26, 2019 3:38 PM
To: Kun Jiao 
Cc: petsc-users 
Subject: Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is 
undefined in 3.10.4



On Tue, Mar 26, 2019 at 3:00 PM Kun Jiao <kj...@slb.com> wrote:
Strange things, when I compile my code in the test dir in PETSC, it works. 
After I "make install" PETSC, and try to compile my code against the installed 
PETSC, it doesn't work any more.

I'm not sure I follow what you are doing exactly but look at the compile lines 
(good and bad) and compare them. If one works and one does not then they must 
be different.

Anyway, as Satish said this interface was not enabled in any version that we 
see. (So we are puzzled that any version works.) You can wait for a fix to get 
pushed but using the method that I showed you should work now.


I guess this is what you means.

Is there any way to reenable MatCreateMPIAIJMKL public interface?

And, I am using intel MKL, here is my configure option:

Configure Options: --configModules=PETSc.Configure 
--optionsModule=config.compilerOptions PETSC_ARCH=linux-gnu-intel 
--with-precision=single --with-cc=mpiicc --with-cxx=mpiicc --with-fc=mpiifort 
--with-mpi-include=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mpi/intel64/include
 
--with-mpi-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel//compilers_and_libraries_2019.2.187/linux/mpi/intel64/lib
 -lmpifort -lmpi_ilp64" 
--with-blaslapack-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl/lib/intel64
 -Wl, --no-as-needed -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm 
-ldl" 
--with-scalapack-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl/lib/intel64
 -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64" 
--with-scalapack-include=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl/include
 
--with-mkl_pardiso-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
 --with-mkl_sparse=1 
--with-mkl_sparse-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
 --with-mkl_cpardiso=1 
--with-mkl_cpardiso-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
 --with-mkl_sparse_optimize=1 
--with-mkl_sparse_optimize-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
 --with-mkl_sparse_sp2m=1 
--with-mkl_sparse_sp2m-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
 --with-cmake=1 
--prefix=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/petsc_3.9.4 
--known-endian=big --with-debugging=0 --COPTFLAGS=" -Ofast -xHost" 
--CXXOPTFLAGS=" -Ofast -xHost" --FOPTFLAGS=" -Ofast -xHost" --with-x=0
Working directory: /wgdisk/hy3300/source_code_dev/imaging/kjiao/petsc-3.10.4



Schlumberger-Private

-Original Message-
From: Balay, Satish <ba...@mcs.anl.gov>
Sent: Tuesday, March 26, 2019 10:19 AM
To: Kun Jiao <kj...@slb.com>
Cc: Mark Adams <mfad...@lbl.gov>; petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is 
undefined in 3.10.4

>>>
balay@sb /home/balay/petsc (maint=)
$ git grep MatCreateMPIAIJMKL maint-3.8
maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:   MatCreateMPIAIJMKL - 
Creates a sparse parallel matrix whose local
maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:PetscErrorCode  
MatCreateMPIAIJMKL(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt M,PetscInt 
N,PetscInt d_nz,const PetscInt d_nnz[],PetscInt o_nz,const PetscInt o_nnz[],Mat 
*A)
maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:.seealso: 
MatCreateMPIAIJMKL(), MATSEQAIJMKL, MATMPIAIJMKL
maint-3.8:src/mat/impls/aij/seq/aijmkl/aijmkl.c:.seealso: MatCreate(), 
MatCreateMPIAIJMKL(), 

Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is undefined in 3.10.4

2019-03-26 Thread Balay, Satish via petsc-users
Please apply the patch I sent earlier and retry.

Satish

On Tue, 26 Mar 2019, Kun Jiao via petsc-users wrote:

> Strange things, when I compile my code in the test dir in PETSC, it works. 
> After I "make install" PETSC, and try to compile my code against the 
> installed PETSC, it doesn't work any more.
> 
> I guess this is what you means. 
> 
> Is there any way to reenable MatCreateMPIAIJMKL public interface?
> 
> And, I am using intel MKL, here is my configure option:
> 
> Configure Options: --configModules=PETSc.Configure 
> --optionsModule=config.compilerOptions PETSC_ARCH=linux-gnu-intel 
> --with-precision=single --with-cc=mpiicc --with-cxx=mpiicc --with-fc=mpiifort 
> --with-mpi-include=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mpi/intel64/include
>  
> --with-mpi-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel//compilers_and_libraries_2019.2.187/linux/mpi/intel64/lib
>  -lmpifort -lmpi_ilp64" 
> --with-blaslapack-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl/lib/intel64
>  -Wl, --no-as-needed -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread 
> -lm -ldl" 
> --with-scalapack-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl/lib/intel64
>  -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64" 
> --with-scalapack-include=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl/include
>  
> --with-mkl_pardiso-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
>  --with-mkl_sparse=1 
> --with-mkl_sparse-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
>  --with-mkl_cpardiso=1 
> --with-mkl_cpardiso-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
>  --with-mkl_sparse_optimize=1 
> --with-mkl_sparse_optimize-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
>  --with-mkl_sparse_sp2m=1 
> --with-mkl_sparse_sp2m-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
>  --with-cmake=1 
> --prefix=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/petsc_3.9.4 
> --known-endian=big --with-debugging=0 --COPTFLAGS=" -Ofast -xHost" 
> --CXXOPTFLAGS=" -Ofast -xHost" --FOPTFLAGS=" -Ofast -xHost" --with-x=0
> Working directory: /wgdisk/hy3300/source_code_dev/imaging/kjiao/petsc-3.10.4
> 
> 
> 
> Schlumberger-Private
> 
> -Original Message-
> From: Balay, Satish  
> Sent: Tuesday, March 26, 2019 10:19 AM
> To: Kun Jiao 
> Cc: Mark Adams ; petsc-users@mcs.anl.gov
> Subject: Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" 
> is undefined in 3.10.4
> 
> >>>
> balay@sb /home/balay/petsc (maint=)
> $ git grep MatCreateMPIAIJMKL maint-3.8
> maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:   MatCreateMPIAIJMKL - 
> Creates a sparse parallel matrix whose local
> maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:PetscErrorCode  
> MatCreateMPIAIJMKL(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt M,PetscInt 
> N,PetscInt d_nz,const PetscInt d_nnz[],PetscInt o_nz,const PetscInt 
> o_nnz[],Mat *A)
> maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:.seealso: 
> MatCreateMPIAIJMKL(), MATSEQAIJMKL, MATMPIAIJMKL
> maint-3.8:src/mat/impls/aij/seq/aijmkl/aijmkl.c:.seealso: MatCreate(), 
> MatCreateMPIAIJMKL(), MatSetValues() balay@sb /home/balay/petsc (maint=) $ 
> git grep MatCreateMPIAIJMKL maint
> maint:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:   MatCreateMPIAIJMKL - 
> Creates a sparse parallel matrix whose local
> maint:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:PetscErrorCode  
> MatCreateMPIAIJMKL(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt M,PetscInt 
> N,PetscInt d_nz,const PetscInt d_nnz[],PetscInt o_nz,const PetscInt 
> o_nnz[],Mat *A)
> maint:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:.seealso: 
> MatCreateMPIAIJMKL(), MATSEQAIJMKL, MATMPIAIJMKL
> maint:src/mat/impls/aij/seq/aijmkl/aijmkl.c:.seealso: MatCreate(), 
> MatCreateMPIAIJMKL(), MatSetValues() balay@sb /home/balay/petsc (maint=) $ 
> <<<
> 
> MatCreateMPIAIJMKL() exists in both petsc-3.8 and petsc-3.10. However the 
> public interface is missing from both of these versions. So I'm surprised you 
> don't get the same error with petsc-3.8
> 
> Can you try the following change?
> 
> diff --git a/include/petscmat.h b/include/petscmat.h index 
> 1b8ac69377..c66f727994 100644
> --- a/include/petscmat.h
> +++ b/include/petscmat.h
> @@ -223,7 +223,8 @@ typedef enum 
> {DIFFERENT_NONZERO_PATTERN,SUBSET_NONZERO_PATTERN,SAME_NONZERO_PATT
>  
>  #if defined PETSC_HAVE_MKL_SPARSE
>  PETSC_EXTERN PetscErrorCode 
> 

Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is undefined in 3.10.4

2019-03-26 Thread Mark Adams via petsc-users
On Tue, Mar 26, 2019 at 3:00 PM Kun Jiao  wrote:

> Strange things, when I compile my code in the test dir in PETSC, it works.
> After I "make install" PETSC, and try to compile my code against the
> installed PETSC, it doesn't work any more.
>

I'm not sure I follow what you are doing exactly but look at the compile
lines (good and bad) and compare them. If one works and one does not then
they must be different.

Anyway, as Satish said this interface was not enabled in any version that
we see. (So we are puzzled that any version works.) You can wait for a fix
to get pushed but using the method that I showed you should work now.


>
> I guess this is what you means.
>
> Is there any way to reenable MatCreateMPIAIJMKL public interface?
>
> And, I am using intel MKL, here is my configure option:
>
> Configure Options: --configModules=PETSc.Configure
> --optionsModule=config.compilerOptions PETSC_ARCH=linux-gnu-intel
> --with-precision=single --with-cc=mpiicc --with-cxx=mpiicc
> --with-fc=mpiifort
> --with-mpi-include=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mpi/intel64/include
> --with-mpi-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel//compilers_and_libraries_2019.2.187/linux/mpi/intel64/lib
> -lmpifort -lmpi_ilp64"
> --with-blaslapack-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl/lib/intel64
> -Wl, --no-as-needed -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread
> -lm -ldl"
> --with-scalapack-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl/lib/intel64
> -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64"
> --with-scalapack-include=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl/include
> --with-mkl_pardiso-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
> --with-mkl_sparse=1
> --with-mkl_sparse-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
> --with-mkl_cpardiso=1
> --with-mkl_cpardiso-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
> --with-mkl_sparse_optimize=1
> --with-mkl_sparse_optimize-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
> --with-mkl_sparse_sp2m=1
> --with-mkl_sparse_sp2m-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
> --with-cmake=1
> --prefix=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/petsc_3.9.4
> --known-endian=big --with-debugging=0 --COPTFLAGS=" -Ofast -xHost"
> --CXXOPTFLAGS=" -Ofast -xHost" --FOPTFLAGS=" -Ofast -xHost" --with-x=0
> Working directory:
> /wgdisk/hy3300/source_code_dev/imaging/kjiao/petsc-3.10.4
>
>
>
> Schlumberger-Private
>
> -Original Message-
> From: Balay, Satish 
> Sent: Tuesday, March 26, 2019 10:19 AM
> To: Kun Jiao 
> Cc: Mark Adams ; petsc-users@mcs.anl.gov
> Subject: Re: [petsc-users] [Ext] Re: error: identifier
> "MatCreateMPIAIJMKL" is undefined in 3.10.4
>
> >>>
> balay@sb /home/balay/petsc (maint=)
> $ git grep MatCreateMPIAIJMKL maint-3.8
> maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:   MatCreateMPIAIJMKL -
> Creates a sparse parallel matrix whose local
> maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:PetscErrorCode
> MatCreateMPIAIJMKL(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt M,PetscInt
> N,PetscInt d_nz,const PetscInt d_nnz[],PetscInt o_nz,const PetscInt
> o_nnz[],Mat *A)
> maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:.seealso:
> MatCreateMPIAIJMKL(), MATSEQAIJMKL, MATMPIAIJMKL
> maint-3.8:src/mat/impls/aij/seq/aijmkl/aijmkl.c:.seealso: MatCreate(),
> MatCreateMPIAIJMKL(), MatSetValues() balay@sb /home/balay/petsc (maint=)
> $ git grep MatCreateMPIAIJMKL maint
> maint:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:   MatCreateMPIAIJMKL -
> Creates a sparse parallel matrix whose local
> maint:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:PetscErrorCode
> MatCreateMPIAIJMKL(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt M,PetscInt
> N,PetscInt d_nz,const PetscInt d_nnz[],PetscInt o_nz,const PetscInt
> o_nnz[],Mat *A)
> maint:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:.seealso:
> MatCreateMPIAIJMKL(), MATSEQAIJMKL, MATMPIAIJMKL
> maint:src/mat/impls/aij/seq/aijmkl/aijmkl.c:.seealso: MatCreate(),
> MatCreateMPIAIJMKL(), MatSetValues() balay@sb /home/balay/petsc (maint=)
> $ <<<
>
> MatCreateMPIAIJMKL() exists in both petsc-3.8 and petsc-3.10. However the
> public interface is missing from both of these versions. So I'm surprised
> you don't get the same error with petsc-3.8
>
> Can you try the following change?
>
> diff --git a/include/petscmat.h b/include/petscmat.h index
> 1b8ac69377..c66f727994 100644
> --- a/include/petscmat.h
> +++ 

Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is undefined in 3.10.4

2019-03-26 Thread Kun Jiao via petsc-users
Strange thing: when I compile my code in the test dir in PETSc, it works.
After I "make install" PETSc and try to compile my code against the installed
PETSc, it doesn't work any more.

I guess this is what you mean.

Is there any way to re-enable the MatCreateMPIAIJMKL public interface?

And I am using Intel MKL; here are my configure options:

Configure Options: --configModules=PETSc.Configure 
--optionsModule=config.compilerOptions PETSC_ARCH=linux-gnu-intel 
--with-precision=single --with-cc=mpiicc --with-cxx=mpiicc --with-fc=mpiifort 
--with-mpi-include=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mpi/intel64/include
 
--with-mpi-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel//compilers_and_libraries_2019.2.187/linux/mpi/intel64/lib
 -lmpifort -lmpi_ilp64" 
--with-blaslapack-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl/lib/intel64
 -Wl, --no-as-needed -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm 
-ldl" 
--with-scalapack-lib="-L/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl/lib/intel64
 -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64" 
--with-scalapack-include=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl/include
 
--with-mkl_pardiso-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
 --with-mkl_sparse=1 
--with-mkl_sparse-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
 --with-mkl_cpardiso=1 
--with-mkl_cpardiso-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
 --with-mkl_sparse_optimize=1 
--with-mkl_sparse_optimize-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
 --with-mkl_sparse_sp2m=1 
--with-mkl_sparse_sp2m-dir=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/intel/compilers_and_libraries_2019.2.187/linux/mkl
 --with-cmake=1 
--prefix=/wgdisk/hy3300/source_code_dev/imaging/kjiao/software/petsc_3.9.4 
--known-endian=big --with-debugging=0 --COPTFLAGS=" -Ofast -xHost" 
--CXXOPTFLAGS=" -Ofast -xHost" --FOPTFLAGS=" -Ofast -xHost" --with-x=0
Working directory: /wgdisk/hy3300/source_code_dev/imaging/kjiao/petsc-3.10.4



Schlumberger-Private

-Original Message-
From: Balay, Satish  
Sent: Tuesday, March 26, 2019 10:19 AM
To: Kun Jiao 
Cc: Mark Adams ; petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is 
undefined in 3.10.4

>>>
balay@sb /home/balay/petsc (maint=)
$ git grep MatCreateMPIAIJMKL maint-3.8
maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:   MatCreateMPIAIJMKL - 
Creates a sparse parallel matrix whose local
maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:PetscErrorCode  
MatCreateMPIAIJMKL(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt M,PetscInt 
N,PetscInt d_nz,const PetscInt d_nnz[],PetscInt o_nz,const PetscInt o_nnz[],Mat 
*A)
maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:.seealso: 
MatCreateMPIAIJMKL(), MATSEQAIJMKL, MATMPIAIJMKL
maint-3.8:src/mat/impls/aij/seq/aijmkl/aijmkl.c:.seealso: MatCreate(), 
MatCreateMPIAIJMKL(), MatSetValues() balay@sb /home/balay/petsc (maint=) $ git 
grep MatCreateMPIAIJMKL maint
maint:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:   MatCreateMPIAIJMKL - Creates 
a sparse parallel matrix whose local
maint:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:PetscErrorCode  
MatCreateMPIAIJMKL(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt M,PetscInt 
N,PetscInt d_nz,const PetscInt d_nnz[],PetscInt o_nz,const PetscInt o_nnz[],Mat 
*A)
maint:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:.seealso: MatCreateMPIAIJMKL(), 
MATSEQAIJMKL, MATMPIAIJMKL
maint:src/mat/impls/aij/seq/aijmkl/aijmkl.c:.seealso: MatCreate(), 
MatCreateMPIAIJMKL(), MatSetValues() balay@sb /home/balay/petsc (maint=) $ 
<<<

MatCreateMPIAIJMKL() exists in both petsc-3.8 and petsc-3.10. However the 
public interface is missing from both of these versions. So I'm surprised you 
don't get the same error with petsc-3.8

Can you try the following change?

diff --git a/include/petscmat.h b/include/petscmat.h index 
1b8ac69377..c66f727994 100644
--- a/include/petscmat.h
+++ b/include/petscmat.h
@@ -223,7 +223,8 @@ typedef enum 
{DIFFERENT_NONZERO_PATTERN,SUBSET_NONZERO_PATTERN,SAME_NONZERO_PATT
 
 #if defined PETSC_HAVE_MKL_SPARSE
 PETSC_EXTERN PetscErrorCode 
MatCreateBAIJMKL(MPI_Comm,PetscInt,PetscInt,PetscInt,PetscInt,PetscInt,PetscInt,const
 PetscInt[],PetscInt,const PetscInt[],Mat*); -PETSC_EXTERN PetscErrorCode 
MatCreateSeqBAIJMKL(MPI_Comm comm,PetscInt bs,PetscInt m,PetscInt n,PetscInt 
nz,const PetscInt nnz[],Mat *A);
+PETSC_EXTERN PetscErrorCode 
+MatCreateSeqBAIJMKL(MPI_Comm,PetscInt,PetscInt,PetscInt,PetscInt,const 

Re: [petsc-users] Confusing Schur preconditioner behaviour

2019-03-26 Thread Cotter, Colin J via petsc-users
Hi Dave,

  Thanks for the tip - you were right, and this works better for higher 
resolutions now.


all the best

--Colin


From: Dave May 
Sent: 19 March 2019 11:25:11
To: Cotter, Colin J
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Confusing Schur preconditioner behaviour


Hi Colin,

On Tue, 19 Mar 2019 at 09:33, Cotter, Colin J <colin.cot...@imperial.ac.uk> wrote:

Hi Dave,

>If you are doing that, then you need to tell fieldsplit to use the Amat to 
>define the splits otherwise it will define the Schur compliment as
>S = B22 - B21 inv(B11) B12
>preconditiones with B22, where as what you want is
>S = A22 - A21 inv(A11) A12
>preconditioned with B22.

>If your operators are set up this way and you didn't indicate to use Amat to 
>define S this would definitely explain why preonly works but iterating on 
>Schur does not.

Yes, thanks - this solves it! I need pc_use_amat.

Okay great. But doesn't that option eradicate your custom Schur complement 
object which you inserted into the Bmat in the (2,2) slot?

I thought you would use the option
-pc_fieldsplit_diag_use_amat

In general for fieldsplit (Schur) I found that the best way to manage user 
defined Schur complement preconditioners is via PCFieldSplitSetSchurPre().

https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetSchurPre.html#PCFieldSplitSetSchurPre

Also, for solver debugging purposes with fieldsplit and MatNest, I find it 
incredibly useful to attach textual names to all the matrices going to into 
FieldSplit. You can use PetscObjectSetName() with each of your sub-matrices in 
the Amat and the Bmat, and any schur complement operators. The textual names 
will be displayed in KSP view. In that way you have a better chance of 
understanding which operators are being used where. (Note that this trick is 
less useful when the Amat and Bmat are AIJ matrices).
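
As a rough sketch of both suggestions (ksp, the sub-matrices Auu/Aup/Apu/App and the user-built Schur preconditioner matrix Sp are all assumed to come from your own setup):

PC pc;

/* Name the operators so KSPView reports which matrix is used where. */
ierr = PetscObjectSetName((PetscObject)Auu,"Auu");CHKERRQ(ierr);
ierr = PetscObjectSetName((PetscObject)Aup,"Aup");CHKERRQ(ierr);
ierr = PetscObjectSetName((PetscObject)Apu,"Apu");CHKERRQ(ierr);
ierr = PetscObjectSetName((PetscObject)App,"App");CHKERRQ(ierr);
ierr = PetscObjectSetName((PetscObject)Sp,"S*");CHKERRQ(ierr);

/* Hand the user-defined Schur complement preconditioner to fieldsplit,
   rather than burying it in the (2,2) slot of the Bmat.               */
ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
ierr = PCFieldSplitSetSchurPre(pc,PC_FIELDSPLIT_SCHUR_PRE_USER,Sp);CHKERRQ(ierr);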

Below is an example KSPView associated with 2x2 block system where I've 
attached the names Auu, Aup, Apu, App, and S* to the Amat sub-matrices and the 
Schur complement preconditioner.


PC Object:(dcy_) 1 MPI processes
  type: fieldsplit
FieldSplit with Schur preconditioner, factorization FULL
Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse
Split info:
Split number 0 Defined by IS
Split number 1 Defined by IS
KSP solver for A00 block
  KSP Object:  (dcy_fieldsplit_u_)   1 MPI processes
type: preonly
maximum iterations=1, initial guess is zero
tolerances:  relative=1e-05, absolute=1e-50, divergence=1.
left preconditioning
using NONE norm type for convergence test
  PC Object:  (dcy_fieldsplit_u_)   1 MPI processes
type: lu
  LU: out-of-place factorization
  tolerance for zero pivot 2.22045e-14
  matrix ordering: nd
  factor fill ratio given 0., needed 0.
Factored matrix follows:
  Mat Object:   1 MPI processes
type: seqaij
rows=85728, cols=85728
package used to perform factorization: umfpack
total: nonzeros=0, allocated nonzeros=0
total number of mallocs used during MatSetValues calls =0
  not using I-node routines
  UMFPACK run parameters:
Control[UMFPACK_PRL]: 1.
Control[UMFPACK_STRATEGY]: 0.
Control[UMFPACK_DENSE_COL]: 0.2
Control[UMFPACK_DENSE_ROW]: 0.2
Control[UMFPACK_AMD_DENSE]: 10.
Control[UMFPACK_BLOCK_SIZE]: 32.
Control[UMFPACK_FIXQ]: 0.
Control[UMFPACK_AGGRESSIVE]: 1.
Control[UMFPACK_PIVOT_TOLERANCE]: 0.1
Control[UMFPACK_SYM_PIVOT_TOLERANCE]: 0.001
Control[UMFPACK_SCALE]: 1.
Control[UMFPACK_ALLOC_INIT]: 0.7
Control[UMFPACK_DROPTOL]: 0.
Control[UMFPACK_IRSTEP]: 0.
Control[UMFPACK_ORDERING]: AMD (not using the PETSc ordering)
linear system matrix = precond matrix:
Mat Object:Auu(dcy_fieldsplit_u_) 1 MPI processes
  type: seqaij
  rows=85728, cols=85728
  total: nonzeros=1028736, allocated nonzeros=1028736
  total number of mallocs used during MatSetValues calls =0
using I-node routines: found 21432 nodes, limit used is 5
KSP solver for S = A11 - A10 inv(A00) A01
  KSP Object:  (dcy_fieldsplit_p_)   1 MPI processes
type: fgmres
  GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
  GMRES: happy breakdown 

Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is undefined in 3.10.4

2019-03-26 Thread Mark Adams via petsc-users
So this works with v3.8? I don't see any differences (I see Satish figured
this out and has suggestions).

You could also work around it with code like this:

 ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
 ierr = MatSetType(A,MATAIJMKL);CHKERRQ(ierr);
 ierr = MatMPIAIJSetPreallocation(A,0,ourlens,0,offlens);CHKERRQ(ierr);
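
(The snippet above assumes the matrix sizes are also set before the preallocation call; with the same m, n, M, N that were being passed to MatCreateMPIAIJMKL, the full sequence would look something like:)

 ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
 ierr = MatSetSizes(A,m,n,M,N);CHKERRQ(ierr);              /* same sizes as the old MatCreateMPIAIJMKL() call */
 ierr = MatSetType(A,MATAIJMKL);CHKERRQ(ierr);             /* AIJ with MKL sparse kernels                     */
 ierr = MatMPIAIJSetPreallocation(A,0,ourlens,0,offlens);CHKERRQ(ierr);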

On Tue, Mar 26, 2019 at 11:00 AM Kun Jiao  wrote:

> [kjiao@hyi0016 src/lsqr]% make
>
> [ 50%] Building CXX object lsqr/CMakeFiles/p_lsqr.dir/lsqr.cc.o
>
> /wgdisk/hy3300/source_code_dev/imaging/kjiao/src/git/src/lsqr/lsqr.cc(318):
> error: identifier "MatCreateMPIAIJMKL" is undefined
>
> ierr =
> MatCreateMPIAIJMKL(comm,m,n,M,N,maxnz,dialens,maxnz,offlens,);CHKERRQ(ierr);
>
>^
>
>
>
> /wgdisk/hy3300/source_code_dev/imaging/kjiao/src/git/src/lsqr/lsqr.cc(578):
> error: identifier "MatCreateMPIAIJMKL" is undefined
>
> ierr =
> MatCreateMPIAIJMKL(comm,m,n,M,N,maxnz,dialens,maxnz,offlens,);CHKERRQ(ierr);
>
>^
>
>
>
> compilation aborted for
> /wgdisk/hy3300/source_code_dev/imaging/kjiao/src/git/src/lsqr/lsqr.cc (code
> 2)
>
>
>
> Thanks.
>
>
>
>
>
> *From:* Mark Adams 
> *Sent:* Tuesday, March 26, 2019 9:22 AM
> *To:* Kun Jiao 
> *Cc:* petsc-users@mcs.anl.gov
> *Subject:* Re: [Ext] Re: [petsc-users] error: identifier
> "MatCreateMPIAIJMKL" is undefined in 3.10.4
>
>
>
> I assume the whole error message will have the line of code. Please send
> the whole error message and line of offending code if not included.
>
>
>
> On Tue, Mar 26, 2019 at 10:08 AM Kun Jiao  wrote:
>
> It is compiling error, error message is:
>
>
>
> error: identifier "MatCreateMPIAIJMKL" is undefined.
>
>
>
>
>
>
>
>
>
>
>
> *From:* Mark Adams 
> *Sent:* Tuesday, March 26, 2019 6:48 AM
> *To:* Kun Jiao 
> *Cc:* petsc-users@mcs.anl.gov
> *Subject:* [Ext] Re: [petsc-users] error: identifier "MatCreateMPIAIJMKL"
> is undefined in 3.10.4
>
>
>
> Please send the output of the error (runtime, compile time, link time?)
>
>
>
> On Mon, Mar 25, 2019 at 10:50 PM Kun Jiao via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
>
> Hi Petsc Experts,
>
>
>
> Is MatCreateMPIAIJMKL retired in 3.10.4?
>
>
>
> I got this error with my code which works fine in 3.8.3 version.
>
>
>
> Regards,
>
> Kun
>
>
>
>
>
> Schlumberger-Private
>
>
>
> Schlumberger-Private
>
>
>
> Schlumberger-Private
>
>


Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is undefined in 3.10.4

2019-03-26 Thread Balay, Satish via petsc-users
>>>
balay@sb /home/balay/petsc (maint=)
$ git grep MatCreateMPIAIJMKL maint-3.8
maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:   MatCreateMPIAIJMKL - 
Creates a sparse parallel matrix whose local
maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:PetscErrorCode  
MatCreateMPIAIJMKL(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt M,PetscInt 
N,PetscInt d_nz,const PetscInt d_nnz[],PetscInt o_nz,const PetscInt o_nnz[],Mat 
*A)
maint-3.8:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:.seealso: 
MatCreateMPIAIJMKL(), MATSEQAIJMKL, MATMPIAIJMKL
maint-3.8:src/mat/impls/aij/seq/aijmkl/aijmkl.c:.seealso: MatCreate(), 
MatCreateMPIAIJMKL(), MatSetValues()
balay@sb /home/balay/petsc (maint=)
$ git grep MatCreateMPIAIJMKL maint
maint:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:   MatCreateMPIAIJMKL - Creates 
a sparse parallel matrix whose local
maint:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:PetscErrorCode  
MatCreateMPIAIJMKL(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt M,PetscInt 
N,PetscInt d_nz,const PetscInt d_nnz[],PetscInt o_nz,const PetscInt o_nnz[],Mat 
*A)
maint:src/mat/impls/aij/mpi/aijmkl/mpiaijmkl.c:.seealso: MatCreateMPIAIJMKL(), 
MATSEQAIJMKL, MATMPIAIJMKL
maint:src/mat/impls/aij/seq/aijmkl/aijmkl.c:.seealso: MatCreate(), 
MatCreateMPIAIJMKL(), MatSetValues()
balay@sb /home/balay/petsc (maint=)
$ 
<<<

MatCreateMPIAIJMKL() exists in both petsc-3.8 and petsc-3.10. However
the public interface is missing from both of these versions. So I'm
surprised you don't get the same error with petsc-3.8

Can you try the following change?

diff --git a/include/petscmat.h b/include/petscmat.h
index 1b8ac69377..c66f727994 100644
--- a/include/petscmat.h
+++ b/include/petscmat.h
@@ -223,7 +223,8 @@ typedef enum 
{DIFFERENT_NONZERO_PATTERN,SUBSET_NONZERO_PATTERN,SAME_NONZERO_PATT
 
 #if defined PETSC_HAVE_MKL_SPARSE
 PETSC_EXTERN PetscErrorCode 
MatCreateBAIJMKL(MPI_Comm,PetscInt,PetscInt,PetscInt,PetscInt,PetscInt,PetscInt,const
 PetscInt[],PetscInt,const PetscInt[],Mat*);
-PETSC_EXTERN PetscErrorCode MatCreateSeqBAIJMKL(MPI_Comm comm,PetscInt 
bs,PetscInt m,PetscInt n,PetscInt nz,const PetscInt nnz[],Mat *A);
+PETSC_EXTERN PetscErrorCode 
MatCreateSeqBAIJMKL(MPI_Comm,PetscInt,PetscInt,PetscInt,PetscInt,const 
PetscInt[],Mat*);
+PETSC_EXTERN PetscErrorCode  
MatCreateMPIAIJMKL(MPI_Comm,PetscInt,PetscInt,PetscInt,PetscInt,PetscInt,const 
PetscInt[],PetscInt,const PetscInt[],Mat*);
 #endif
 
 PETSC_EXTERN PetscErrorCode 
MatCreateSeqSELL(MPI_Comm,PetscInt,PetscInt,PetscInt,const PetscInt[],Mat*);


Also note: this routine is available only when PETSc is built with Intel MKL.
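
(For user code that means any direct call needs the same guard; a sketch, reusing the argument names from the lsqr.cc error above and assuming the output matrix variable is A:)

#if defined(PETSC_HAVE_MKL_SPARSE)
  ierr = MatCreateMPIAIJMKL(comm,m,n,M,N,maxnz,dialens,maxnz,offlens,&A);CHKERRQ(ierr);
#else
  ierr = MatCreateMPIAIJ(comm,m,n,M,N,maxnz,dialens,maxnz,offlens,&A);CHKERRQ(ierr);  /* plain MPIAIJ fallback */
#endif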

Satish

On Tue, 26 Mar 2019, Kun Jiao via petsc-users wrote:

> [kjiao@hyi0016 src/lsqr]% make
> [ 50%] Building CXX object lsqr/CMakeFiles/p_lsqr.dir/lsqr.cc.o
> /wgdisk/hy3300/source_code_dev/imaging/kjiao/src/git/src/lsqr/lsqr.cc(318): 
> error: identifier "MatCreateMPIAIJMKL" is undefined
> ierr = 
> MatCreateMPIAIJMKL(comm,m,n,M,N,maxnz,dialens,maxnz,offlens,);CHKERRQ(ierr);
>^
> 
> /wgdisk/hy3300/source_code_dev/imaging/kjiao/src/git/src/lsqr/lsqr.cc(578): 
> error: identifier "MatCreateMPIAIJMKL" is undefined
> ierr = 
> MatCreateMPIAIJMKL(comm,m,n,M,N,maxnz,dialens,maxnz,offlens,);CHKERRQ(ierr);
>^
> 
> compilation aborted for 
> /wgdisk/hy3300/source_code_dev/imaging/kjiao/src/git/src/lsqr/lsqr.cc (code 2)
> 
> Thanks.
> 
> 
> From: Mark Adams 
> Sent: Tuesday, March 26, 2019 9:22 AM
> To: Kun Jiao 
> Cc: petsc-users@mcs.anl.gov
> Subject: Re: [Ext] Re: [petsc-users] error: identifier "MatCreateMPIAIJMKL" 
> is undefined in 3.10.4
> 
> I assume the whole error message will have the line of code. Please send the 
> whole error message and line of offending code if not included.
> 
> On Tue, Mar 26, 2019 at 10:08 AM Kun Jiao 
> mailto:kj...@slb.com>> wrote:
> It is compiling error, error message is:
> 
> error: identifier "MatCreateMPIAIJMKL" is undefined.
> 
> 
> 
> 
> 
> From: Mark Adams mailto:mfad...@lbl.gov>>
> Sent: Tuesday, March 26, 2019 6:48 AM
> To: Kun Jiao mailto:kj...@slb.com>>
> Cc: petsc-users@mcs.anl.gov
> Subject: [Ext] Re: [petsc-users] error: identifier "MatCreateMPIAIJMKL" is 
> undefined in 3.10.4
> 
> Please send the output of the error (runtime, compile time, link time?)
> 
> On Mon, Mar 25, 2019 at 10:50 PM Kun Jiao via petsc-users 
> mailto:petsc-users@mcs.anl.gov>> wrote:
> Hi Petsc Experts,
> 
> Is MatCreateMPIAIJMKL retired in 3.10.4?
> 
> I got this error with my code which works fine in 3.8.3 version.
> 
> Regards,
> Kun
> 
> 
> 
> Schlumberger-Private
> 
> 
> Schlumberger-Private
> 
> 
> Schlumberger-Private
> 



Re: [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-26 Thread Myriam Peyrounette via petsc-users
*SetFromOptions() was not called indeed... Thanks! The code performance
is better now with regard to memory usage!

I still have to plot the memory scaling on bigger cases to see if it has
the same good behaviour as when using the 3.6 version.

I'll let you know as soon as I have plotted it.

Thanks again

Myriam


Le 03/26/19 à 14:30, Matthew Knepley a écrit :
> On Tue, Mar 26, 2019 at 9:27 AM Myriam Peyrounette
> mailto:myriam.peyroune...@idris.fr>> wrote:
>
> I checked with -ksp_view (attached) but no prefix is associated
> with the matrix. Some are associated to the KSP and PC, but none
> to the Mat
>
> Another thing that could prevent options being used is that
> *SetFromOptions() is not called for the object.
>
>   Thanks,
>
>      Matt
>  
>
> Le 03/26/19 à 11:55, Dave May a écrit :
>>
>>
>> On Tue, 26 Mar 2019 at 10:36, Myriam Peyrounette
>> > > wrote:
>>
>> Oh you were right, the three options are unsused
>> (-matptap_via scalable, -inner_offdiag_matmatmult_via
>> scalable and -inner_diag_matmatmult_via scalable). Does this
>> mean I am not using the associated PtAP functions?
>>
>>
>> No - not necessarily. All it means is the options were not parsed. 
>>
>> If your matrices have an option prefix associated with them (e.g.
>> abc) , then you need to provide the option as
>>   -abc_matptap_via scalable
>>
>> If you are not sure if you matrices have a prefix, look at the
>> result of -ksp_view (see below for an example)
>>
>>   Mat Object: 2 MPI processes
>>
>>     type: mpiaij
>>
>>     rows=363, cols=363, bs=3
>>
>>     total: nonzeros=8649, allocated nonzeros=8649
>>
>>     total number of mallocs used during MatSetValues calls =0
>>
>>   Mat Object: (B_) 2 MPI processes
>>
>>     type: mpiaij
>>
>>     rows=363, cols=363, bs=3
>>
>>     total: nonzeros=8649, allocated nonzeros=8649
>>
>>     total number of mallocs used during MatSetValues calls =0
>>
>>
>> The first matrix has no options prefix, but the second does and
>> it's called "B_".
>>
>>
>>
>>  
>>
>> Myriam
>>
>>
>> Le 03/26/19 à 11:10, Dave May a écrit :
>>>
>>> On Tue, 26 Mar 2019 at 09:52, Myriam Peyrounette via
>>> petsc-users >> > wrote:
>>>
>>> How can I be sure they are indeed used? Can I print this
>>> information in some log file?
>>>
>>> Yes. Re-run the job with the command line option
>>>
>>> -options_left true
>>>
>>> This will report all options parsed, and importantly, will
>>> also indicate if any options were unused.
>>>  
>>>
>>> Thanks
>>> Dave
>>>
>>> Thanks in advance
>>>
>>> Myriam
>>>
>>>
>>> Le 03/25/19 à 18:24, Matthew Knepley a écrit :
 On Mon, Mar 25, 2019 at 10:54 AM Myriam Peyrounette via
 petsc-users >>> > wrote:

 Hi,

 thanks for the explanations. I tried the last PETSc
 version (commit
 fbc5705bc518d02a4999f188aad4ccff5f754cbf), which
 includes the patch you talked about. But the memory
 scaling shows no improvement (see scaling
 attached), even when using the "scalable" options :(

 I had a look at the PETSc functions
 MatPtAPNumeric_MPIAIJ_MPIAIJ and
 MatPtAPSymbolic_MPIAIJ_MPIAIJ (especially at the
 differences before and after the first "bad"
 commit), but I can't find what induced this memory
 issue.

 Are you sure that the option was used? It just looks
 suspicious to me that they use exactly the same amount
 of memory. It should be different, even if it does not
 solve the problem.

    Thanks,

      Matt 

 Myriam




 Le 03/20/19 à 17:38, Fande Kong a écrit :
> Hi Myriam,
>
> There are three algorithms in PETSc to do PtAP
> ( const char          *algTypes[3] =
> {"scalable","nonscalable","hypre"};), and can be
> specified using the petsc options: -matptap_via .
>
> (1) -matptap_via hypre: This call the hypre
> package to do the PtAP trough an all-at-once
> triple product. In our experiences, it is the most
> memory efficient, but could be slow.
>
> (2)  -matptap_via scalable: This involves a
> row-wise 

Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is undefined in 3.10.4

2019-03-26 Thread Kun Jiao via petsc-users
[kjiao@hyi0016 src/lsqr]% make
[ 50%] Building CXX object lsqr/CMakeFiles/p_lsqr.dir/lsqr.cc.o
/wgdisk/hy3300/source_code_dev/imaging/kjiao/src/git/src/lsqr/lsqr.cc(318): 
error: identifier "MatCreateMPIAIJMKL" is undefined
ierr = 
MatCreateMPIAIJMKL(comm,m,n,M,N,maxnz,dialens,maxnz,offlens,);CHKERRQ(ierr);
   ^

/wgdisk/hy3300/source_code_dev/imaging/kjiao/src/git/src/lsqr/lsqr.cc(578): 
error: identifier "MatCreateMPIAIJMKL" is undefined
ierr = 
MatCreateMPIAIJMKL(comm,m,n,M,N,maxnz,dialens,maxnz,offlens,);CHKERRQ(ierr);
   ^

compilation aborted for 
/wgdisk/hy3300/source_code_dev/imaging/kjiao/src/git/src/lsqr/lsqr.cc (code 2)

Thanks.


From: Mark Adams 
Sent: Tuesday, March 26, 2019 9:22 AM
To: Kun Jiao 
Cc: petsc-users@mcs.anl.gov
Subject: Re: [Ext] Re: [petsc-users] error: identifier "MatCreateMPIAIJMKL" is 
undefined in 3.10.4

I assume the whole error message will have the line of code. Please send the 
whole error message and line of offending code if not included.

On Tue, Mar 26, 2019 at 10:08 AM Kun Jiao <kj...@slb.com> wrote:
It is compiling error, error message is:

error: identifier "MatCreateMPIAIJMKL" is undefined.





From: Mark Adams <mfad...@lbl.gov>
Sent: Tuesday, March 26, 2019 6:48 AM
To: Kun Jiao <kj...@slb.com>
Cc: petsc-users@mcs.anl.gov
Subject: [Ext] Re: [petsc-users] error: identifier "MatCreateMPIAIJMKL" is 
undefined in 3.10.4

Please send the output of the error (runtime, compile time, link time?)

On Mon, Mar 25, 2019 at 10:50 PM Kun Jiao via petsc-users <petsc-users@mcs.anl.gov> wrote:
Hi Petsc Experts,

Is MatCreateMPIAIJMKL retired in 3.10.4?

I got this error with my code which works fine in 3.8.3 version.

Regards,
Kun



Schlumberger-Private


Schlumberger-Private


Schlumberger-Private


Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is undefined in 3.10.4

2019-03-26 Thread Mark Adams via petsc-users
I assume the whole error message will have the line of code. Please send
the whole error message and line of offending code if not included.

On Tue, Mar 26, 2019 at 10:08 AM Kun Jiao  wrote:

> It is compiling error, error message is:
>
>
>
> error: identifier "MatCreateMPIAIJMKL" is undefined.
>
>
>
>
>
>
>
>
>
>
>
> *From:* Mark Adams 
> *Sent:* Tuesday, March 26, 2019 6:48 AM
> *To:* Kun Jiao 
> *Cc:* petsc-users@mcs.anl.gov
> *Subject:* [Ext] Re: [petsc-users] error: identifier "MatCreateMPIAIJMKL"
> is undefined in 3.10.4
>
>
>
> Please send the output of the error (runtime, compile time, link time?)
>
>
>
> On Mon, Mar 25, 2019 at 10:50 PM Kun Jiao via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
>
> Hi Petsc Experts,
>
>
>
> Is MatCreateMPIAIJMKL retired in 3.10.4?
>
>
>
> I got this error with my code which works fine in 3.8.3 version.
>
>
>
> Regards,
>
> Kun
>
>
>
>
>
> Schlumberger-Private
>
>
>
> Schlumberger-Private
>
>


Re: [petsc-users] [Ext] Re: error: identifier "MatCreateMPIAIJMKL" is undefined in 3.10.4

2019-03-26 Thread Kun Jiao via petsc-users
It is a compile error; the error message is:

error: identifier "MatCreateMPIAIJMKL" is undefined.





From: Mark Adams 
Sent: Tuesday, March 26, 2019 6:48 AM
To: Kun Jiao 
Cc: petsc-users@mcs.anl.gov
Subject: [Ext] Re: [petsc-users] error: identifier "MatCreateMPIAIJMKL" is 
undefined in 3.10.4

Please send the output of the error (runtime, compile time, link time?)

On Mon, Mar 25, 2019 at 10:50 PM Kun Jiao via petsc-users <petsc-users@mcs.anl.gov> wrote:
Hi Petsc Experts,

Is MatCreateMPIAIJMKL retired in 3.10.4?

I got this error with my code which works fine in 3.8.3 version.

Regards,
Kun



Schlumberger-Private


Schlumberger-Private


Re: [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-26 Thread Matthew Knepley via petsc-users
On Tue, Mar 26, 2019 at 9:27 AM Myriam Peyrounette <
myriam.peyroune...@idris.fr> wrote:

> I checked with -ksp_view (attached) but no prefix is associated with the
> matrix. Some are associated to the KSP and PC, but none to the Mat
>
Another thing that could prevent options being used is that
*SetFromOptions() is not called for the object.
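A generic sketch of that point (ksp, A, b and x stand in for the user's objects): options given on the command line are only attached to an object when its *SetFromOptions() is actually called, e.g.

  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);   /* without this call, -ksp_*, -pc_*, ... options are never applied */
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);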

  Thanks,

 Matt


> Le 03/26/19 à 11:55, Dave May a écrit :
>
>
>
> On Tue, 26 Mar 2019 at 10:36, Myriam Peyrounette <
> myriam.peyroune...@idris.fr> wrote:
>
>> Oh you were right, the three options are unsused (-matptap_via scalable,
>> -inner_offdiag_matmatmult_via scalable and -inner_diag_matmatmult_via
>> scalable). Does this mean I am not using the associated PtAP functions?
>>
>
> No - not necessarily. All it means is the options were not parsed.
>
> If your matrices have an option prefix associated with them (e.g. abc) ,
> then you need to provide the option as
>   -abc_matptap_via scalable
>
> If you are not sure if you matrices have a prefix, look at the result of
> -ksp_view (see below for an example)
>
>   Mat Object: 2 MPI processes
>
> type: mpiaij
>
> rows=363, cols=363, bs=3
>
> total: nonzeros=8649, allocated nonzeros=8649
>
> total number of mallocs used during MatSetValues calls =0
>
>   Mat Object: (B_) 2 MPI processes
>
> type: mpiaij
>
> rows=363, cols=363, bs=3
>
> total: nonzeros=8649, allocated nonzeros=8649
>
> total number of mallocs used during MatSetValues calls =0
>
> The first matrix has no options prefix, but the second does and it's
> called "B_".
>
>
>
>
>
>> Myriam
>>
>> Le 03/26/19 à 11:10, Dave May a écrit :
>>
>>
>> On Tue, 26 Mar 2019 at 09:52, Myriam Peyrounette via petsc-users <
>> petsc-users@mcs.anl.gov> wrote:
>>
>>> How can I be sure they are indeed used? Can I print this information in
>>> some log file?
>>>
>> Yes. Re-run the job with the command line option
>>
>> -options_left true
>>
>> This will report all options parsed, and importantly, will also indicate
>> if any options were unused.
>>
>>
>> Thanks
>> Dave
>>
>> Thanks in advance
>>>
>>> Myriam
>>>
>>> Le 03/25/19 à 18:24, Matthew Knepley a écrit :
>>>
>>> On Mon, Mar 25, 2019 at 10:54 AM Myriam Peyrounette via petsc-users <
>>> petsc-users@mcs.anl.gov> wrote:
>>>
 Hi,

 thanks for the explanations. I tried the last PETSc version (commit
 fbc5705bc518d02a4999f188aad4ccff5f754cbf), which includes the patch you
 talked about. But the memory scaling shows no improvement (see scaling
 attached), even when using the "scalable" options :(

 I had a look at the PETSc functions MatPtAPNumeric_MPIAIJ_MPIAIJ and
 MatPtAPSymbolic_MPIAIJ_MPIAIJ (especially at the differences before and
 after the first "bad" commit), but I can't find what induced this memory
 issue.

>>> Are you sure that the option was used? It just looks suspicious to me
>>> that they use exactly the same amount of memory. It should be different,
>>> even if it does not solve the problem.
>>>
>>>Thanks,
>>>
>>>  Matt
>>>
 Myriam




 Le 03/20/19 à 17:38, Fande Kong a écrit :

 Hi Myriam,

 There are three algorithms in PETSc to do PtAP ( const char
 *algTypes[3] = {"scalable","nonscalable","hypre"};), and the algorithm can
 be selected with the PETSc option -matptap_via.

 (1) -matptap_via hypre: This calls the hypre package to do the PtAP
 through an all-at-once triple product. In our experience, it is the most
 memory efficient, but could be slow.

 (2)  -matptap_via scalable: This involves a row-wise algorithm plus an
 outer product.  This will use more memory than hypre, but is way faster. This
 used to have a bug that could take all your memory, and I have a fix at
 https://bitbucket.org/petsc/petsc/pull-requests/1452/mpiptap-enable-large-scale-simulations/diff.
 When using this option, we may want to add the extra options
  -inner_offdiag_matmatmult_via scalable -inner_diag_matmatmult_via
 scalable  to select scalable inner algorithms.

 (3)  -matptap_via nonscalable:  Supposed to be even faster, but uses more
 memory. It does dense matrix operations.


 Thanks,

 Fande Kong
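
For reference, a sketch of where these options take effect (A and P are
assumed to be assembled parallel AIJ matrices, and the fill estimate of 2.0 is
a placeholder):

    Mat C;

    /* Form the triple product C = P^T * A * P. Which algorithm is used is
       selected at runtime with -matptap_via {scalable,nonscalable,hypre},
       prefixed if the matrix has an options prefix. */
    ierr = MatPtAP(A, P, MAT_INITIAL_MATRIX, 2.0, &C); CHKERRQ(ierr);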




 On Wed, Mar 20, 2019 at 10:06 AM Myriam Peyrounette via petsc-users <
 petsc-users@mcs.anl.gov> wrote:

> More precisely: something happens when upgrading the functions
> MatPtAPNumeric_MPIAIJ_MPIAIJ and/or MatPtAPSymbolic_MPIAIJ_MPIAIJ.
>
> Unfortunately, there are a lot of differences between the old and new
> versions of these functions. I keep investigating but if you have any 
> idea,
> please let me know.
>
> Best,
>
> Myriam
>
> Le 03/20/19 à 13:48, Myriam Peyrounette a écrit :
>
> Hi all,
>
> I used git bisect to determine when the memory need increased. I found
> that the first "bad" commit is   

Re: [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-26 Thread Myriam Peyrounette via petsc-users
I checked with -ksp_view (attached) but no prefix is associated with the
matrix. Some are associated to the KSP and PC, but none to the Mat.


Le 03/26/19 à 11:55, Dave May a écrit :
>
>
> On Tue, 26 Mar 2019 at 10:36, Myriam Peyrounette
> <myriam.peyroune...@idris.fr> wrote:
>
> Oh you were right, the three options are unused (-matptap_via
> scalable, -inner_offdiag_matmatmult_via scalable and
> -inner_diag_matmatmult_via scalable). Does this mean I am not
> using the associated PtAP functions?
>
>
> No - not necessarily. All it means is the options were not parsed. 
>
> If your matrices have an option prefix associated with them (e.g. abc),
> then you need to provide the option as
>   -abc_matptap_via scalable
>
> If you are not sure whether your matrices have a prefix, look at the result
> of -ksp_view (see below for an example).
>
>   Mat Object: 2 MPI processes
>
>     type: mpiaij
>
>     rows=363, cols=363, bs=3
>
>     total: nonzeros=8649, allocated nonzeros=8649
>
>     total number of mallocs used during MatSetValues calls =0
>
>   Mat Object: (B_) 2 MPI processes
>
>     type: mpiaij
>
>     rows=363, cols=363, bs=3
>
>     total: nonzeros=8649, allocated nonzeros=8649
>
>     total number of mallocs used during MatSetValues calls =0
>
>
> The first matrix has no options prefix, but the second does and it's
> called "B_".
>
>
>
>  
>
> Myriam
>
>
> Le 03/26/19 à 11:10, Dave May a écrit :
>>
>> On Tue, 26 Mar 2019 at 09:52, Myriam Peyrounette via petsc-users
>> <petsc-users@mcs.anl.gov> wrote:
>>
>> How can I be sure they are indeed used? Can I print this
>> information in some log file?
>>
>> Yes. Re-run the job with the command line option
>>
>> -options_left true
>>
>> This will report all options parsed, and importantly, will also
>> indicate if any options were unused.
>>  
>>
>> Thanks
>> Dave
>>
>> Thanks in advance
>>
>> Myriam
>>
>>
>> Le 03/25/19 à 18:24, Matthew Knepley a écrit :
>>> On Mon, Mar 25, 2019 at 10:54 AM Myriam Peyrounette via
>>> petsc-users <petsc-users@mcs.anl.gov> wrote:
>>>
>>> Hi,
>>>
>>> thanks for the explanations. I tried the last PETSc
>>> version (commit
>>> fbc5705bc518d02a4999f188aad4ccff5f754cbf), which
>>> includes the patch you talked about. But the memory
>>> scaling shows no improvement (see scaling attached),
>>> even when using the "scalable" options :(
>>>
>>> I had a look at the PETSc functions
>>> MatPtAPNumeric_MPIAIJ_MPIAIJ and
>>> MatPtAPSymbolic_MPIAIJ_MPIAIJ (especially at the
>>> differences before and after the first "bad" commit),
>>> but I can't find what induced this memory issue.
>>>
>>> Are you sure that the option was used? It just looks
>>> suspicious to me that they use exactly the same amount of
>>> memory. It should be different, even if it does not solve
>>> the problem.
>>>
>>>    Thanks,
>>>
>>>      Matt 
>>>
>>> Myriam
>>>
>>>
>>>
>>>
>>> Le 03/20/19 à 17:38, Fande Kong a écrit :
 Hi Myriam,

 There are three algorithms in PETSc to do PtAP ( const
 char          *algTypes[3] =
 {"scalable","nonscalable","hypre"};), and the algorithm
 can be selected with the PETSc option -matptap_via.

 (1) -matptap_via hypre: This calls the hypre package to
 do the PtAP through an all-at-once triple product. In
 our experience, it is the most memory efficient, but
 could be slow.

 (2)  -matptap_via scalable: This involves a row-wise
 algorithm plus an outer product.  This will use more
 memory than hypre, but is way faster. This used to have
 a bug that could take all your memory, and I have a fix
 at
 https://bitbucket.org/petsc/petsc/pull-requests/1452/mpiptap-enable-large-scale-simulations/diff.
 When using this option, we may want to add the extra
 options  -inner_offdiag_matmatmult_via scalable
 -inner_diag_matmatmult_via scalable  to select scalable
 inner algorithms.

 (3)  -matptap_via nonscalable:  Supposed to be even
 faster, but uses more memory. It does dense matrix
 operations.


 Thanks,

 Fande Kong




 On Wed, Mar 20, 2019 at 10:06 AM Myriam Peyrounette via
 petsc-users <petsc-users@mcs.anl.gov> wrote:

 More precisely: something happens when upgrading
 the 

Re: [petsc-users] error: identifier "MatCreateMPIAIJMKL" is undefined in 3.10.4

2019-03-26 Thread Mark Adams via petsc-users
Please send the output of the error (runtime, compile time, link time?)

On Mon, Mar 25, 2019 at 10:50 PM Kun Jiao via petsc-users <
petsc-users@mcs.anl.gov> wrote:

> Hi Petsc Experts,
>
>
>
> Is MatCreateMPIAIJMKL retired in 3.10.4?
>
>
>
> I got this error with my code which works fine in 3.8.3 version.
>
>
>
> Regards,
>
> Kun
>
>
>
> Schlumberger-Private
>


Re: [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-26 Thread Myriam Peyrounette via petsc-users
Oh you were right, the three options are unused (-matptap_via scalable,
-inner_offdiag_matmatmult_via scalable and -inner_diag_matmatmult_via
scalable). Does this mean I am not using the associated PtAP functions?

Myriam


Le 03/26/19 à 11:10, Dave May a écrit :
>
> On Tue, 26 Mar 2019 at 09:52, Myriam Peyrounette via petsc-users
> <petsc-users@mcs.anl.gov> wrote:
>
> How can I be sure they are indeed used? Can I print this
> information in some log file?
>
> Yes. Re-run the job with the command line option
>
> -options_left true
>
> This will report all options parsed, and importantly, will also
> indicate if any options were unused.
>  
>
> Thanks
> Dave
>
> Thanks in advance
>
> Myriam
>
>
> Le 03/25/19 à 18:24, Matthew Knepley a écrit :
>> On Mon, Mar 25, 2019 at 10:54 AM Myriam Peyrounette via
>> petsc-users <petsc-users@mcs.anl.gov> wrote:
>>
>> Hi,
>>
>> thanks for the explanations. I tried the last PETSc version
>> (commit fbc5705bc518d02a4999f188aad4ccff5f754cbf), which
>> includes the patch you talked about. But the memory scaling
>> shows no improvement (see scaling attached), even when using
>> the "scalable" options :(
>>
>> I had a look at the PETSc functions
>> MatPtAPNumeric_MPIAIJ_MPIAIJ and
>> MatPtAPSymbolic_MPIAIJ_MPIAIJ (especially at the differences
>> before and after the first "bad" commit), but I can't find
>> what induced this memory issue.
>>
>> Are you sure that the option was used? It just looks suspicious
>> to me that they use exactly the same amount of memory. It should
>> be different, even if it does not solve the problem.
>>
>>    Thanks,
>>
>>      Matt 
>>
>> Myriam
>>
>>
>>
>>
>> Le 03/20/19 à 17:38, Fande Kong a écrit :
>>> Hi Myriam,
>>>
>>> There are three algorithms in PETSc to do PtAP ( const char
>>>         *algTypes[3] = {"scalable","nonscalable","hypre"};),
>>> and the algorithm can be selected with the PETSc option -matptap_via.
>>>
>>> (1) -matptap_via hypre: This calls the hypre package to do
>>> the PtAP through an all-at-once triple product. In our
>>> experience, it is the most memory efficient, but could be slow.
>>>
>>> (2)  -matptap_via scalable: This involves a row-wise
>>> algorithm plus an outer product.  This will use more memory
>>> than hypre, but is way faster. This used to have a bug that
>>> could take all your memory, and I have a fix
>>> at
>>> https://bitbucket.org/petsc/petsc/pull-requests/1452/mpiptap-enable-large-scale-simulations/diff.
>>> When using this option, we may want to add the extra options
>>> -inner_offdiag_matmatmult_via scalable
>>> -inner_diag_matmatmult_via scalable to select scalable inner
>>> algorithms.
>>>
>>> (3)  -matptap_via nonscalable:  Supposed to be even faster,
>>> but uses more memory. It does dense matrix operations.
>>>
>>>
>>> Thanks,
>>>
>>> Fande Kong
>>>
>>>
>>>
>>>
>>> On Wed, Mar 20, 2019 at 10:06 AM Myriam Peyrounette via
>>> petsc-users <petsc-users@mcs.anl.gov> wrote:
>>>
>>> More precisely: something happens when upgrading the
>>> functions MatPtAPNumeric_MPIAIJ_MPIAIJ and/or
>>> MatPtAPSymbolic_MPIAIJ_MPIAIJ.
>>>
>>> Unfortunately, there are a lot of differences between
>>> the old and new versions of these functions. I keep
>>> investigating but if you have any idea, please let me know.
>>>
>>> Best,
>>>
>>> Myriam
>>>
>>>
>>> Le 03/20/19 à 13:48, Myriam Peyrounette a écrit :

 Hi all,

 I used git bisect to determine when the memory need
 increased. I found that the first "bad" commit is  
 aa690a28a7284adb519c28cb44eae20a2c131c85.

 Barry was right, this commit seems to be about an
 evolution of MatPtAPSymbolic_MPIAIJ_MPIAIJ. You
 mentioned the option "-matptap_via scalable" but I
 can't find any information about it. Can you tell me more?

 Thanks

 Myriam


 Le 03/11/19 à 14:40, Mark Adams a écrit :
> Is there a difference in memory usage on your tiny
> problem? I assume no.
>
> I don't see anything that could come from GAMG other
> than the RAP stuff that you have discussed already.
>
> On Mon, Mar 11, 2019 at 9:32 AM Myriam Peyrounette
> <myriam.peyroune...@idris.fr> wrote:
>
> The code I am using here is the example 42 of
> PETSc
> 

Re: [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-26 Thread Myriam Peyrounette via petsc-users
How can I be sure they are indeed used? Can I print this information in
some log file?

Thanks in advance

Myriam


Le 03/25/19 à 18:24, Matthew Knepley a écrit :
> On Mon, Mar 25, 2019 at 10:54 AM Myriam Peyrounette via petsc-users
> <petsc-users@mcs.anl.gov> wrote:
>
> Hi,
>
> thanks for the explanations. I tried the last PETSc version
> (commit fbc5705bc518d02a4999f188aad4ccff5f754cbf), which includes
> the patch you talked about. But the memory scaling shows no
> improvement (see scaling attached), even when using the "scalable"
> options :(
>
> I had a look at the PETSc functions MatPtAPNumeric_MPIAIJ_MPIAIJ
> and MatPtAPSymbolic_MPIAIJ_MPIAIJ (especially at the differences
> before and after the first "bad" commit), but I can't find what
> induced this memory issue.
>
> Are you sure that the option was used? It just looks suspicious to me
> that they use exactly the same amount of memory. It should be
> different, even if it does not solve the problem.
>
>    Thanks,
>
>      Matt 
>
> Myriam
>
>
>
>
> Le 03/20/19 à 17:38, Fande Kong a écrit :
>> Hi Myriam,
>>
>> There are three algorithms in PETSc to do PtAP ( const char
>>     *algTypes[3] = {"scalable","nonscalable","hypre"};), and the
>> algorithm can be selected with the PETSc option -matptap_via.
>>
>> (1) -matptap_via hypre: This calls the hypre package to do the
>> PtAP through an all-at-once triple product. In our experience, it
>> is the most memory efficient, but could be slow.
>>
>> (2)  -matptap_via scalable: This involves a row-wise algorithm
>> plus an outer product.  This will use more memory than hypre, but
>> is way faster. This used to have a bug that could take all your
>> memory, and I have a fix
>> at
>> https://bitbucket.org/petsc/petsc/pull-requests/1452/mpiptap-enable-large-scale-simulations/diff.
>> When using this option, we may want to add the extra options
>> -inner_offdiag_matmatmult_via scalable
>> -inner_diag_matmatmult_via scalable to select scalable inner
>> algorithms.
>>
>> (3)  -matptap_via nonscalable:  Supposed to be even faster, but
>> uses more memory. It does dense matrix operations.
>>
>>
>> Thanks,
>>
>> Fande Kong
>>
>>
>>
>>
>> On Wed, Mar 20, 2019 at 10:06 AM Myriam Peyrounette via
>> petsc-users <petsc-users@mcs.anl.gov> wrote:
>>
>> More precisely: something happens when upgrading the
>> functions MatPtAPNumeric_MPIAIJ_MPIAIJ and/or
>> MatPtAPSymbolic_MPIAIJ_MPIAIJ.
>>
>> Unfortunately, there are a lot of differences between the old
>> and new versions of these functions. I keep investigating but
>> if you have any idea, please let me know.
>>
>> Best,
>>
>> Myriam
>>
>>
>> Le 03/20/19 à 13:48, Myriam Peyrounette a écrit :
>>>
>>> Hi all,
>>>
>>> I used git bisect to determine when the memory need
>>> increased. I found that the first "bad" commit is  
>>> aa690a28a7284adb519c28cb44eae20a2c131c85.
>>>
>>> Barry was right, this commit seems to be about an evolution
>>> of MatPtAPSymbolic_MPIAIJ_MPIAIJ. You mentioned the option
>>> "-matptap_via scalable" but I can't find any information
>>> about it. Can you tell me more?
>>>
>>> Thanks
>>>
>>> Myriam
>>>
>>>
>>> Le 03/11/19 à 14:40, Mark Adams a écrit :
 Is there a difference in memory usage on your tiny problem?
 I assume no.

 I don't see anything that could come from GAMG other than
 the RAP stuff that you have discussed already.

 On Mon, Mar 11, 2019 at 9:32 AM Myriam Peyrounette
 <myriam.peyroune...@idris.fr> wrote:

 The code I am using here is the example 42 of PETSc
 
 (https://www.mcs.anl.gov/petsc/petsc-3.9/src/ksp/ksp/examples/tutorials/ex42.c.html).
 Indeed it solves the Stokes equation. I thought it was
 a good idea to use an example you might know (and
 didn't find any that uses GAMG functions). I just
 changed the PCMG setup so that the memory problem
 appears. And it appears when adding PCGAMG.

 I don't care about the performance or even the
 correctness of the result here, only about the
 difference in memory use between 3.6 and 3.10. Do you
 think finding a more suitable script would help?

 I used the threshold of 0.1 only once, at the
 beginning, to test its influence. I used the default
 threshold (of 0, I guess) for all the other runs.

 Myriam


 Le 03/11/19 à 13:52, Mark Adams a écrit :
> In looking at this larger scale 

Re: [petsc-users] Solving block systems with some null diagonal blocks

2019-03-26 Thread Manuel Colera Rico via petsc-users

OK, thank you Matt.

Manuel

---

On 3/25/19 6:27 PM, Matthew Knepley wrote:
On Mon, Mar 25, 2019 at 8:07 AM Manuel Colera Rico via petsc-users 
<petsc-users@mcs.anl.gov> wrote:


Hello,

I would like to solve an N*N block system (with N>2) in which some of the
diagonal blocks are null. My system matrix is defined as a MatNest. As N>2,
I can't use "pc_fieldsplit_type schur" nor "pc_fieldsplit_detect_saddle_point".
The other algorithms ("additive", "multiplicative" and
"symmetric_multiplicative") don't work either, as they need each A_ii to be
non-zero.

Is there any built-in function in PETSc for this? If not, could you
please suggest a workaround?


You can just shove all of the rows with nonzero diagonal in one field,
and all with zero diagonal in another, and do Schur. This is what

  -pc_fieldsplit_detect_saddle_point

does. However, you have to understand the Schur complement to solve it
efficiently. More generally, you can recursively split the matrix, which is
what I do for many multiphysics problems.
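
A rough sketch of that splitting done explicitly (the index sets
is_nonzero_diag and is_zero_diag are assumed to be built by the application
from the global row numbering of the MatNest, and ksp is the solver for the
full system):

    PC pc;

    ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
    ierr = PCSetType(pc, PCFIELDSPLIT); CHKERRQ(ierr);
    /* Field "0": rows whose diagonal block is nonzero; field "1": the rest. */
    ierr = PCFieldSplitSetIS(pc, "0", is_nonzero_diag); CHKERRQ(ierr);
    ierr = PCFieldSplitSetIS(pc, "1", is_zero_diag); CHKERRQ(ierr);
    /* Eliminate field 0 and solve a Schur complement in field 1. */
    ierr = PCFieldSplitSetType(pc, PC_COMPOSITE_SCHUR); CHKERRQ(ierr);

The same thing can be requested from the command line with -pc_type fieldsplit
-pc_fieldsplit_type schur once the two index sets are attached to the PC.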

  Thanks,

    Matt

Thanks and kind regards,

Manuel

---




--
What most experimenters take for granted before they begin their 
experiments is infinitely more interesting than any results to which 
their experiments lead.

-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/