Re: [petsc-users] Undefined symbols for architecture x86_64: "_dmviewfromoptions_",

2019-09-20 Thread Smith, Barry F. via petsc-users


  Oh yes, I didn't notice that. The stubs and interfaces cannot be generated 
automatically, but cut, paste, and make a mistake will work.



> On Sep 20, 2019, at 9:35 PM, Jed Brown  wrote:
> 
> "Smith, Barry F. via petsc-users"  writes:
> 
>>   Currently none of the XXXViewFromOptions() have manual pages or Fortran
>> stubs/interfaces. It is probably easier to remove them as inline functions
>> and instead write them as full functions that just call
>> PetscObjectViewFromOptions(), with manual pages; then the Fortran
>> stubs/interfaces will be built automatically.
> 
> PetscObjectViewFromOptions has a custom interface because it takes a string.
> 
> Fortran users could call that today, rather than wait for stubs to be
> written.



Re: [petsc-users] Undefined symbols for architecture x86_64: "_dmviewfromoptions_",

2019-09-20 Thread Jed Brown via petsc-users
"Smith, Barry F. via petsc-users"  writes:

>    Currently none of the XXXViewFromOptions() have manual pages or Fortran
> stubs/interfaces. It is probably easier to remove them as inline functions
> and instead write them as full functions that just call
> PetscObjectViewFromOptions(), with manual pages; then the Fortran
> stubs/interfaces will be built automatically.

PetscObjectViewFromOptions has a custom interface because it takes a string.

Fortran users could call that today, rather than wait for stubs to be
written.


Re: [petsc-users] Undefined symbols for architecture x86_64: "_dmviewfromoptions_",

2019-09-20 Thread Smith, Barry F. via petsc-users



   Currently none of the XXXViewFromOptions() have manual pages or Fortran
stubs/interfaces. It is probably easier to remove them as inline functions and
instead write them as full functions that just call
PetscObjectViewFromOptions(), with manual pages; then the Fortran
stubs/interfaces will be built automatically.
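As an illustration of this suggestion, a full (non-inline) version might look
roughly like the following sketch; this is not the actual PETSc source, and the
manual-page comment block that would normally sit above the function is omitted:

   PetscErrorCode DMViewFromOptions(DM A, PetscObject obj, const char name[])
   {
     PetscErrorCode ierr;

     PetscFunctionBegin;
     /* same body as the current static-inline definition in include/petscdm.h */
     ierr = PetscObjectViewFromOptions((PetscObject)A, obj, name);CHKERRQ(ierr);
     PetscFunctionReturn(0);
   }

With the function written this way (plus a manual page), the existing generators
could emit the Fortran stub and interface for it automatically.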

   Barry





> On Sep 20, 2019, at 6:16 PM, Mark Adams via petsc-users 
>  wrote:
> 
> DMViewFromOptions does not seem to have Fortran bindings and I don't see it 
> on the web page for DM methods.
> 
> I was able to get it to compile using PetscObjectViewFromOptions
> 
> FYI,
> It seems to be an inlined thing, thus missing the web page and Fortran 
> bindings:
> 
> include/petscdm.h:PETSC_STATIC_INLINE PetscErrorCode DMViewFromOptions(DM 
> A,PetscObject obj,const char name[]) {return 
> PetscObjectViewFromOptions((PetscObject)A,obj,name);}
> 
> 
> 
> 18:53 2 mark/feature-xgc-interface *+ 
> ~/Codes/petsc/src/dm/impls/plex/examples/tutorials$ make ex6f90
> /Users/markadams/homebrew/Cellar/mpich/3.3.1/bin/mpif90 
> -Wl,-multiply_defined,suppress -Wl,-multiply_defined -Wl,suppress 
> -Wl,-commons,use_dylibs -Wl,-search_paths_first -Wl,-no_compact_unwind  -Wall 
> -ffree-line-length-0 -Wno-unused-dummy-argument -g   
> -I/Users/markadams/Codes/petsc/include 
> -I/Users/markadams/Codes/petsc/arch-macosx-gnu-g/include -I/opt/X11/include 
> -I/Users/markadams/homebrew/Cellar/mpich/3.3.1/include  ex6f90.F90  
> -Wl,-rpath,/Users/markadams/Codes/petsc/arch-macosx-gnu-g/lib 
> -L/Users/markadams/Codes/petsc/arch-macosx-gnu-g/lib 
> -Wl,-rpath,/Users/markadams/Codes/petsc/arch-macosx-gnu-g/lib 
> -L/Users/markadams/Codes/petsc/arch-macosx-gnu-g/lib -Wl,-rpath,/opt/X11/lib 
> -L/opt/X11/lib -Wl,-rpath,/Users/markadams/homebrew/Cellar/mpich/3.3.1/lib 
> -L/Users/markadams/homebrew/Cellar/mpich/3.3.1/lib 
> -Wl,-rpath,/Users/markadams/homebrew/Cellar/gcc/9.1.0/lib/gcc/9/gcc/x86_64-apple-darwin18/9.1.0
>  
> -L/Users/markadams/homebrew/Cellar/gcc/9.1.0/lib/gcc/9/gcc/x86_64-apple-darwin18/9.1.0
>  -Wl,-rpath,/Users/markadams/homebrew/Cellar/gcc/9.1.0/lib/gcc/9 
> -L/Users/markadams/homebrew/Cellar/gcc/9.1.0/lib/gcc/9 -lpetsc -lHYPRE 
> -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lscalapack 
> -lsuperlu -lsuperlu_dist -lfftw3_mpi -lfftw3 -lp4est -lsc -llapack -lblas 
> -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lchaco -lparmetis -lmetis 
> -ltriangle -lz -lX11 -lctetgen -lc++ -ldl -lmpifort -lmpi -lpmpi -lgfortran 
> -lquadmath -lm -lc++ -ldl -o ex6f90
> Undefined symbols for architecture x86_64:
>   "_dmviewfromoptions_", referenced from:
>   _MAIN__ in ccALMXJ2.o
> ld: symbol(s) not found for architecture x86_64
> collect2: error: ld returned 1 exit status
> make: *** [ex6f90] Error 1



Re: [petsc-users] reproduced the problem

2019-09-20 Thread Balay, Satish via petsc-users
As the message says, you need to use the configure option --with-cxx-dialect=C++11
together with --download-superlu_dist.

[This requirement is automated in petsc/master, so the extra configure option is
no longer required.]

Satish

On Fri, 20 Sep 2019, Povolotskyi, Mykhailo via petsc-users wrote:

> Hello Satish,
> 
> I did what you suggested, now the error is different:
> 
> 
>      UNABLE to CONFIGURE with GIVEN OPTIONS    (see configure.log for 
> details):
> ---
> Cannot use SuperLU_DIST without enabling C++11, see --with-cxx-dialect=C++11
> ***
> 
> The updated configure.log is here:
> 
> https://www.dropbox.com/s/tmkksemu294j719/configure.log?dl=0
> 
> On 9/20/2019 4:32 PM, Balay, Satish wrote:
> > 
> > TEST checkRuntimeIssues from 
> > config.packages.BlasLapack(/depot/kildisha/apps/brown/nemo5/libs/petsc/build-real3.11/config/BuildSystem/config/packages/BlasLapack.py:579)
> > TESTING: checkRuntimeIssues from 
> > config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:579)
> >Determines if BLAS/LAPACK routines use 32 or 64 bit integers
> > Checking if BLAS/LAPACK routines use 32 or 64 bit integersExecuting: mpicc 
> > -c -o /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest.o 
> > -I/tmp/petsc-wf99X2/config.setCompilers 
> > -I/tmp/petsc-wf99X2/config.compilers 
> > -I/tmp/petsc-wf99X2/config.utilities.closure 
> > -I/tmp/petsc-wf99X2/config.headers 
> > -I/tmp/petsc-wf99X2/config.utilities.cacheDetails 
> > -I/tmp/petsc-wf99X2/config.atomics -I/tmp/petsc-wf99X2/config.libraries 
> > -I/tmp/petsc-wf99X2/config.functions 
> > -I/tmp/petsc-wf99X2/config.utilities.featureTestMacros 
> > -I/tmp/petsc-wf99X2/config.utilities.missing 
> > -I/tmp/petsc-wf99X2/config.types -I/tmp/petsc-wf99X2/config.packages.MPI 
> > -I/tmp/petsc-wf99X2/config.packages.valgrind 
> > -I/tmp/petsc-wf99X2/config.packages.pthread 
> > -I/tmp/petsc-wf99X2/config.packages.metis 
> > -I/tmp/petsc-wf99X2/config.packages.hdf5 
> > -I/tmp/petsc-wf99X2/config.packages.BlasLapack -fopenmp -fPIC  
> > /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest.c
> > Successful compile:
> > Source:
> > #include "confdefs.h"
> > #include "conffix.h"
> > #include 
> > #if STDC_HEADERS
> > #include 
> > #include 
> > #include 
> > #endif
> >
> > int main() {
> > FILE *output = fopen("runtimetestoutput","w");
> > extern double ddot_(const int*,const double*,const int *,const 
> > double*,const int*);
> >double x1mkl[4] = {3.0,5.0,7.0,9.0};
> >int one1mkl = 1,nmkl = 2;
> >double dotresultmkl = 0;
> >dotresultmkl = ddot_(&nmkl,x1mkl,&one1mkl,x1mkl,&one1mkl);
> >fprintf(output, 
> > "-known-64-bit-blas-indices=%d",dotresultmkl != 34);;
> >return 0;
> > }
> > Executing: mpicc  -o /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest  
> >  -fopenmp -fPIC /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest.o 
> > -L/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64
> >  -lmkl_intel_lp64 -lmkl_gnu_thread -lmkl_core -lm -lstdc++ -ldl 
> > -L/apps/brown/openmpi.20190215/2.1.6_gcc-5.2.0/lib -lmpi_usempif08 
> > -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm 
> > -L/apps/cent7/gcc/5.2.0/lib/gcc/x86_64-unknown-linux-gnu/5.2.0 
> > -L/apps/cent7/gcc/5.2.0/lib64 -L/apps/cent7/gcc/5.2.0/lib 
> > -Wl,-rpath,/apps/brown/openmpi.20190215/2.1.6_gcc-5.2.0/lib -lgfortran -lm 
> > -lgomp -lgcc_s -lquadmath -lpthread -lstdc++ -ldl
> > Testing executable /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest to 
> > see if it can be run
> > Executing: /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest
> > Executing: /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest
> > ERROR while running executable: Could not execute 
> > "['/tmp/petsc-wf99X2/config.packages.BlasLapack/conftest']":
> > /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest: error while loading 
> > shared libraries: libmkl_intel_lp64.so: cannot open shared object file: No 
> > such file or directory
> >
> >  Defined "HAVE_64BIT_BLAS_INDICES" to "1"
> > Checking for 64 bit blas indices: program did not return therefor assuming 
> > 64 bit blas indices
> >  Defined "HAVE_LIBMKL_INTEL_ILP64" to "1"
> >
> > 
> >
> > So this test has an error but yet the flag HAVE_64BIT_BLAS_INDICES is set.
> >
> > Is your compiler not returning correct error codes?
> >
> > Does it make a difference if you also specify -Wl,-rpath along with -L in 
> > --with-blaslapack-lib option?
> >
> >
> > Satish
> >
> > On Fri, 20 Sep 2019, Povolotskyi, Mykhailo wrote:
> >
> >> Dear Matthew and Satish,
> >>
> >> I just wrote that the error disappeared, but it still exists (I had to
> >> wait longer).
> >>
> >> The configuration log 

[petsc-users] Undefined symbols for architecture x86_64: "_dmviewfromoptions_",

2019-09-20 Thread Mark Adams via petsc-users
DMViewFromOptions does not seem to have Fortran bindings and I don't see it
on the web page for DM methods.

I was able to get it to compile using PetscObjectViewFromOptions

FYI,
It seems to be an inlined thing, thus missing the web page and Fortran
bindings:

include/petscdm.h:PETSC_STATIC_INLINE PetscErrorCode DMViewFromOptions(DM
A,PetscObject obj,const char name[]) {return
PetscObjectViewFromOptions((PetscObject)A,obj,name);}



18:53 2 mark/feature-xgc-interface *+
~/Codes/petsc/src/dm/impls/plex/examples/tutorials$ make ex6f90
/Users/markadams/homebrew/Cellar/mpich/3.3.1/bin/mpif90
-Wl,-multiply_defined,suppress -Wl,-multiply_defined -Wl,suppress
-Wl,-commons,use_dylibs -Wl,-search_paths_first -Wl,-no_compact_unwind
 -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g
-I/Users/markadams/Codes/petsc/include
-I/Users/markadams/Codes/petsc/arch-macosx-gnu-g/include -I/opt/X11/include
-I/Users/markadams/homebrew/Cellar/mpich/3.3.1/include  ex6f90.F90
 -Wl,-rpath,/Users/markadams/Codes/petsc/arch-macosx-gnu-g/lib
-L/Users/markadams/Codes/petsc/arch-macosx-gnu-g/lib
-Wl,-rpath,/Users/markadams/Codes/petsc/arch-macosx-gnu-g/lib
-L/Users/markadams/Codes/petsc/arch-macosx-gnu-g/lib
-Wl,-rpath,/opt/X11/lib -L/opt/X11/lib
-Wl,-rpath,/Users/markadams/homebrew/Cellar/mpich/3.3.1/lib
-L/Users/markadams/homebrew/Cellar/mpich/3.3.1/lib
-Wl,-rpath,/Users/markadams/homebrew/Cellar/gcc/9.1.0/lib/gcc/9/gcc/x86_64-apple-darwin18/9.1.0
-L/Users/markadams/homebrew/Cellar/gcc/9.1.0/lib/gcc/9/gcc/x86_64-apple-darwin18/9.1.0
-Wl,-rpath,/Users/markadams/homebrew/Cellar/gcc/9.1.0/lib/gcc/9
-L/Users/markadams/homebrew/Cellar/gcc/9.1.0/lib/gcc/9 -lpetsc -lHYPRE
-lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lscalapack
-lsuperlu -lsuperlu_dist -lfftw3_mpi -lfftw3 -lp4est -lsc -llapack -lblas
-lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lchaco -lparmetis -lmetis
-ltriangle -lz -lX11 -lctetgen -lc++ -ldl -lmpifort -lmpi -lpmpi -lgfortran
-lquadmath -lm -lc++ -ldl -o ex6f90
Undefined symbols for architecture x86_64:
  "_dmviewfromoptions_", referenced from:
  _MAIN__ in ccALMXJ2.o
ld: symbol(s) not found for architecture x86_64
collect2: error: ld returned 1 exit status
make: *** [ex6f90] Error 1


Re: [petsc-users] reproduced the problem

2019-09-20 Thread Povolotskyi, Mykhailo via petsc-users
Hello Satish,

I did what you suggested, now the error is different:


     UNABLE to CONFIGURE with GIVEN OPTIONS    (see configure.log for 
details):
---
Cannot use SuperLU_DIST without enabling C++11, see --with-cxx-dialect=C++11
***

The updated configure.log is here:

https://www.dropbox.com/s/tmkksemu294j719/configure.log?dl=0

On 9/20/2019 4:32 PM, Balay, Satish wrote:
> 
> TEST checkRuntimeIssues from 
> config.packages.BlasLapack(/depot/kildisha/apps/brown/nemo5/libs/petsc/build-real3.11/config/BuildSystem/config/packages/BlasLapack.py:579)
> TESTING: checkRuntimeIssues from 
> config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:579)
>Determines if BLAS/LAPACK routines use 32 or 64 bit integers
> Checking if BLAS/LAPACK routines use 32 or 64 bit integersExecuting: mpicc -c 
> -o /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest.o 
> -I/tmp/petsc-wf99X2/config.setCompilers -I/tmp/petsc-wf99X2/config.compilers 
> -I/tmp/petsc-wf99X2/config.utilities.closure 
> -I/tmp/petsc-wf99X2/config.headers 
> -I/tmp/petsc-wf99X2/config.utilities.cacheDetails 
> -I/tmp/petsc-wf99X2/config.atomics -I/tmp/petsc-wf99X2/config.libraries 
> -I/tmp/petsc-wf99X2/config.functions 
> -I/tmp/petsc-wf99X2/config.utilities.featureTestMacros 
> -I/tmp/petsc-wf99X2/config.utilities.missing -I/tmp/petsc-wf99X2/config.types 
> -I/tmp/petsc-wf99X2/config.packages.MPI 
> -I/tmp/petsc-wf99X2/config.packages.valgrind 
> -I/tmp/petsc-wf99X2/config.packages.pthread 
> -I/tmp/petsc-wf99X2/config.packages.metis 
> -I/tmp/petsc-wf99X2/config.packages.hdf5 
> -I/tmp/petsc-wf99X2/config.packages.BlasLapack -fopenmp -fPIC  
> /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest.c
> Successful compile:
> Source:
> #include "confdefs.h"
> #include "conffix.h"
> #include 
> #if STDC_HEADERS
> #include 
> #include 
> #include 
> #endif
>
> int main() {
> FILE *output = fopen("runtimetestoutput","w");
> extern double ddot_(const int*,const double*,const int *,const double*,const 
> int*);
>double x1mkl[4] = {3.0,5.0,7.0,9.0};
>int one1mkl = 1,nmkl = 2;
>double dotresultmkl = 0;
>dotresultmkl = ddot_(&nmkl,x1mkl,&one1mkl,x1mkl,&one1mkl);
>fprintf(output, 
> "-known-64-bit-blas-indices=%d",dotresultmkl != 34);;
>return 0;
> }
> Executing: mpicc  -o /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest   
> -fopenmp -fPIC /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest.o 
> -L/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64 
> -lmkl_intel_lp64 -lmkl_gnu_thread -lmkl_core -lm -lstdc++ -ldl 
> -L/apps/brown/openmpi.20190215/2.1.6_gcc-5.2.0/lib -lmpi_usempif08 
> -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm 
> -L/apps/cent7/gcc/5.2.0/lib/gcc/x86_64-unknown-linux-gnu/5.2.0 
> -L/apps/cent7/gcc/5.2.0/lib64 -L/apps/cent7/gcc/5.2.0/lib 
> -Wl,-rpath,/apps/brown/openmpi.20190215/2.1.6_gcc-5.2.0/lib -lgfortran -lm 
> -lgomp -lgcc_s -lquadmath -lpthread -lstdc++ -ldl
> Testing executable /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest to 
> see if it can be run
> Executing: /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest
> Executing: /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest
> ERROR while running executable: Could not execute 
> "['/tmp/petsc-wf99X2/config.packages.BlasLapack/conftest']":
> /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest: error while loading 
> shared libraries: libmkl_intel_lp64.so: cannot open shared object file: No 
> such file or directory
>
>  Defined "HAVE_64BIT_BLAS_INDICES" to "1"
> Checking for 64 bit blas indices: program did not return therefor assuming 64 
> bit blas indices
>  Defined "HAVE_LIBMKL_INTEL_ILP64" to "1"
>
> 
>
> So this test has an error but yet the flag HAVE_64BIT_BLAS_INDICES is set.
>
> Is your compiler not returning correct error codes?
>
> Does it make a difference if you also specify -Wl,-rpath along with -L in 
> --with-blaslapack-lib option?
>
>
> Satish
>
> On Fri, 20 Sep 2019, Povolotskyi, Mykhailo wrote:
>
>> Dear Matthew and Satish,
>>
>> I just wrote that the error disappeared, but it still exists (I had to
>> wait longer).
>>
>> The configuration log can be accessed here:
>>
>> https://www.dropbox.com/s/tmkksemu294j719/configure.log?dl=0
>>
>> Sorry for the last e-mail.
>>
>> Michael.
>>
>>
>> On 09/20/2019 03:53 PM, Balay, Satish wrote:
>>> --with-64-bit-indices=1 => PetscInt = int64_t
>>> --known-64-bit-blas-indices=1 => blas specified uses 64bit indices.
>>>
>>> What is your requirement (use case)?
>>>
>>> Satish
>>>
>>> On Fri, 20 Sep 2019, Povolotskyi, Mykhailo via petsc-users wrote:
>>>
 Does it mean I have to configure petsc with 

Re: [petsc-users] Reading in the full matrix in one process and then trying to solve in parallel with PETSc

2019-09-20 Thread Jed Brown via petsc-users
Matthew Knepley via petsc-users  writes:

> On Fri, Sep 20, 2019 at 7:54 AM Bao Kai via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
>
>> Hi,
>>
>> I understand that PETSc is not designed to be used this way, but I
>> am wondering if someone has done something similar to this.
>>
>> We have the full matrix from a simulation and the rhs vector. We would
>> like to read them in through PETSc in one process, then use some
>> partition functions to partition the matrix.
>>
>> Based on the partition information, we redistribute the matrix among
>> the processes and then solve it in parallel.  It is for testing the
>> performance of some parallel linear solvers and preconditioners.
>>
>> We are not yet in a position to develop a full parallel implementation
>> of the simulator.

An alternative is to assemble a Mat living on a parallel communicator,
but with all entries on rank 0 (so just call your serial code to build
the matrix).  You can do the same for your vector, then KSPSolve.  To
make the solver parallel, just use run-time options:

 -ksp_type preonly -pc_type redistribute

will redistribute automatically inside the solver and return the
solution vector to you on rank 0.  You can control the inner solver via
prefix

 -redistribute_ksp_type gmres -redistribute_pc_type gamg 
-redistribute_ksp_monitor
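
A minimal sketch of this approach (the global size N, the AIJ preallocation, and
the serial-assembly placeholders are illustrative assumptions, not from this thread):

  #include <petscksp.h>

  int main(int argc, char **argv)
  {
    PetscErrorCode ierr;
    PetscMPIInt    rank;
    PetscInt       N = 100;                  /* illustrative global size */
    Mat            A;
    Vec            b, x;
    KSP            ksp;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
    ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);

    /* Matrix lives on the parallel communicator, but rank 0 owns all rows */
    ierr = MatCreateAIJ(PETSC_COMM_WORLD, rank ? 0 : N, rank ? 0 : N, N, N,
                        3, NULL, 0, NULL, &A);CHKERRQ(ierr);
    if (!rank) {
      /* ... existing serial code fills A here with MatSetValues() ... */
    }
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

    ierr = MatCreateVecs(A, &x, &b);CHKERRQ(ierr);
    /* ... fill b on rank 0 the same way, then VecAssemblyBegin/End(b) ... */

    ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* picks up -ksp_type preonly -pc_type redistribute */
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);    /* x comes back with the same layout: all on rank 0 */

    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
    ierr = VecDestroy(&x);CHKERRQ(ierr);
    ierr = VecDestroy(&b);CHKERRQ(ierr);
    ierr = MatDestroy(&A);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }

Run with, e.g., mpiexec -n 4 ./ex -ksp_type preonly -pc_type redistribute; the
redistribution then happens inside KSPSolve().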


Re: [petsc-users] reproduced the problem

2019-09-20 Thread Balay, Satish via petsc-users
>

TEST checkRuntimeIssues from 
config.packages.BlasLapack(/depot/kildisha/apps/brown/nemo5/libs/petsc/build-real3.11/config/BuildSystem/config/packages/BlasLapack.py:579)
TESTING: checkRuntimeIssues from 
config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:579)
  Determines if BLAS/LAPACK routines use 32 or 64 bit integers
Checking if BLAS/LAPACK routines use 32 or 64 bit integersExecuting: mpicc -c 
-o /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest.o 
-I/tmp/petsc-wf99X2/config.setCompilers -I/tmp/petsc-wf99X2/config.compilers 
-I/tmp/petsc-wf99X2/config.utilities.closure -I/tmp/petsc-wf99X2/config.headers 
-I/tmp/petsc-wf99X2/config.utilities.cacheDetails 
-I/tmp/petsc-wf99X2/config.atomics -I/tmp/petsc-wf99X2/config.libraries 
-I/tmp/petsc-wf99X2/config.functions 
-I/tmp/petsc-wf99X2/config.utilities.featureTestMacros 
-I/tmp/petsc-wf99X2/config.utilities.missing -I/tmp/petsc-wf99X2/config.types 
-I/tmp/petsc-wf99X2/config.packages.MPI 
-I/tmp/petsc-wf99X2/config.packages.valgrind 
-I/tmp/petsc-wf99X2/config.packages.pthread 
-I/tmp/petsc-wf99X2/config.packages.metis 
-I/tmp/petsc-wf99X2/config.packages.hdf5 
-I/tmp/petsc-wf99X2/config.packages.BlasLapack -fopenmp -fPIC  
/tmp/petsc-wf99X2/config.packages.BlasLapack/conftest.c 
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include 
#if STDC_HEADERS
#include 
#include 
#include 
#endif

int main() {
FILE *output = fopen("runtimetestoutput","w");
extern double ddot_(const int*,const double*,const int *,const double*,const 
int*);
  double x1mkl[4] = {3.0,5.0,7.0,9.0};
  int one1mkl = 1,nmkl = 2;
  double dotresultmkl = 0;
  dotresultmkl = ddot_(&nmkl,x1mkl,&one1mkl,x1mkl,&one1mkl);
  fprintf(output, "-known-64-bit-blas-indices=%d",dotresultmkl 
!= 34);;
  return 0;
}
Executing: mpicc  -o /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest   
-fopenmp -fPIC /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest.o 
-L/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64 
-lmkl_intel_lp64 -lmkl_gnu_thread -lmkl_core -lm -lstdc++ -ldl 
-L/apps/brown/openmpi.20190215/2.1.6_gcc-5.2.0/lib -lmpi_usempif08 
-lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm 
-L/apps/cent7/gcc/5.2.0/lib/gcc/x86_64-unknown-linux-gnu/5.2.0 
-L/apps/cent7/gcc/5.2.0/lib64 -L/apps/cent7/gcc/5.2.0/lib 
-Wl,-rpath,/apps/brown/openmpi.20190215/2.1.6_gcc-5.2.0/lib -lgfortran -lm 
-lgomp -lgcc_s -lquadmath -lpthread -lstdc++ -ldl 
Testing executable /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest to see 
if it can be run
Executing: /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest
Executing: /tmp/petsc-wf99X2/config.packages.BlasLapack/conftest
ERROR while running executable: Could not execute 
"['/tmp/petsc-wf99X2/config.packages.BlasLapack/conftest']":
/tmp/petsc-wf99X2/config.packages.BlasLapack/conftest: error while loading 
shared libraries: libmkl_intel_lp64.so: cannot open shared object file: No such 
file or directory

Defined "HAVE_64BIT_BLAS_INDICES" to "1"
Checking for 64 bit blas indices: program did not return therefor assuming 64 
bit blas indices
Defined "HAVE_LIBMKL_INTEL_ILP64" to "1"



So this test has an error but yet the flag HAVE_64BIT_BLAS_INDICES is set.

Is your compiler not returning correct error codes?

Does it make a difference if you also specify -Wl,-rpath along with -L in 
--with-blaslapack-lib option?


Satish

On Fri, 20 Sep 2019, Povolotskyi, Mykhailo wrote:

> Dear Matthew and Satish,
> 
> I just wrote that the error disappeared, but it still exists (I had to 
> wait longer).
> 
> The configuration log can be accessed here:
> 
> https://www.dropbox.com/s/tmkksemu294j719/configure.log?dl=0
> 
> Sorry for the last e-mail.
> 
> Michael.
> 
> 
> On 09/20/2019 03:53 PM, Balay, Satish wrote:
> > --with-64-bit-indices=1 => PetscInt = int64_t
> > --known-64-bit-blas-indices=1 => blas specified uses 64bit indices.
> >
> > What is your requirement (use case)?
> >
> > Satish
> >
> > On Fri, 20 Sep 2019, Povolotskyi, Mykhailo via petsc-users wrote:
> >
> >> Does it mean I have to configure petsc with --with-64-bit-indices=1 ?
> >>
> >> On 09/20/2019 03:41 PM, Matthew Knepley wrote:
> >> On Fri, Sep 20, 2019 at 1:55 PM Povolotskyi, Mykhailo via petsc-users 
> >> mailto:petsc-users@mcs.anl.gov>> wrote:
> >> Hello,
> >>
> >> I'm upgrading petsc from 3.8 to 3.11.
> >>
> >> In doing so, I see an error message:
> >>
> >>UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for 
> >> details):
> >> ---
> >> Cannot use SuperLU_DIST with 64 bit BLAS/Lapack indices
> >> ***
> >>
> >> I wonder why this configuration step worked well for 3.8?  

[petsc-users] reproduced the problem

2019-09-20 Thread Povolotskyi, Mykhailo via petsc-users
Dear Matthew and Satish,

I just wrote that the error disappeared, but it still exists (I had to 
wait longer).

The configuration log can be accessed here:

https://www.dropbox.com/s/tmkksemu294j719/configure.log?dl=0

Sorry for the last e-mail.

Michael.


On 09/20/2019 03:53 PM, Balay, Satish wrote:
> --with-64-bit-indices=1 => PetscInt = int64_t
> --known-64-bit-blas-indices=1 => blas specified uses 64bit indices.
>
> What is your requirement (use case)?
>
> Satish
>
> On Fri, 20 Sep 2019, Povolotskyi, Mykhailo via petsc-users wrote:
>
>> Does it mean I have to configure petsc with --with-64-bit-indices=1 ?
>>
>> On 09/20/2019 03:41 PM, Matthew Knepley wrote:
>> On Fri, Sep 20, 2019 at 1:55 PM Povolotskyi, Mykhailo via petsc-users 
>> mailto:petsc-users@mcs.anl.gov>> wrote:
>> Hello,
>>
>> I'm upgrading petsc from 3.8 to 3.11.
>>
>> In doing so, I see an error message:
>>
>>UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for details):
>> ---
>> Cannot use SuperLU_DIST with 64 bit BLAS/Lapack indices
>> ***
>>
>> I wonder why this configuration step worked well for 3.8?  I did not
>> change anything else but version of petsc.
>>
>> This never worked. We are just checking now.
>>
>>Thanks,
>>
>>  Matt
>>
>> Thank you,
>>
>> Michael.
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their experiments 
>> is infinitely more interesting than any results to which their experiments 
>> lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>>



Re: [petsc-users] question about installing petsc3.11

2019-09-20 Thread Povolotskyi, Mykhailo via petsc-users
I have to apologize.

By mistake I was installing the new version in the directory where the 
old version already existed. After I cleaned everything, I do not see 
that error message anymore.

Yes, the error message was somewhat misleading, but I will not be able 
to reproduce it.

Michael.



On 09/20/2019 03:53 PM, Balay, Satish wrote:
> --with-64-bit-indices=1 => PetscInt = int64_t
> --known-64-bit-blas-indices=1 => blas specified uses 64bit indices.
>
> What is your requirement (use case)?
>
> Satish
>
> On Fri, 20 Sep 2019, Povolotskyi, Mykhailo via petsc-users wrote:
>
>> Does it mean I have to configure petsc with --with-64-bit-indices=1 ?
>>
>> On 09/20/2019 03:41 PM, Matthew Knepley wrote:
>> On Fri, Sep 20, 2019 at 1:55 PM Povolotskyi, Mykhailo via petsc-users 
>> mailto:petsc-users@mcs.anl.gov>> wrote:
>> Hello,
>>
>> I'm upgrading petsc from 3.8 to 3.11.
>>
>> In doing so, I see an error message:
>>
>>UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for details):
>> ---
>> Cannot use SuperLU_DIST with 64 bit BLAS/Lapack indices
>> ***
>>
>> I wonder why this configuration step worked well for 3.8?  I did not
>> change anything else but version of petsc.
>>
>> This never worked. We are just checking now.
>>
>>Thanks,
>>
>>  Matt
>>
>> Thank you,
>>
>> Michael.
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their experiments 
>> is infinitely more interesting than any results to which their experiments 
>> lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>>



Re: [petsc-users] question about installing petsc3.11

2019-09-20 Thread Balay, Satish via petsc-users
--with-64-bit-indices=1 => PetscInt = int64_t
--known-64-bit-blas-indices=1 => the specified BLAS/LAPACK uses 64-bit indices.

What is your requirement (use case)?

Satish
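
For illustration (an added sketch, not part of the original message): the two options
control two different integer types, PetscInt for PETSc's own matrix/vector indices
and PetscBLASInt for the integers PETSc passes to BLAS/LAPACK. A tiny program can
print both widths:

  #include <petscsys.h>
  #include <stdio.h>

  int main(int argc, char **argv)
  {
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
    /* --with-64-bit-indices affects PetscInt; --known-64-bit-blas-indices affects PetscBLASInt */
    printf("sizeof(PetscInt) = %d, sizeof(PetscBLASInt) = %d\n",
           (int)sizeof(PetscInt), (int)sizeof(PetscBLASInt));
    ierr = PetscFinalize();
    return ierr;
  }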

On Fri, 20 Sep 2019, Povolotskyi, Mykhailo via petsc-users wrote:

> Does it mean I have to configure petsc with --with-64-bit-indices=1 ?
> 
> On 09/20/2019 03:41 PM, Matthew Knepley wrote:
> On Fri, Sep 20, 2019 at 1:55 PM Povolotskyi, Mykhailo via petsc-users 
> mailto:petsc-users@mcs.anl.gov>> wrote:
> Hello,
> 
> I'm upgrading petsc from 3.8 to 3.11.
> 
> In doing so, I see an error message:
> 
>   UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for details):
> ---
> Cannot use SuperLU_DIST with 64 bit BLAS/Lapack indices
> ***
> 
> I wonder why this configuration step worked well for 3.8?  I did not
> change anything else but version of petsc.
> 
> This never worked. We are just checking now.
> 
>   Thanks,
> 
> Matt
> 
> Thank you,
> 
> Michael.
> 
> 
> 
> --
> What most experimenters take for granted before they begin their experiments 
> is infinitely more interesting than any results to which their experiments 
> lead.
> -- Norbert Wiener
> 
> https://www.cse.buffalo.edu/~knepley/
> 
> 



Re: [petsc-users] question about installing petsc3.11

2019-09-20 Thread Povolotskyi, Mykhailo via petsc-users
Does it mean I have to configure petsc with --with-64-bit-indices=1 ?

On 09/20/2019 03:41 PM, Matthew Knepley wrote:
On Fri, Sep 20, 2019 at 1:55 PM Povolotskyi, Mykhailo via petsc-users 
mailto:petsc-users@mcs.anl.gov>> wrote:
Hello,

I'm upgrading petsc from 3.8 to 3.11.

In doing so, I see an error message:

  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for details):
---
Cannot use SuperLU_DIST with 64 bit BLAS/Lapack indices
***

I wonder why this configuration step worked well for 3.8?  I did not
change anything else but version of petsc.

This never worked. We are just checking now.

  Thanks,

Matt

Thank you,

Michael.



--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/



[petsc-users] question about installing petsc3.11

2019-09-20 Thread Povolotskyi, Mykhailo via petsc-users
Hello,

I'm upgrading petsc from 3.8 to 3.11.

In doing so, I see an error message:

  UNABLE to CONFIGURE with GIVEN OPTIONS    (see configure.log for details):
---
Cannot use SuperLU_DIST with 64 bit BLAS/Lapack indices
***

I wonder why this configuration step worked well for 3.8. I did not
change anything other than the version of petsc.

Thank you,

Michael.



Re: [petsc-users] question about MatCreateRedundantMatrix

2019-09-20 Thread Jose E. Roman via petsc-users
I have tried with slepc-master and it works:

$ mpiexec -n 2 ./ex1 -eps_ciss_partitions 2
matrix size 774
(-78.7875,8.8022)
(-73.9569,-42.2401)
(-66.9942,-7.50907)
(-62.262,-2.71603)
(-58.9716,0.60)
(-57.9883,0.298729)
(-57.8323,1.06041)
(-56.5317,1.10758)
(-56.0234,45.2405)
(-54.4058,2.88373)
(-25.946,26.0317)
(-23.5383,-16.9096)
(-19.0999,0.194467)
(-18.795,1.15113)
(-15.3051,0.915914)
(-14.803,-0.00475538)
(-8.52467,10.6032)
(-4.36051,2.29996)
(-0.525758,0.796658)
(1.41227,0.112858)
(1.53801,0.446984)
(9.43357,0.505277)

slepc-master will become version 3.12 in a few days. I have not tried with 3.11 
but I think it should work.

It is always recommended to use the latest version. Version 3.8 is two years 
old.

Jose


> On 19 Sep 2019, at 20:33, Povolotskyi, Mykhailo
> wrote:
> 
> Hong,
> 
> do you have in mind a reason why the newer version should work or is it a 
> general recommendation?
> 
> Which stable version would you recommend to upgrade to?
> 
> Thank you,
> 
> Michael.
> 
> 
> On 09/19/2019 02:22 PM, Zhang, Hong wrote:
>> Michael,
>> 
>> --
>> [0]PETSC ERROR: No support for this operation for this object type
>> [0]PETSC ERROR: Mat type seqdense
>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
>> for trouble shooting.
>> [0]PETSC ERROR: Petsc Release Version 3.8.4, Mar, 24, 2018
>> 
>> This is an old version of  Petsc. Can you update to the latest Petsc release?
>> Hong
>> 
>> 
>> On 09/19/2019 04:55 AM, Jose E. Roman wrote:
>> > Michael,
>> >
>> > In my previous email I should have checked it better. The CISS solver 
>> > works indeed with dense matrices:
>> >
>> > $ mpiexec -n 2 ./ex2 -n 30 -eps_type ciss -terse -rg_type ellipse 
>> > -rg_ellipse_center 1.175 -rg_ellipse_radius 0.075 -eps_ciss_partitions 2 
>> > -mat_type dense
>> >
>> > 2-D Laplacian Eigenproblem, N=900 (30x30 grid)
>> >
>> >   Solution method: ciss
>> >
>> >   Number of requested eigenvalues: 1
>> >   Found 15 eigenvalues, all of them computed up to the required tolerance:
>> >   1.10416, 1.10416, 1.10455, 1.10455, 1.12947, 1.12947, 1.13426, 
>> > 1.13426,
>> >   1.16015, 1.16015, 1.19338, 1.19338, 1.21093, 1.21093, 1.24413
>> >
>> >
>> > There might be something different in the way matrices are initialized in 
>> > your code. Send me a simple example that reproduces the problem and I will 
>> > track it down.
>> >
>> > Sorry for the confusion.
>> > Jose
>> >
>> >
>> >
>> >> On 19 Sep 2019, at 6:20, hong--- via petsc-users
>> >> wrote:
>> >>
>> >> Michael,
>> >> We have support of MatCreateRedundantMatrix for dense matrices. For 
>> >> example, petsc/src/mat/examples/tests/ex9.c:
>> >> mpiexec -n 4 ./ex9 -mat_type dense -view_mat -nsubcomms 2
>> >>
>> >> Hong
>> >>
>> >> On Wed, Sep 18, 2019 at 5:40 PM Povolotskyi, Mykhailo via petsc-users 
>> >>  wrote:
>> >> Dear Petsc developers,
>> >>
>> >> I found that MatCreateRedundantMatrix does not support dense matrices.
>> >>
>> >> This causes the following problem: I cannot use the CISS eigensolver from
>> >> SLEPc with dense matrices when parallelizing over quadrature points.
>> >>
>> >> Is it possible for you to add this support?
>> >>
>> >> Thank you,
>> >>
>> >> Michael.
>> >>
>> >>
>> >> p.s. I apologize if you received this e-mail twice, I sent it first from
>> >> a different address.
>> >>
>> 
>