Hi

I use two MPI clusters (cluster 1 and cluster 2). The PETSc binary files I 
generate can be read back on cluster 1, but reading them on cluster 2 gives 
the errors below. I also write a .vts file alongside each binary output, and 
judging by those, both clusters produce meaningful results. Cluster 1 runs 
PETSc 3.7.4 and cluster 2 runs 3.7.5. Also, the same simulation produces 
binary files of slightly different sizes on the two clusters. Can you comment 
on what I need to do to be able to read the binary files on cluster 2?
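
Since the file sizes already differ between the clusters, I can dump the raw 
header of a file from each of them. If I understand the format correctly, 
PETSc binary files are big-endian, and a Vec file starts with the 32-bit 
class id VEC_FILE_CLASSID (1211214) followed by the 32-bit entry count 
(assuming the default 32-bit binary indices). A standalone check along these 
lines (the file name is just an example):

#include <stdio.h>
#include <stdint.h>

/* read one big-endian 32-bit integer from the file */
static int32_t read_be32(FILE *f) {
  unsigned char b[4];
  if (fread(b, 1, 4, f) != 4) return -1;
  return (int32_t)((b[0] << 24) | (b[1] << 16) | (b[2] << 8) | b[3]);
}

int main(void) {
  FILE *f = fopen("my_3d0.bin", "rb");  /* hypothetical file name */
  if (!f) { perror("fopen"); return 1; }
  printf("class id: %d (expect 1211214 for a Vec)\n", read_be32(f));
  printf("entries : %d\n", read_be32(f));
  fclose(f);
  return 0;
}

If the class id in cluster 2's file is not 1211214, the header itself 
differs between the two builds, which would explain VecLoad failing with 
"Not a vector next in file".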

thanks
Sanjay

Code snippet from the parallel PETSc code that does the output:

if (time_int % (int)(1.0/DELTAT) == 0) { /* a smaller time step and more files output makes the CV estimate better */
  /* VTK output; this confirms that the simulation does something meaningful */
  sprintf(str, "my_2d%d.vts", file_Counter);
  PetscViewer viewer;
  PetscViewerCreate(PETSC_COMM_WORLD, &viewer);
  PetscViewerSetType(viewer, PETSCVIEWERVTK);
  PetscViewerFileSetName(viewer, str);
  VecView(u, viewer);
  PetscViewerDestroy(&viewer);

  /* binary output */
  sprintf(str, "my_3d%d.bin", (int)file_Counter);
  PetscViewer viewer2;
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, str, FILE_MODE_WRITE, &viewer2);
  VecView(u, viewer2);
  PetscViewerDestroy(&viewer2);
  file_Counter++;
}
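
As a cross-check on the slightly different file sizes, I can compare what 
this build should write against what lands on disk: an 8-byte header plus N 
values of sizeof(PetscScalar) bytes each, if I have the format right. A 
sketch, placed right after PetscViewerDestroy(&viewer2) above (it assumes a 
shared filesystem, and str still holds the .bin name at that point):

#include <sys/stat.h>

PetscInt    N;
struct stat st;
VecGetSize(u, &N);                 /* global length of u */
if (stat(str, &st) == 0) {
  PetscPrintf(PETSC_COMM_WORLD, "%s: %ld bytes on disk, %ld expected\n",
              str, (long)st.st_size, (long)(8 + (long)N*sizeof(PetscScalar)));
}

Running this on both clusters would show whether the size difference comes 
from the header or from the scalar data.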


How I am trying to read it (serial code; the executable is called ecg):

/* inputs are PETSc binary files; create and destroy the viewer for each file for simplicity */
PetscViewer viewer_in;
sprintf(str, "my_3d%d.bin", file_Counter);
PetscViewerBinaryOpen(PETSC_COMM_WORLD, str, FILE_MODE_READ, &viewer_in);
VecLoad(u, viewer_in);
PetscViewerDestroy(&viewer_in);
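
To separate a bad file from a layout mismatch, I can also try loading into a 
fresh, unsized Vec; as far as I know, VecLoad then takes the size from the 
file itself. If that works where loading into u fails, the file is readable 
and the problem is in how u is laid out; if it fails the same way, the header 
is not what this build expects. A sketch (hypothetical file name):

Vec         utest;
PetscViewer vtest;
PetscViewerBinaryOpen(PETSC_COMM_SELF, "my_3d0.bin", FILE_MODE_READ, &vtest);
VecCreate(PETSC_COMM_SELF, &utest);  /* no sizes set: VecLoad sizes it from the file */
VecLoad(utest, vtest);
PetscViewerDestroy(&vtest);
VecDestroy(&utest);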


Errors I got when I ran ecg:

login3 endo]$ ./ecg
[0]PETSC ERROR: --------------------- Error Message 
--------------------------------------------------------------
[0]PETSC ERROR: Invalid argument
[0]PETSC ERROR: Not a vector next in file
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.7.5, Jan, 01, 2017
[0]PETSC ERROR: ./ecg on a arch-linux2-c-opt named gra-login3 by kharches Tue 
Feb  5 17:43:36 2019
[0]PETSC ERROR: Configure options 
--prefix=/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/petsc/3.7.5
 --with-mkl_pardiso=1 
--with-mkl_pardiso-dir=/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/imkl/11.3.4.258/mkl
 --with-hdf5=1 
--with-hdf5-dir=/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/hdf5-mpi/1.8.18
 --download-hypre=1 --download-metis=1 --download-triangle=1 
--download-ptscotch=1 --download-superlu_dist=1 --download-ml=1 
--download-superlu=1 --download-prometheus=1 --download-mumps=1 
--download-parmetis=1 --download-suitesparse=1 --download-mumps-shared=0 
--download-ptscotch-shared=0 --download-superlu-shared=0 
--download-superlu_dist-shared=0 --download-parmetis-shared=0 
--download-metis-shared=0 --download-ml-shared=0 
--download-suitesparse-shared=0 --download-hypre-shared=0 
--download-prometheus-shared=0 --with-cc=mpicc --with-cxx=mpicxx 
--with-c++-support --with-fc=mpifort --CFLAGS="-O2 -xCore-AVX2 -ftz 
-fp-speculation=safe -fp-model source -fPIC" --CXXFLAGS="-O2 -xCore-AVX2 -ftz 
-fp-speculation=safe -fp-model source -fPIC" --FFLAGS="-O2 -xCore-AVX2 -ftz 
-fp-speculation=safe -fp-model source -fPIC" --with-gnu-compilers=0 
--with-mpi=1 --with-build-step-np=8 --with-shared-libraries=1 
--with-debugging=0 --with-pic=1 --with-x=0 --with-windows-graphics=0 
--with-scalapack=1 
--with-scalapack-include=/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/imkl/11.3.4.258/mkl/include
 
--with-scalapack-lib="[/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/imkl/11.3.4.258/mkl/lib/intel64/libmkl_scalapack_lp64.a,libmkl_blacs_openmpi_lp64.a,libmkl_intel_lp64.a,libmkl_sequential.a,libmkl_core.a]"
 
--with-blas-lapack-lib="[/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/imkl/11.3.4.258/mkl/lib/intel64/libmkl_intel_lp64.a,libmkl_sequential.a,libmkl_core.a]"
 --with-hdf5=1 
--with-hdf5-dir=/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/hdf5-mpi/1.8.18
 --with-fftw=1 
--with-fftw-dir=/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/fftw-mpi/3.3.6
[0]PETSC ERROR: #1 PetscViewerBinaryReadVecHeader_Private() line 28 in 
/dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/vec/vec/utils/vecio.c
[0]PETSC ERROR: #2 VecLoad_Binary() line 90 in 
/dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/vec/vec/utils/vecio.c
[0]PETSC ERROR: #3 VecLoad_Default() line 413 in 
/dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/vec/vec/utils/vecio.c
[0]PETSC ERROR: #4 VecLoad() line 975 in 
/dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/vec/vec/interface/vector.c
[0]PETSC ERROR: #5 VecLoad_Binary_DA() line 931 in 
/dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/dm/impls/da/gr2.c
[0]PETSC ERROR: #6 VecLoad_Default_DA() line 964 in 
/dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/dm/impls/da/gr2.c
[0]PETSC ERROR: #7 VecLoad() line 975 in 
/dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/vec/vec/interface/vector.c

