Yes, always use the binary file
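
For reference, the parallel load-and-solve side can then be as small as the
sketch below (C, untested; "linsys.bin" is just a placeholder name for the
binary file your sequential converter writes, and the solver/preconditioner
are left to the command line):

  #include <petscksp.h>

  int main(int argc,char **argv)
  {
    Mat            A;
    Vec            b,x;
    KSP            ksp;
    PetscViewer    viewer;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
    /* every rank opens the same binary file; MatLoad/VecLoad distribute the data */
    ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"linsys.bin",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
    ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
    ierr = MatSetFromOptions(A);CHKERRQ(ierr);     /* AIJ by default: SeqAIJ on 1 rank, MPIAIJ on more */
    ierr = MatLoad(A,viewer);CHKERRQ(ierr);
    ierr = VecCreate(PETSC_COMM_WORLD,&b);CHKERRQ(ierr);
    ierr = VecLoad(b,viewer);CHKERRQ(ierr);        /* rhs was saved right after the matrix */
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

    ierr = VecDuplicate(b,&x);CHKERRQ(ierr);
    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);   /* e.g. -ksp_type gcr -pc_type jacobi */
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);

    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
    ierr = VecDestroy(&x);CHKERRQ(ierr);
    ierr = VecDestroy(&b);CHKERRQ(ierr);
    ierr = MatDestroy(&A);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }

The same executable runs unchanged on any number of processes; MatLoad()
distributes the rows automatically, which is why the binary route avoids all
the ownership-range bookkeeping discussed below.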

> On Sep 27, 2016, at 3:13 PM, Manuel Valera <[email protected]> wrote:
> 
> Barry, thanks for your insight,
> 
> This standalone script must be translated into a much bigger model, which 
> uses AIJ matrices to define the Laplacian in the form of the three usual CSR 
> arrays; the ASCII files in the script take the place of those arrays, which 
> are passed to the solving routine in the model.
> 
> So, can I use the approach you mention to create the MPIAIJ matrix from the 
> PETSc binary file? Would that be a better solution than reading the three 
> arrays directly? In the model, even the smallest matrix is 10^5 x 10^5.
> 
> Thanks.
> 
> 
> On Tue, Sep 27, 2016 at 12:53 PM, Barry Smith <[email protected]> wrote:
> 
>   Are you loading a matrix from an ASCII file? If so don't do that. You 
> should write a simple sequential PETSc program that reads in the ASCII file 
> and saves the matrix as a PETSc binary file with MatView(). Then write your 
> parallel code that reads in the binary file with MatLoad() and solves the 
> system. You can read in the right hand side from ASCII and save it in the 
> binary file also. Trying to read an ASCII file in parallel and set it into a 
> PETSc parallel matrix is just a totally thankless task that is unnecessary.
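> 
>    In outline, that sequential converter is just the following (a sketch, 
> untested, with your 4x4 testcase matrix hard-coded where the ASCII reads of 
> ia/ja/a and the rhs would go; "linsys.bin" and the placeholder rhs of ones 
> are made up):
> 
>      #include <petscmat.h>
> 
>      int main(int argc,char **argv)
>      {
>        Mat            A;
>        Vec            b;
>        PetscViewer    viewer;
>        PetscInt       ia[] = {0,4,8,12,16};                      /* CSR row offsets */
>        PetscInt       ja[] = {0,1,2,3, 0,1,2,3, 0,1,2,3, 0,1,2,3}; /* column indices */
>        PetscScalar    a[]  = {1,2,4,3, 2,1,3,4, 4,3,1,2, 3,4,2,1}; /* values         */
>        PetscErrorCode ierr;
> 
>        ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
>        ierr = MatCreateSeqAIJWithArrays(PETSC_COMM_SELF,4,4,ia,ja,a,&A);CHKERRQ(ierr);
>        ierr = VecCreateSeq(PETSC_COMM_SELF,4,&b);CHKERRQ(ierr);
>        ierr = VecSet(b,1.0);CHKERRQ(ierr);            /* placeholder rhs            */
>        ierr = PetscViewerBinaryOpen(PETSC_COMM_SELF,"linsys.bin",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
>        ierr = MatView(A,viewer);CHKERRQ(ierr);        /* matrix first ...           */
>        ierr = VecView(b,viewer);CHKERRQ(ierr);        /* ... then the rhs, same file */
>        ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
>        ierr = MatDestroy(&A);CHKERRQ(ierr);
>        ierr = VecDestroy(&b);CHKERRQ(ierr);
>        ierr = PetscFinalize();
>        return ierr;
>      }
> 
>    Run it once, sequentially, and the parallel code never has to touch the 
> ASCII files again.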
> 
>    Barry
> 
> > On Sep 26, 2016, at 6:40 PM, Manuel Valera <[email protected]> wrote:
> >
> > Ok, the last output was from simulated multicores; on an actual cluster 
> > the errors are of this kind:
> >
> > [valera@cinci CSRMatrix]$ petsc -n 2 ./solvelinearmgPETSc
> >  TrivSoln loaded, size:            4 /           4
> >  TrivSoln loaded, size:            4 /           4
> >  RHS loaded, size:            4 /           4
> >  RHS loaded, size:            4 /           4
> > [0]PETSC ERROR: --------------------- Error Message 
> > --------------------------------------------------------------
> > [0]PETSC ERROR: Argument out of range
> > [0]PETSC ERROR: Comm must be of size 1
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
> > trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > [0]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message 
> > --------------------------------------------------------------
> > [1]PETSC ERROR: Argument out of range
> > [1]PETSC ERROR: Comm must be of size 1
> > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
> > trouble shooting.
> > [1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ 
> > --with-fc=gfortran --download-fblaslapack=1 --download-mpich
> > [1]PETSC ERROR: #1 MatCreate_SeqAIJ() line 3958 in 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
> > [1]PETSC ERROR: #2 MatSetType() line 94 in 
> > /home/valera/petsc-3.7.2/src/mat/interface/matreg.c
> > [1]PETSC ERROR: #3 MatCreateSeqAIJWithArrays() line 4300 in 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
> >  local size:           2
> >  local size:           2
> > Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran 
> > --download-fblaslapack=1 --download-mpich
> > [0]PETSC ERROR: #1 MatCreate_SeqAIJ() line 3958 in 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
> > [0]PETSC ERROR: #2 MatSetType() line 94 in 
> > /home/valera/petsc-3.7.2/src/mat/interface/matreg.c
> > [0]PETSC ERROR: #3 MatCreateSeqAIJWithArrays() line 4300 in 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
> > [0]PETSC ERROR: --------------------- Error Message 
> > --------------------------------------------------------------
> > [1]PETSC ERROR: --------------------- Error Message 
> > --------------------------------------------------------------
> > [1]PETSC ERROR: [0]PETSC ERROR: Nonconforming object sizes
> > [0]PETSC ERROR: Sum of local lengths 8 does not equal global length 4, my 
> > local length 4
> >   likely a call to VecSetSizes() or MatSetSizes() is wrong.
> > See http://www.mcs.anl.gov/petsc/documentation/faq.html#split
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
> > trouble shooting.
> > Nonconforming object sizes
> > [1]PETSC ERROR: Sum of local lengths 8 does not equal global length 4, my 
> > local length 4
> >   likely a call to VecSetSizes() or MatSetSizes() is wrong.
> > See http://www.mcs.anl.gov/petsc/documentation/faq.html#split
> > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
> > trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > [0]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > [1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ 
> > --with-fc=gfortran --download-fblaslapack=1 --download-mpich
> > [0]PETSC ERROR: #4 PetscSplitOwnership() line 93 in 
> > /home/valera/petsc-3.7.2/src/sys/utils/psplit.c
> > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ 
> > --with-fc=gfortran --download-fblaslapack=1 --download-mpich
> > [1]PETSC ERROR: #4 PetscSplitOwnership() line 93 in 
> > /home/valera/petsc-3.7.2/src/sys/utils/psplit.c
> > [0]PETSC ERROR: #5 PetscLayoutSetUp() line 143 in 
> > /home/valera/petsc-3.7.2/src/vec/is/utils/pmap.c
> > [0]PETSC ERROR: #6 MatMPIAIJSetPreallocation_MPIAIJ() line 2768 in 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > [1]PETSC ERROR: #5 PetscLayoutSetUp() line 143 in 
> > /home/valera/petsc-3.7.2/src/vec/is/utils/pmap.c
> > [1]PETSC ERROR: [0]PETSC ERROR: #7 MatMPIAIJSetPreallocation() line 3505 in 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > #6 MatMPIAIJSetPreallocation_MPIAIJ() line 2768 in 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > [1]PETSC ERROR: [0]PETSC ERROR: #8 MatSetUp_MPIAIJ() line 2153 in 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > #7 MatMPIAIJSetPreallocation() line 3505 in 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > [1]PETSC ERROR: #8 MatSetUp_MPIAIJ() line 2153 in 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > [0]PETSC ERROR: #9 MatSetUp() line 739 in 
> > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > [1]PETSC ERROR: #9 MatSetUp() line 739 in 
> > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > [0]PETSC ERROR: --------------------- Error Message 
> > --------------------------------------------------------------
> > [0]PETSC ERROR: Object is in wrong state
> > [0]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on 
> > argument 1 "mat" before MatSetNearNullSpace()
> > [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message 
> > --------------------------------------------------------------
> > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
> > trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > [0]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > Object is in wrong state
> > [1]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on 
> > argument 1 "mat" before MatSetNearNullSpace()
> > [1]PETSC ERROR: [0]PETSC ERROR: Configure options --with-cc=gcc 
> > --with-cxx=g++ --with-fc=gfortran --download-fblaslapack=1 --download-mpich
> > [0]PETSC ERROR: #10 MatSetNearNullSpace() line 8195 in 
> > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble 
> > shooting.
> > [1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ 
> > --with-fc=gfortran --download-fblaslapack=1 --download-mpich
> > [1]PETSC ERROR: #10 MatSetNearNullSpace() line 8195 in 
> > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > [0]PETSC ERROR: --------------------- Error Message 
> > --------------------------------------------------------------
> > [0]PETSC ERROR: Object is in wrong state
> > [1]PETSC ERROR: --------------------- Error Message 
> > --------------------------------------------------------------
> > [0]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on 
> > argument 1 "mat" before MatAssemblyBegin()
> > [0]PETSC ERROR: [1]PETSC ERROR: Object is in wrong state
> > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
> > trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > [0]PETSC ERROR: Must call MatXXXSetPreallocation() or MatSetUp() on 
> > argument 1 "mat" before MatAssemblyBegin()
> > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
> > trouble shooting.
> > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ 
> > --with-fc=gfortran --download-fblaslapack=1 --download-mpich
> > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > [1]PETSC ERROR: #11 MatAssemblyBegin() line 5093 in 
> > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran 
> > --download-fblaslapack=1 --download-mpich
> > [1]PETSC ERROR: #11 MatAssemblyBegin() line 5093 in 
> > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > [0]PETSC ERROR: 
> > ------------------------------------------------------------------------
> > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, 
> > probably memory access out of range
> > [1]PETSC ERROR: 
> > ------------------------------------------------------------------------
> > [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, 
> > probably memory access out of range
> > [1]PETSC ERROR: [0]PETSC ERROR: Try option -start_in_debugger or 
> > -on_error_attach_debugger
> > [0]PETSC ERROR: or see 
> > http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> > [1]PETSC ERROR: or see 
> > http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X 
> > to find memory corruption errors
> > or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory 
> > corruption errors
> > [0]PETSC ERROR: likely location of problem given in stack below
> > [0]PETSC ERROR: ---------------------  Stack Frames 
> > ------------------------------------
> > [1]PETSC ERROR: likely location of problem given in stack below
> > [1]PETSC ERROR: ---------------------  Stack Frames 
> > ------------------------------------
> > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> > [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> > [0]PETSC ERROR: [1]PETSC ERROR: Note: The EXACT line numbers in the stack 
> > are not available,
> > [1]PETSC ERROR:       INSTEAD the line number of the start of the function
> >       is given.
> > [0]PETSC ERROR: [0] MatAssemblyEnd line 5185 
> > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > [0]PETSC ERROR: [1]PETSC ERROR:       is given.
> > [1]PETSC ERROR: [1] MatAssemblyEnd line 5185 
> > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > [0] MatAssemblyBegin line 5090 
> > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > [0]PETSC ERROR: [0] MatSetNearNullSpace line 8191 
> > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > [0]PETSC ERROR: [1]PETSC ERROR: [1] MatAssemblyBegin line 5090 
> > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > [1]PETSC ERROR: [0] PetscSplitOwnership line 80 
> > /home/valera/petsc-3.7.2/src/sys/utils/psplit.c
> > [0]PETSC ERROR: [0] PetscLayoutSetUp line 129 
> > /home/valera/petsc-3.7.2/src/vec/is/utils/pmap.c
> > [0]PETSC ERROR: [0] MatMPIAIJSetPreallocation_MPIAIJ line 2767 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > [1] MatSetNearNullSpace line 8191 
> > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > [1]PETSC ERROR: [1] PetscSplitOwnership line 80 
> > /home/valera/petsc-3.7.2/src/sys/utils/psplit.c
> > [1]PETSC ERROR: [0]PETSC ERROR: [0] MatMPIAIJSetPreallocation line 3502 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > [0]PETSC ERROR: [0] MatSetUp_MPIAIJ line 2152 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > [1] PetscLayoutSetUp line 129 
> > /home/valera/petsc-3.7.2/src/vec/is/utils/pmap.c
> > [1]PETSC ERROR: [1] MatMPIAIJSetPreallocation_MPIAIJ line 2767 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > [0]PETSC ERROR: [0] MatSetUp line 727 
> > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > [0]PETSC ERROR: [0] MatCreate_SeqAIJ line 3956 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
> > [1]PETSC ERROR: [1] MatMPIAIJSetPreallocation line 3502 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > [1]PETSC ERROR: [1] MatSetUp_MPIAIJ line 2152 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > [0]PETSC ERROR: [0] MatSetType line 44 
> > /home/valera/petsc-3.7.2/src/mat/interface/matreg.c
> > [0]PETSC ERROR: [0] MatCreateSeqAIJWithArrays line 4295 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
> > [1]PETSC ERROR: [1] MatSetUp line 727 
> > /home/valera/petsc-3.7.2/src/mat/interface/matrix.c
> > [1]PETSC ERROR: [1] MatCreate_SeqAIJ line 3956 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
> > [0]PETSC ERROR: --------------------- Error Message 
> > --------------------------------------------------------------
> > [0]PETSC ERROR: Signal received
> > [1]PETSC ERROR: [1] MatSetType line 44 
> > /home/valera/petsc-3.7.2/src/mat/interface/matreg.c
> > [1]PETSC ERROR: [1] MatCreateSeqAIJWithArrays line 4295 
> > /home/valera/petsc-3.7.2/src/mat/impls/aij/seq/aij.c
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
> > trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
> > [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message 
> > --------------------------------------------------------------
> > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ 
> > --with-fc=gfortran --download-fblaslapack=1 --download-mpich
> > [0]PETSC ERROR: Signal received
> > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
> > trouble shooting.
> > [1]PETSC ERROR: #12 User provided function() line 0 in  unknown file
> > Petsc Release Version 3.7.2, Jun, 05, 2016
> > [1]PETSC ERROR: ./solvelinearmgPETSc P on a arch-linux2-c-debug named cinci by valera Mon Sep 26 16:39:02 2016
> > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ 
> > --with-fc=gfortran --download-fblaslapack=1 --download-mpich
> > [1]PETSC ERROR: #12 User provided function() line 0 in  unknown file
> > application called MPI_Abort(comm=0x84000004, 59) - process 0
> > [cli_0]: aborting job:
> > application called MPI_Abort(comm=0x84000004, 59) - process 0
> > application called MPI_Abort(comm=0x84000002, 59) - process 1
> > [cli_1]: aborting job:
> > application called MPI_Abort(comm=0x84000002, 59) - process 1
> >
> > ===================================================================================
> > =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> > =   PID 10266 RUNNING AT cinci
> > =   EXIT CODE: 59
> > =   CLEANING UP REMAINING PROCESSES
> > =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
> > ===================================================================================
> >
> >
> > On Mon, Sep 26, 2016 at 3:51 PM, Manuel Valera <[email protected]> 
> > wrote:
> > Ok, I created a tiny testcase just for this.
> >
> > The output from the n# calls is as follows:
> >
> > n1:
> > Mat Object: 1 MPI processes
> >   type: mpiaij
> > row 0: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> > row 1: (0, 2.)  (1, 1.)  (2, 3.)  (3, 4.)
> > row 2: (0, 4.)  (1, 3.)  (2, 1.)  (3, 2.)
> > row 3: (0, 3.)  (1, 4.)  (2, 2.)  (3, 1.)
> >
> > n2:
> > Mat Object: 2 MPI processes
> >   type: mpiaij
> > row 0: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> > row 1: (0, 2.)  (1, 1.)  (2, 3.)  (3, 4.)
> > row 2: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> > row 3: (0, 2.)  (1, 1.)  (2, 3.)  (3, 4.)
> >
> > n4:
> > Mat Object: 4 MPI processes
> >   type: mpiaij
> > row 0: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> > row 1: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> > row 2: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> > row 3: (0, 1.)  (1, 2.)  (2, 4.)  (3, 3.)
> >
> >
> >
> > It really gets messed up; I have no idea what's happening.
> >
> >
> >
> >
> > On Mon, Sep 26, 2016 at 3:12 PM, Barry Smith <[email protected]> wrote:
> >
> > > On Sep 26, 2016, at 5:07 PM, Manuel Valera <[email protected]> wrote:
> > >
> > > Ok, I was using a big matrix before. From a smaller testcase I got the 
> > > output and, effectively, it looks like the matrix is not read correctly 
> > > at all. Results are attached for the DRAW viewer; the output is too big 
> > > for STDOUT even in the small testcase. n# is the number of processors 
> > > requested.
> >
> >    You need to construct a very small test case so you can determine why 
> > the values do not end up where you expect them. There is no way around it.
> > >
> > > Is there a way to create the matrix on one node and then distribute it 
> > > as needed to the rest? Maybe that would work.
> >
> >    No, that is not scalable. You become limited by the memory of the one node.
> >
> > >
> > > Thanks
> > >
> > > On Mon, Sep 26, 2016 at 2:40 PM, Barry Smith <[email protected]> wrote:
> > >
> > >     How large is the matrix? It will take a very long time if the matrix 
> > > is large. Debug with a very small matrix.
> > >
> > >   Barry
> > >
> > > > On Sep 26, 2016, at 4:34 PM, Manuel Valera <[email protected]> 
> > > > wrote:
> > > >
> > > > Indeed there is something wrong with that call; it hangs indefinitely, 
> > > > showing only:
> > > >
> > > >  Mat Object: 1 MPI processes
> > > >   type: mpiaij
> > > >
> > > > It draws my attention that this program works for 1 processor but not 
> > > > for more, yet it doesn't show anything for that viewer in either case.
> > > >
> > > > Thanks for the insight on the redundant calls; it is not very clear 
> > > > from the documentation which calls are already implied by others.
> > > >
> > > >
> > > >
> > > > On Mon, Sep 26, 2016 at 2:02 PM, Barry Smith <[email protected]> wrote:
> > > >
> > > >    The call to MatCreateMPIAIJWithArrays() is likely interpreting the 
> > > > values you pass in differently than you expect.
> > > >
> > > >     Put a call to MatView(Ap,PETSC_VIEWER_STDOUT_WORLD,ierr) after the 
> > > > MatCreateMPIAIJWithArrays() to see what PETSc thinks the matrix is.
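> > > >
> > > >     For reference, the calling convention is per-process: each rank 
> > > > passes only its own block of rows in local CSR form. A sketch of the 
> > > > arguments only (declarations of Ap and error checking omitted; variable 
> > > > names made up), for a 4x4 matrix split over two processes, on rank 0:
> > > >
> > > >       PetscInt    m = 2, n = PETSC_DECIDE, M = 4, N = 4; /* 2 local rows of a 4x4 matrix */
> > > >       PetscInt    i[] = {0,4,8};                         /* offsets for the 2 local rows */
> > > >       PetscInt    j[] = {0,1,2,3, 0,1,2,3};              /* global column indices        */
> > > >       PetscScalar a[] = {1,2,4,3, 2,1,3,4};              /* values of rows 0 and 1 only  */
> > > >
> > > >       MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD,m,n,M,N,i,j,a,&Ap);
> > > >
> > > >     Rank 1 passes the same shapes filled with the entries of its own two 
> > > > rows. The i/j/a arrays describe only the rows each rank owns, never the 
> > > > whole matrix.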
> > > >
> > > >
> > > > > On Sep 26, 2016, at 3:42 PM, Manuel Valera <[email protected]> 
> > > > > wrote:
> > > > >
> > > > > Hello,
> > > > >
> > > > > I'm working on solving a linear system in parallel. Following ex12 
> > > > > of the KSP tutorials, I don't see any major complication in doing so, 
> > > > > so starting from a working linear system solver with PCJACOBI and 
> > > > > KSPGCR I made only the following changes:
> > > > >
> > > > >    call MatCreate(PETSC_COMM_WORLD,Ap,ierr)
> > > > > !  call MatSetType(Ap,MATSEQAIJ,ierr)
> > > > >   call MatSetType(Ap,MATMPIAIJ,ierr) !paralellization
> > > > >
> > > > >   call MatSetSizes(Ap,PETSC_DECIDE,PETSC_DECIDE,nbdp,nbdp,ierr);
> > > > >
> > > > > !  call MatSeqAIJSetPreallocationCSR(Ap,iapi,japi,app,ierr)
> > > > >   call MatSetFromOptions(Ap,ierr)
> > > >
> > > >     Note that none of the lines above are needed (or do anything), 
> > > > because MatCreateMPIAIJWithArrays() creates the matrix from scratch 
> > > > itself.
> > > >
> > > >    Barry
> > > >
> > > > > !  call 
> > > > > MatCreateSeqAIJWithArrays(PETSC_COMM_WORLD,nbdp,nbdp,iapi,japi,app,Ap,ierr)
> > > > >  call 
> > > > > MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD,floor(real(nbdp)/sizel),PETSC_DECIDE,nbdp,nbdp,iapi,japi,app,Ap,ierr)
> > > > >
> > > > >
> > > > > I grayed out the changes from the sequential implementation.
> > > > >
> > > > > So, it does not complain at runtime until it reaches KSPSolve(), with 
> > > > > the following error:
> > > > >
> > > > >
> > > > > [1]PETSC ERROR: --------------------- Error Message 
> > > > > --------------------------------------------------------------
> > > > > [1]PETSC ERROR: Object is in wrong state
> > > > > [1]PETSC ERROR: Matrix is missing diagonal entry 0
> > > > > [1]PETSC ERROR: See 
> > > > > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble 
> > > > > shooting.
> > > > > [1]PETSC ERROR: Petsc Release Version 3.7.3, unknown
> > > > > [1]PETSC ERROR: ./solvelinearmgPETSc on a arch-linux2-c-debug named valera-HP-xw4600-Workstation by valera Mon Sep 26 13:35:15 2016
> > > > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ 
> > > > > --with-fc=gfortran --download-fblaslapack=1 --download-mpich=1 
> > > > > --download-ml=1
> > > > > [1]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1733 in 
> > > > > /home/valera/v5PETSc/petsc/petsc/src/mat/impls/aij/seq/aijfact.c
> > > > > [1]PETSC ERROR: #2 MatILUFactorSymbolic() line 6579 in 
> > > > > /home/valera/v5PETSc/petsc/petsc/src/mat/interface/matrix.c
> > > > > [1]PETSC ERROR: #3 PCSetUp_ILU() line 212 in 
> > > > > /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/impls/factor/ilu/ilu.c
> > > > > [1]PETSC ERROR: #4 PCSetUp() line 968 in 
> > > > > /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/interface/precon.c
> > > > > [1]PETSC ERROR: #5 KSPSetUp() line 390 in 
> > > > > /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c
> > > > > [1]PETSC ERROR: #6 PCSetUpOnBlocks_BJacobi_Singleblock() line 650 in 
> > > > > /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/impls/bjacobi/bjacobi.c
> > > > > [1]PETSC ERROR: #7 PCSetUpOnBlocks() line 1001 in 
> > > > > /home/valera/v5PETSc/petsc/petsc/src/ksp/pc/interface/precon.c
> > > > > [1]PETSC ERROR: #8 KSPSetUpOnBlocks() line 220 in 
> > > > > /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c
> > > > > [1]PETSC ERROR: #9 KSPSolve() line 600 in 
> > > > > /home/valera/v5PETSc/petsc/petsc/src/ksp/ksp/interface/itfunc.c
> > > > > At line 333 of file solvelinearmgPETSc.f90
> > > > > Fortran runtime error: Array bound mismatch for dimension 1 of array 
> > > > > 'sol' (213120/106560)
> > > > >
> > > > >
> > > > > This code works with -n 1 (one core), but it gives this error when 
> > > > > using more than one core.
> > > > >
> > > > > What am I missing?
> > > > >
> > > > > Regards,
> > > > >
> > > > > Manuel.
> > > > >
> > > > > <solvelinearmgPETSc.f90>
> > > >
> > > >
> > >
> > >
> > > <n4.png><n2.png><n1.png>
> >
> >
> >
> > <rhss.txt><solvelinearmgPETSc.f90><as.txt><ias.txt><jas.txt>
> 
> 
