On Tuesday, June 6, 2017 at 2:13:28 AM UTC-7, Weixiong Zheng wrote:
>
> Dear All,
>
> I am doing a radiation transport problem which involves multiple individual 
> equations (their number determined by the number of directions we input). As the 
> number of directions is arbitrary, I store the sparse matrices in a std::vector 
> of raw pointers (using the LA namespace to stand for PETScWrappers):
>
> std::vector<LA::MPI::SparseMatrix*> sys_mats;
> When initializing:
> for (unsigned int i=0; i<n_dir; ++i)
> {
>
Sorry for the typo; I actually meant:
sys_mats.push_back (new LA::MPI::SparseMatrix);
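
For completeness, the corrected initialization would then read roughly as follows. This is only a sketch: it assumes LA is an alias such as namespace LA = dealii::PETScWrappers; and that locally_owned_dofs, dsp, and mpi_communicator are set up as usual for a distributed matrix:

// One PETSc matrix per direction, all reinit-ed with the same
// locally owned index sets and the same sparsity pattern dsp.
for (unsigned int i=0; i<n_dir; ++i)
  {
    sys_mats.push_back (new LA::MPI::SparseMatrix);
    sys_mats[i]->reinit (locally_owned_dofs,
                         locally_owned_dofs,
                         dsp,
                         mpi_communicator);
  }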
 

>
>   sys_mats.push_back (new LA::MPI::SparseMatrix*);
>   sys_mats[i]->reinit (locally_owned_dofs,
>                        locally_owned_dofs,
>                        dsp,
>                        mpi_communicator);
> }
>
> When assembling:
> for (; cell!=endc; ++cell)
> {
>   if (cell->is_locally_owned())
>   {
>     std::vector<FullMatrix<double> > local_mats;
>     // do local matrix and boundary term assembly for all directions in 
> one cell
>     cell->get_dof_indices (local_dof_indices);
>
>     // Map local matrices to global matrices for all directions
>     for (unsigned int i=0; i<n_dir; ++i)
>     {
>       sys_mats[i]->add (local_dof_indices,
>                         local_mats[i]);
>     }
>   }
> }
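
To spell out what the assembly comment above stands for: on each cell, local_mats holds one dense FullMatrix per direction, sized by the number of dofs per cell, and is filled before the add() calls. Roughly something like the following sketch; fe_values, dofs_per_cell, n_q_points, and direction_coefficient are placeholders here, not my actual assembly code:

// One local matrix per direction, each dofs_per_cell x dofs_per_cell.
std::vector<FullMatrix<double> > local_mats (n_dir,
                                             FullMatrix<double> (dofs_per_cell,
                                                                 dofs_per_cell));
fe_values.reinit (cell);
for (unsigned int i=0; i<n_dir; ++i)
  for (unsigned int q=0; q<n_q_points; ++q)
    for (unsigned int k=0; k<dofs_per_cell; ++k)
      for (unsigned int j=0; j<dofs_per_cell; ++j)
        // direction_coefficient stands in for the direction-dependent
        // streaming/collision terms of the actual bilinear form.
        local_mats[i](k,j) += direction_coefficient (i, q, k, j) * fe_values.JxW (q);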
>
> Then we compress:
> for (unsigned int i=0; i<n_dir; ++i)
> {
>   sys_mats[i]->compress (VectorOperation::add);
>   pcout << "we have compressed Dir " << i << std::endl;
> }
>
> Now I get the error shown at the end of this post: compress() goes through 
> direction 0 but fails on direction 1.
> Any ideas how to fix it?
>
> Thanks in advance!
> Weixiong
>
> [0]PETSC ERROR: --------------------- Error Message 
> --------------------------------------------------------------
>
> [0]PETSC ERROR: Argument out of range
>
> [0]PETSC ERROR: Inserting a new nonzero at global row/column (6, 15) into 
> matrix
>
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
> for trouble shooting.
>
> [0]PETSC ERROR: Petsc Release Version 3.6.3, unknown 
>
>
> [0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 613 in 
> /usr/local/src/d2-parallel-package/petsc/src/mat/impls/aij/mpi/mpiaij.c
>
> [0]PETSC ERROR: #2 MatAssemblyEnd_MPIAIJ() line 731 in 
> /usr/local/src/d2-parallel-package/petsc/src/mat/impls/aij/mpi/mpiaij.c
>
> [0]PETSC ERROR: #3 MatAssemblyEnd() line 5098 in 
> /usr/local/src/d2-parallel-package/petsc/src/mat/interface/matrix.c
>
> ERROR: Uncaught exception in MPI_InitFinalize on proc 0. Skipping 
> MPI_Finalize() to avoid a deadlock.
>
> ----------------------------------------------------
>
> Exception on processing: 
>
>
> --------------------------------------------------------
>
> An error occurred in line <263> of file 
> <../source/lac/petsc_matrix_base.cc> in function
>
>     void dealii::PETScWrappers::MatrixBase::compress(const 
> VectorOperation::values)
>
> The violated condition was: 
>
>     ierr == 0
>
> The name and call sequence of the exception was:
>
>     ExcPETScError(ierr)
>
> Additional Information: 
>
> An error with error number 63 occurred while calling a PETSc function
>
> --------------------------------------------------------
>
>
> Aborting!
>
> ----------------------------------------------------
>
> -------------------------------------------------------
>
> Primary job  terminated normally, but 1 process returned
>
> a non-zero exit code.. Per user-direction, the job has been aborted.
>
> -------------------------------------------------------
>
> --------------------------------------------------------------------------
>
> MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD 
>
> with errorcode 59.
>
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
>
> You may or may not see output from other processes, depending on
>
> exactly when Open MPI kills them.
>
> --------------------------------------------------------------------------
>
> [1]PETSC ERROR: 
> ------------------------------------------------------------------------
>
> [1]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the 
> batch system) has told this process to end
>
> [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>
> [1]PETSC ERROR: or see 
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>
> [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS 
> X to find memory corruption errors
>
> [1]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and 
> run 
>
> [1]PETSC ERROR: to get more information on the crash.
>
> [1]PETSC ERROR: --------------------- Error Message 
> --------------------------------------------------------------
>
> [1]PETSC ERROR: Signal received
>
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
> for trouble shooting.
>
> [1]PETSC ERROR: Petsc Release Version 3.6.3, unknown 
>
> [1]PETSC ERROR: dg-ep-proto on a uly named my_hostname by GrillCheese Tue 
> Jun  6 01:18:23 2017
>
> [1]PETSC ERROR: Configure options --with-make-np=8 --with-debugging=0 
> --prefix=/Applications/deal.II.app/Contents/Resources/opt/petsc-3e25e16 
> --with-mpi-dir=/Applications/deal.II.app/Contents/Resources/opt/openmpi-1.10.2
>  
> --with-sundials-dir=/Applications/deal.II.app/Contents/Resources/opt/sundials-6b7e4b6
>  
> --with-shared-libraries 
> --with-external-packages-dir=/Users/heltai/d2-parallel-package/petsc-external 
> --download-parmetis --download-metis --download-hypre --download-mumps 
> --download-scalapack --download-superlu --download-superlu_dist 
> --download-hdf5
>
> [1]PETSC ERROR: #1 User provided function() line 0 in  unknown file
>
> --------------------------------------------------------------------------
>
> mpirun detected that one or more processes exited with non-zero status, 
> thus causing
>
> the job to be terminated. The first process to do so was:
>
>
>   Process name: [[50193,1],0]
>
>   Exit code:    1
>
> --------------------------------------------------------------------------
>
