Hello Jean-Paul,

Thanks for the reply.
I understand your concern about the lack of detailed information, but this is 
all I have.

The error occurs after around 900 iterations of solving 5 coupled equations, 
so it is almost impossible to create a minimal working example as a demo... 
:)

I will try reinstalling all the libraries (PETSc, p4est, deal.II).

Could you instead help me with installing PETSc?
Of the two commands below, which one should be used to install PETSc?

Command 1:

./configure --with-x=0 --download-hypre=1 --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --download-scalapack --download-parmetis --download-metis --download-mumps

Command 2:

./config/configure.py --with-shared=1 --with-x=0 --with-mpi=1 --download-hypre=1 --with-cc=gcc --with-cxx=g++ --download-f2cblaslapack --download-mpich --download-scalapack --download-mumps --download-parmetis --download-metis
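
For reference, a combination of the two (simply the union of their flags, with --with-shared written as --with-shared-libraries, which I believe is the current name of that option in recent PETSc releases) would look like the line below. Would something along these lines be reasonable for use with deal.II?

./configure --with-shared-libraries=1 --with-x=0 --with-mpi=1 --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --download-f2cblaslapack --download-hypre=1 --download-scalapack --download-mumps --download-parmetis --download-metis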


On Tuesday, July 12, 2016 at 1:08:37 AM UTC-4, Jean-Paul Pelteret wrote:
>
> Dear Rajat,
>
> Unfortunately your question does not provide sufficient information for 
> anyone to be able to help you. Although the error is in the solver, the 
> source of the error could come from one of several places. Please see this 
> post <https://groups.google.com/forum/#!topic/dealii/GRZMUTLIm2I> for 
> more information on this point. It would be best to provide a minimal 
> working example that replicates the issue. Also, note that you're using 
> quite an old version of deal.II, and the root of the problem may have 
> already been fixed in a later version.
>
> Regards,
> Jean-Paul
>
> On Tuesday, July 12, 2016 at 3:38:54 AM UTC+2, RAJAT ARORA wrote:
>>
>> Hello all,
>>
>> I have recently been encountering an issue with the MUMPS solver.
>>
>> The error comes from the dealii::PETScWrappers::SparseDirectMUMPS class, 
>> in these lines:
>>
>>
>> dealii::PETScWrappers::SparseDirectMUMPS solver(solver_control,
>>                                                 mpi_communicator);
>>
>> solver.solve(system_matrix,
>>              locally_owned_solution,
>>              system_rhs); // error while solving
>>
>>
>> The error that is printed is
>>
>>   ERR: ERROR : NBROWS > NBROWF
>>   ERR: INODE =       15525
>>   ERR: NBROW=           1 NBROWF=           0
>>   ERR: ROW_LIST=           2
>> application called MPI_Abort(MPI_COMM_WORLD, -99) - process 0
>> [cli_0]: aborting job:
>> application called MPI_Abort(MPI_COMM_WORLD, -99) - process 0
>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
>> [cli_0]: aborting job:
>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
>>
>>
>> My code uses MPI-based parallelism, so it uses 
>> parallel::distributed::Triangulation.
>>
>> I am using deal.II 8.3.0, PETSc 3.6.0, and p4est 1.1.
>>
>> Any help will be appreciated.
>> Thanks.
>>
>

