Hello all,

I have recently been encountering an issue with the MUMPS solver.

The error comes from the dealii::PETScWrappers::SparseDirectMUMPS class, in 
these lines:


dealii::PETScWrappers::SparseDirectMUMPS solver (solver_control,
                                                 mpi_communicator);

solver.solve (system_matrix,
              locally_owned_solution,
              system_rhs);   // <-- the error occurs in this call


The error that is printed is:

  ERR: ERROR : NBROWS > NBROWF
  ERR: INODE =       15525
  ERR: NBROW=           1 NBROWF=           0
  ERR: ROW_LIST=           2
application called MPI_Abort(MPI_COMM_WORLD, -99) - process 0
[cli_0]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -99) - process 0
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
[cli_0]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0


My code uses MPI-based parallelism, so it uses a 
parallel::distributed::Triangulation.
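
To make the setting concrete, here is a minimal step-40-style sketch of this 
kind of setup, not my actual code (the dimension and the SolverControl values 
are placeholders, and DoF distribution and assembly are omitted):

#include <deal.II/base/mpi.h>
#include <deal.II/distributed/tria.h>
#include <deal.II/lac/petsc_parallel_sparse_matrix.h>
#include <deal.II/lac/petsc_parallel_vector.h>
#include <deal.II/lac/petsc_solver.h>
#include <deal.II/lac/solver_control.h>

using namespace dealii;

int main (int argc, char **argv)
{
  Utilities::MPI::MPI_InitFinalize mpi_initialization (argc, argv);
  MPI_Comm mpi_communicator = MPI_COMM_WORLD;

  // distributed mesh
  parallel::distributed::Triangulation<3> triangulation (mpi_communicator);

  // matrix and vectors are the PETSc MPI wrappers, partitioned over the
  // locally owned degrees of freedom
  PETScWrappers::MPI::SparseMatrix system_matrix;
  PETScWrappers::MPI::Vector       locally_owned_solution;
  PETScWrappers::MPI::Vector       system_rhs;

  // ... DoF distribution, reinit of matrix/vectors, and assembly over
  // locally owned cells happen here in the real code ...

  // SolverControl is required by the interface; the values are placeholders
  SolverControl solver_control (1, 1e-12);
  PETScWrappers::SparseDirectMUMPS solver (solver_control, mpi_communicator);
  solver.solve (system_matrix, locally_owned_solution, system_rhs);
}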

I am using deal.II 8.3.0, PETSc 3.6.0, and p4est 1.1.

Any help would be appreciated.
Thanks.
