Hi,

I encountered crashes (segfaults, or PETSc errors 76 or 77) when using
hypre with the OpenMPI 1.4.3 shipped with Ubuntu Precise. Recompiling
the whole stack of libraries against OpenMPI 1.6.5 solved the issue.
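
For reference, here is what the two PETSc error codes mentioned above
mean. This is a small sketch; the code names and numbers are taken from
PETSc's include/petscerror.h (assuming the PETSc 3.x series), and the
describe() helper is just illustrative:

```python
# Meaning of the PETSc error codes seen in these crashes, as defined
# in PETSc's include/petscerror.h (assumption: PETSc 3.x series).
PETSC_ERROR_CODES = {
    76: "PETSC_ERR_LIB: error in an external library (such as hypre)",
    77: "PETSC_ERR_PLIB: PETSc generated inconsistent data",
}

def describe(code):
    """Return a human-readable description for a PETSc error code."""
    return PETSC_ERROR_CODES.get(code, "unknown error code")

print(describe(76))
```

Note that code 76 explicitly points at an external library, which is
why hypre is the prime suspect here.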

Try switching to another preconditioner to check the hypothesis.

Jan


On Wed, 26 Feb 2014 12:15:35 +0100
Heinz Zorn <[email protected]> wrote:

> Hello everybody,
> 
> I have got a problem with the attached code when it is run in
> parallel using mpirun. The more processes I use, the more often it
> crashes with the message:
> 
> Traceback (most recent call last):
>     File "test.py", line 32, in <module>
>       solver.solve()
> RuntimeError:
> 
> *** -------------------------------------------------------------------------
> *** DOLFIN encountered an error. If you are not able to resolve this issue
> *** using the information listed below, you can ask for help at
> ***
> ***     [email protected]
> ***
> *** Remember to include the error message listed below and, if possible,
> *** include a *minimal* running example to reproduce the error.
> *** -------------------------------------------------------------------------
> *** Error:   Unable to successfully call PETSc function 'KSPSolve'.
> *** Reason:  PETSc error code is: 76.
> *** Where:   This error was encountered inside /build/buildd/dolfin-1.3.0+dfsg/dolfin/la/PETScKrylovSolver.cpp.
> *** Process: 11
> ***
> *** DOLFIN version: 1.3.0
> *** Git changeset:  unknown
> *** -------------------------------------------------------------------------
> Using only a few processes, the program terminates properly, but the
> results are obviously not correct. The installation is the PPA
> installation on a compute server running Ubuntu 13.10 server. It
> seems that commenting out line 8 and using the Expression to define
> the boundary condition solves the problem.
> 
> Please tell me if any further information is needed or if I should
> post this problem anywhere else.
> 
> Thanks in advance,
> Heinz Zorn
> 

_______________________________________________
fenics mailing list
[email protected]
http://fenicsproject.org/mailman/listinfo/fenics