On 4/15/10 11:27 AM, Roy Stogner wrote:
> What version and configuration of PETSc?

PETSc 3.0.0-p9 running on OS X 10.5 using gcc 4.4.2 and OpenMPI 1.3.3.

PETSc was compiled at optimization level -O3 using --with-debugging=0.

>> However, this is hard to reproduce.
>
> Yes and no. I can't reproduce it with every version of PETSc, but I
> *can* reproduce it with ex21 of libMesh, even in debugging mode. I'd
> been hoping that there was just some problem with one of my PETSc
> builds. But if you're seeing the same bug then odds are it's a bug in
> libMesh and it's just only triggering segfaults in some PETSc builds.

I should have been more explicit.  I always seem to get an error at 
approximately the same point in the computation.  It is usually a 
segfault, but occasionally it is a memory corruption error detected by 
PETSc.  Which of the two occurs appears to be random, although the 
segfault is much more common than the memory corruption error.

>> PS: Tomorrow I am going to work on getting this running with valgrind...
>
> I'd appreciate it. Also if you do manage to reproduce the problem in
> a simple test case, let me know. ex21 is currently hard-coded to use
> a 3D L-shaped domain, the error doesn't occur until after a few
> refinement steps, and by that time the size of the send_list (where
> the problem most likely is) is up in the thousands and hard to
> examine manually.
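
For reference, this is roughly how I plan to run it.  A sketch only: the 
process count, valgrind options, and the path to ex21 are placeholders, 
and this assumes OpenMPI's mpirun and valgrind are on the PATH.

```shell
# Run each MPI rank under valgrind's memcheck tool; --log-file with %p
# writes one log per process so the ranks' reports don't interleave.
mpirun -np 2 valgrind --tool=memcheck --track-origins=yes \
    --log-file=valgrind.%p.log ./ex21
```

With debugging-enabled builds of libMesh and PETSc the logs should point 
at the first invalid read/write rather than the eventual segfault.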

Where is the relevant send_list --- within EquationSystems::reinit()?

This error is cropping up after several refinement steps for me too 
(usually many), but the total number of nodes in the problem is not too 
huge, so it might be a little easier to diagnose.

-- Boyce

_______________________________________________
Libmesh-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/libmesh-users
