Hi,
Yes! Sorry that was a silly mistake of mine in the cmake file; I was
pointing to the wrong MPI. It looks like everything is working now.
Thanks for all your help!
Andrew
On Thu, Jun 13, 2013 at 2:20 PM, Paul T. Bauman wrote:
> ORTE is OpenMPI stuff I think. Make sure you're running the same MPI that
> you used to build libMesh and PETSc.
ORTE is OpenMPI stuff I think. Make sure you're running the same MPI that
you used to build libMesh and PETSc.
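One quick sanity check, assuming your MPI is new enough to provide
MPI_Get_library_version (MPI-3): build the snippet below with the same mpicxx
you used for libMesh and PETSc and run it under your usual mpirun, and make
sure the reported library is the one you expect. This is only a rough sketch,
not anything from your build.

    // mpi_check.cpp - print which MPI library this binary actually links
    // against, so it can be compared with the mpirun/mpiexec on the PATH.
    #include <mpi.h>
    #include <cstdio>

    int main(int argc, char **argv)
    {
      MPI_Init(&argc, &argv);

      char version[MPI_MAX_LIBRARY_VERSION_STRING];
      int len = 0;
      MPI_Get_library_version(version, &len);  // MPI-3 and later

      int rank = 0;
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      if (rank == 0)
        std::printf("MPI library: %s\n", version);

      MPI_Finalize();
      return 0;
    }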
On Thu, Jun 13, 2013 at 3:48 PM, Andrew Davis wrote:
> Hi,
>
> Sorry for the delay getting back to you on this. I think I managed to
> solve some of my issues but haven't quite figured it out exactly.
Hi,
Sorry for the delay getting back to you on this. I think I managed to
solve some of my issues but haven't quite figured it out exactly. I think
this is at least a different issue than before.
Previously, I had installed MPI with Homebrew which was apparently
installing it in some way libMes
Andrew Davis writes:
> This does look like some kind of MPI issue. Just to be safe I tried
> reinstalling MPI and then reconfiguring and recompiling PETSc and libMesh.
Exactly which options are you configuring with? This message makes it
sound like you have an MPI-enabled PETSc and a libMesh configu
Is this the complete error message?
Can you run with -on_error_attach_debugger and send the backtrace
along with petsc's configure.log?
Dmitry.
On Thu, Jun 6, 2013 at 3:19 PM, Roy Stogner wrote:
>
> On Thu, 6 Jun 2013, Andrew Davis wrote:
>
>> I create a libMesh::EquationSystems with a
>> libMesh::NonlinearImplicitSystem attached.
Hi,
> Is there an error type? This looks like just a stack trace, but
> doesn't PETSc also let you know what assertion or signal killed you?
Yes, I agree this is just a stack trace, but it is the only thing that
prints when the program crashes.
> We've got libMesh with --disable-mpi passing contin
On Thu, 6 Jun 2013, Andrew Davis wrote:
> I create a libMesh::EquationSystems with a
> libMesh::NonlinearImplicitSystem attached. However, when I run
> EquationSystems::Init() I get the runtime error
Is there an error type? This looks like just a stack trace, but
doesn't PETSc also let you know what assertion or signal killed you?
Thanks! This makes sense. I recompiled PETSc with the option --with-mpi=0
and then installed libMesh with --disable-mpi. I'm no longer having
trouble initializing libMesh! However, I am still not quite up and
running.
I create a libMesh::EquationSystems with a libMesh::NonlinearImplicitSystem
attached.
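For concreteness, a stripped-down version of that setup looks roughly like
this with a recent libMesh (the mesh generation and the system name below are
placeholders, not my actual application):

    #include "libmesh/libmesh.h"
    #include "libmesh/mesh.h"
    #include "libmesh/mesh_generation.h"
    #include "libmesh/equation_systems.h"
    #include "libmesh/nonlinear_implicit_system.h"
    #include "libmesh/enum_elem_type.h"

    int main (int argc, char ** argv)
    {
      // Initializes libMesh and, when enabled, PETSc/MPI underneath.
      libMesh::LibMeshInit init (argc, argv);

      libMesh::Mesh mesh (init.comm());
      libMesh::MeshTools::Generation::build_square (mesh, 10, 10,
                                                    0., 1., 0., 1.,
                                                    libMesh::QUAD4);

      libMesh::EquationSystems equation_systems (mesh);
      equation_systems.add_system<libMesh::NonlinearImplicitSystem> ("nonlinear");

      // This is the call that dies when libMesh and PETSc disagree about MPI.
      equation_systems.init ();

      return 0;
    }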
On Thu, 6 Jun 2013, Dmitry Karpeyev wrote:
> It seems to me that --disable-mpi in libMesh is a bad option when
> using PETSc :-)
Actually, our continuous integration tests --disable-mpi using PETSc.
*But*, you have to compile and link against a PETSc build which was
configured to be "uniprocess
It appears that --disable-mpi prevents MPI_Init() from being called by
libMesh, which causes libMesh::COMM_WORLD to be left uninitialized, as
far as I can determine. Since PETSc is being used, PETSC_COMM_WORLD is
set to libMesh::COMM_WORLD and PetscInitialize() is called.
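Put differently, an MPI-enabled PETSc expects roughly the ordering below,
with the communicator coming from a real MPI_Init(). This is just an
illustration of the point, not libMesh's actual code.

    #include <mpi.h>
    #include <petscsys.h>

    int main(int argc, char **argv)
    {
      MPI_Init(&argc, &argv);              // never happens under --disable-mpi

      // Whatever PETSc is told to use must be a valid communicator by now.
      PETSC_COMM_WORLD = MPI_COMM_WORLD;
      PetscInitialize(&argc, &argv, NULL, NULL);

      /* ... set up and solve ... */

      PetscFinalize();
      MPI_Finalize();                      // ours to call, since we called MPI_Init
      return 0;
    }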
Since MPI_Init() has not b
Hi,
I'm using cmake to generate the Makefile so it is difficult to interpret,
but you are quite right about the PETSc DM issue; I completely forgot to
tell cmake where the PETSc libraries live. I added this to CMakeLists.txt
and re-compiled my code and now I get this error (at runtime) when I
ini
On Thu, Jun 6, 2013 at 8:44 AM, Andrew Davis wrote:
> Hi,
>
> I am trying to solve a nonlinear PDE with libMesh. I have installed
> libmesh with petsc enabled and I have written a simple "hello world"
> program that links with the library mesh_dbg, which compiles fine.
> However, at runtime I get the message:
> dyld: Symbol not found: _DM_CLASSID
Hi,
I am trying to solve a nonlinear PDE with libMesh. I have installed
libmesh with petsc enabled and I have written a simple "hello world"
program that links with the library mesh_dbg, which compiles fine.
However, at runtime I get the message:
dyld: Symbol not found: _DM_CLASSID
Referenced
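The program itself is trivial, roughly of this shape (illustrative only; the
exact code shouldn't matter, it is the linking that seems to be the problem):

    #include "libmesh/libmesh.h"
    #include <iostream>

    int main (int argc, char ** argv)
    {
      // Initializes libMesh (and PETSc/MPI when they are enabled).
      libMesh::LibMeshInit init (argc, argv);

      libMesh::out << "Hello from libMesh" << std::endl;
      return 0;
    }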