It must be making contact, or ORTE wouldn't be attempting to launch your
application's procs. It looks more like it never received the launch command.
Looking at the code, I suspect you're getting caught in a race condition that
causes the message to get "stuck".
Just to see if that's the case, you
Hi Ralph
> > some weeks ago (mainly in the beginning of October) I reported
> > several problems and I would be grateful if you can tell me if
> > and probably when somebody will try to solve them.
> >
> > 1) I don't get the expected results, when I try to send or scatter
> > the columns of a m
Note that exporting the LD_LIBRARY_PATH on the mpirun command line does not
necessarily apply to launching the remote orteds (it applies to launching the
remote MPI processes, which are children of the orteds).
Since you're using ssh, you might want to check the shell startup scripts on
the tar
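A sketch of the distinction, assuming an install prefix of /opt/openmpi; the hostfile and application names are placeholders:

```shell
# "-x" forwards the variable to the remote MPI processes (children of
# the orteds) -- it does NOT affect how the orteds themselves are launched:
mpirun -x LD_LIBRARY_PATH -np 4 --hostfile hosts ./my_app

# For the orteds started over ssh, set the path in a non-interactive
# shell startup file on the remote nodes instead, e.g. in ~/.bashrc:
export LD_LIBRARY_PATH=/opt/openmpi/lib:$LD_LIBRARY_PATH
```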
On Dec 14, 2012, at 4:31 PM, Handerson, Steven wrote:
> I’m trying to track down an instance of Open MPI writing to a freed block of
> memory.
> This occurs with the most recent release (1.6.3) as well as 1.6, on a 64 bit
> intel architecture, fedora 14.
> It occurs with a very simple reduction (
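One way to track down a write to freed memory is to run each rank under valgrind; a sketch, where `./reduce_test` is a hypothetical reproducer binary:

```shell
# Invalid-write reports will name the offending stack frame:
mpirun -np 2 valgrind --track-origins=yes ./reduce_test

# Send each rank's valgrind output to its own file (%p = the rank's PID):
mpirun -np 2 valgrind --log-file=vg.%p.log ./reduce_test
```

Open MPI also ships a suppression file (openmpi-valgrind.supp under the install's share/openmpi directory) that can be passed via `--suppressions=...` to quiet known-benign reports.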
Greetings Siegmar; sorry for the horrid delay in replying. :-(
Ralph opened a ticket about this a while ago
(https://svn.open-mpi.org/trac/ompi/ticket/3351). I answered it this morning
-- see the ticket for the details.
Short version: I don't think that your program is correct.
On Oct 11, 2
On Dec 15, 2012, at 4:41 AM, Siegmar Gross wrote:
>>> 2) I don't get the expected result, when I try to scatter an object
>>> in Java.
>>> https://svn.open-mpi.org/trac/ompi/ticket/3351
>
> Do you have an idea when somebody will have time to fix these problems?
Sorry for the horrid delay. :-(
Hello,
> #3351: JAVA scatter error
> ----------------------+-------------------------------
>  Reporter: rhc        | Owner:      jsquyres
>  Type:     defect     | Status:     closed
>  Priority: critical   | Milestone:  Open MPI 1.7.1
>  Version:  trunk      | Resolution: invalid
>  Keywords:
Hmmm...you shouldn't need to specify a hostfile in addition to the rankfile, so
something has gotten messed up in the allocator. I'll take a look at it.
As for cpus-per-proc, I'm hoping to tackle it over the holiday while I take a
break from my regular job. Will let you know when fixed.
Thanks
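For reference, a rankfile names its hosts itself, so a separate hostfile should not be required; a sketch with placeholder hostnames (nodeA/nodeB) and a hypothetical application name:

```shell
# Hypothetical rankfile pinning three ranks to specific slots:
cat > myrankfile <<'EOF'
rank 0=nodeA slot=0
rank 1=nodeA slot=1
rank 2=nodeB slot=0-1
EOF

# Launch using only the rankfile; the hosts come from the file itself:
mpirun -np 3 -rf myrankfile ./my_app
```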
Dear community
I get a segfault in the small Fortran program that is attached. I use
one-sided communication and derived datatypes.
I tried it with different versions of Open MPI: versions 1.4.2 and 1.4.5
work, but with 1.6.1 and 1.6.3 it crashes.
Can anybody confirm this?
Many thanks
Steph