If you can, moving to 1.3 would certainly be a good step to take. I'm not sure
why 1.2.5 would be behaving this way, though, so the root cause may indeed be
something in the application (perhaps in the info key being passed to
us?).
Still, if it isn't too much trouble, moving to 1.3 would be worth trying.
Dear Ralph,
Thanks for your reply.
I encountered this problem using openmpi-1.2.5
on an Opteron cluster with Myrinet MX. I tried
compiling Global Arrays with different compilers
(gfortran, Intel, PathScale); the result is the same.
As I mentioned in the previous message, GA itself works
Not that I've seen. What version of OMPI are you using, and on what
type of machine/environment?
On Jan 21, 2009, at 11:02 AM, Evgeniy Gromov wrote:
Dear OpenMPI users,
I have the following problem related to OpenMPI:
I have recently compiled with OpenMPI the new (4-1)
Global Arrays package using ARMCI_NETWORK=MPI-SPAWN,
which implies the use of the dynamic process management
introduced in MPI-2. It compiled and tested successfully.
However