Re: [OMPI devel] Error using MPI_Pack_external / MPI_Unpack_external

2016-02-10 Thread Ralph Castain
Out of curiosity: if both systems are Intel, then why are you enabling hetero? You don’t need it in that scenario. Admittedly, we do need to fix the bug - just trying to understand why you are configuring that way. > On Feb 10, 2016, at 8:46 PM, Michael Rezny wrote: > > Hi Gilles, > I can co…

Re: [OMPI devel] Error using MPI_Pack_external / MPI_Unpack_external

2016-02-10 Thread Michael Rezny
Hi Gilles, I can confirm that with a fresh download and build from source of OpenMPI 1.10.2 with --enable-heterogeneous, the unpacked ints have the wrong endianness. However, without --enable-heterogeneous, the unpacked ints are correct. So, this problem still exists in heterogeneous builds with OpenMPI 1.10.2.
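
For readers skimming the thread: "wrong endianness" here means each unpacked int comes back byte-swapped. A hypothetical C illustration (not code from the thread; __builtin_bswap32 is a gcc/clang builtin):

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    int32_t v = 4;  /* 0x00000004 */
    int32_t swapped = (int32_t)__builtin_bswap32((uint32_t)v);
    printf("%d -> %d\n", v, swapped);  /* prints: 4 -> 67108864 (0x04000000) */
    return 0;
}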

Re: [OMPI devel] Error using MPI_Pack_external / MPI_Unpack_external

2016-02-10 Thread Michael Rezny
Hi Gilles, thanks for the prompt response and assistance. Both systems use Intel CPUs. The problem originally comes from a coupler, yac, used in climate science. There are several reported instances where the coupling tests fail. The problem occurs often enough to incorporate a workaround which…

Re: [OMPI devel] Error using MPI_Pack_external / MPI_Unpack_external

2016-02-10 Thread Gilles Gouaillardet
Michael, do your two systems have the same endianness? Do you know how OpenMPI was configured on both systems (is --enable-heterogeneous enabled or disabled on both)? FWIW, OpenMPI 1.6.5 is old now and no longer maintained. I strongly encourage you to use OpenMPI 1.10.2. Cheers, Gilles
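
A self-contained way to answer the endianness question on each machine (an illustrative sketch, not code from the thread):

#include <stdio.h>

int main(void)
{
    unsigned int one = 1;
    unsigned char *p = (unsigned char *)&one;
    /* On a little-endian host the least significant byte is stored first. */
    printf("%s-endian\n", *p == 1 ? "little" : "big");
    return 0;
}

Run on both machines; Intel CPUs should both print "little-endian".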

[OMPI devel] Error using MPI_Pack_external / MPI_Unpack_external

2016-02-10 Thread Michael Rezny
Hi, I am running Ubuntu 14.04 LTS with OpenMPI 1.6.5 and gcc 4.8.4. In a single-rank program which just packs and unpacks two ints using MPI_Pack_external and MPI_Unpack_external, the unpacked ints are in the wrong endian order. However, on an HPC system (not Ubuntu), using OpenMPI 1.6.5 and gcc 4.8.4, the…
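
A minimal sketch of the kind of reproducer described above (hypothetical, since the original program was not included in this digest): pack two ints into the canonical big-endian "external32" representation and unpack them in the same process. A correct implementation returns the original values; the reported bug is that a heterogeneous build returns them byte-swapped.

#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    int in[2]  = { 4, 5 };
    int out[2] = { 0, 0 };
    char buf[64];             /* plenty for two external32 ints (8 bytes) */
    MPI_Aint position = 0;

    MPI_Init(&argc, &argv);

    /* Pack into the canonical (big-endian) external32 representation. */
    MPI_Pack_external("external32", in, 2, MPI_INT,
                      buf, (MPI_Aint)sizeof(buf), &position);

    /* Unpack from the same buffer in the same process. */
    MPI_Aint insize = position;
    position = 0;
    MPI_Unpack_external("external32", buf, insize, &position,
                        out, 2, MPI_INT);

    printf("in:  %d %d\n", in[0], in[1]);
    printf("out: %d %d\n", out[0], out[1]); /* byte-swapped when the bug hits */

    MPI_Finalize();
    return 0;
}

Compile with mpicc and run on a single rank (e.g. mpirun -n 1 ./a.out).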