Note that if I do the same build with OpenMPI 1.6.5, it works flawlessly.

Maxime


On 2014-08-14 08:39, Maxime Boissonneault wrote:
Hi,
I compiled Charm++ 6.6.0rc3 using
./build charm++ mpi-linux-x86_64 smp --with-production

When I compile and run the simple example
mpi-linux-x86_64-smp/tests/charm++/simplearrayhello/

I get a segmentation fault that traces back to OpenMPI:
[mboisson@helios-login1 simplearrayhello]$ ./hello
[helios-login1:01813] *** Process received signal ***
[helios-login1:01813] Signal: Segmentation fault (11)
[helios-login1:01813] Signal code: Address not mapped (1)
[helios-login1:01813] Failing at address: 0x30
[helios-login1:01813] [ 0] /lib64/libpthread.so.0[0x381c00f710]
[helios-login1:01813] [ 1] /software-gpu/mpi/openmpi/1.8.1_gcc4.8_cuda6.0.37/lib/libmpi.so.1(+0xf78f8)[0x7f2cd1f6b8f8]
[helios-login1:01813] [ 2] /software-gpu/mpi/openmpi/1.8.1_gcc4.8_cuda6.0.37/lib/libmpi.so.1(+0xf8f64)[0x7f2cd1f6cf64]
[helios-login1:01813] [ 3] /software-gpu/mpi/openmpi/1.8.1_gcc4.8_cuda6.0.37/lib/libmpi.so.1(ompi_btl_openib_connect_base_select_for_local_port+0xcf)[0x7f2cd1f672af]
[helios-login1:01813] [ 4] /software-gpu/mpi/openmpi/1.8.1_gcc4.8_cuda6.0.37/lib/libmpi.so.1(+0xe1ad7)[0x7f2cd1f55ad7]
[helios-login1:01813] [ 5] /software-gpu/mpi/openmpi/1.8.1_gcc4.8_cuda6.0.37/lib/libmpi.so.1(mca_btl_base_select+0x168)[0x7f2cd1f4bf28]
[helios-login1:01813] [ 6] /software-gpu/mpi/openmpi/1.8.1_gcc4.8_cuda6.0.37/lib/libmpi.so.1(mca_bml_r2_component_init+0x11)[0x7f2cd1f4b851]
[helios-login1:01813] [ 7] /software-gpu/mpi/openmpi/1.8.1_gcc4.8_cuda6.0.37/lib/libmpi.so.1(mca_bml_base_init+0x7f)[0x7f2cd1f4a03f]
[helios-login1:01813] [ 8] /software-gpu/mpi/openmpi/1.8.1_gcc4.8_cuda6.0.37/lib/libmpi.so.1(+0x1e0d17)[0x7f2cd2054d17]
[helios-login1:01813] [ 9] /software-gpu/mpi/openmpi/1.8.1_gcc4.8_cuda6.0.37/lib/libmpi.so.1(mca_pml_base_select+0x3b6)[0x7f2cd20529d6]
[helios-login1:01813] [10] /software-gpu/mpi/openmpi/1.8.1_gcc4.8_cuda6.0.37/lib/libmpi.so.1(ompi_mpi_init+0x4e4)[0x7f2cd1ef0c14]
[helios-login1:01813] [11] /software-gpu/mpi/openmpi/1.8.1_gcc4.8_cuda6.0.37/lib/libmpi.so.1(MPI_Init_thread+0x15d)[0x7f2cd1f1065d]
[helios-login1:01813] [12] ./hello(LrtsInit+0x72)[0x4fcf02]
[helios-login1:01813] [13] ./hello(ConverseInit+0x70)[0x4ff680]
[helios-login1:01813] [14] ./hello(main+0x27)[0x470767]
[helios-login1:01813] [15] /lib64/libc.so.6(__libc_start_main+0xfd)[0x381bc1ed1d]
[helios-login1:01813] [16] ./hello[0x470b71]
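
Since the backtrace ends in MPI_Init_thread inside the openib BTL selection (called from Charm++'s LrtsInit), a minimal MPI-only test could show whether the crash reproduces without Charm++ at all, using the same 1.8.1 mpicc/mpirun wrappers. This is just a sketch, not part of the Charm++ tree; the file name init_test.c and the MPI_THREAD_FUNNELED level are my own choices, not necessarily what the mpi-smp layer requests:

/* init_test.c -- minimal sketch: call MPI_Init_thread, as Charm++'s
 * LrtsInit does, to see if the openib BTL segfault happens without Charm++. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided = 0;
    /* MPI_THREAD_FUNNELED is an assumed thread level for this test only. */
    MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
    printf("MPI_Init_thread OK, provided thread level = %d\n", provided);
    MPI_Finalize();
    return 0;
}

Built and run with the same install:
mpicc init_test.c -o init_test
mpirun -np 1 ./init_test

If this also segfaults in ompi_btl_openib_connect_base_select_for_local_port, the problem would seem to be in the OpenMPI 1.8.1 build itself rather than in Charm++.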


Does anyone have a clue how to fix this?

Thanks,



--
---------------------------------
Maxime Boissonneault
Computing analyst - Calcul Québec, Université Laval
Ph.D. in physics
