There is a slightly newer version available, 8.2.1c, at http://www.oracle.com/goto/ompt

You should be able to install it side by side with a previously installed version without interference.

If that does not alleviate the issue, the additional information Scott asked for would be useful: the full mpirun line or the list of MCA parameters that were set, the number of processes, the number of nodes, the version of Solaris,
the version of the compiler, and which interconnect.
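
Just as an illustration of the level of detail that helps (the host file name, process count, and btl selection below are placeholders, not anything taken from your run):

    mpirun -np 16 --hostfile hosts --mca btl self,sm,tcp ./comp_disp.x
    ompi_info --param all all

The second command dumps every MCA parameter and its current value, which is often the quickest way to show what was actually set.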

If that does not shed some light, then maybe a small test case would be the next step; a rough sketch of what such a reproducer might look like is below.
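
Purely as an illustration, and assuming the backtrace is accurate (MPI_Alloc_mem called from inside MPI_Sendrecv_replace), something like the C sketch below might already reproduce it. The message size and the ring-exchange pattern are made up for the example; they are not taken from the user's code.

    /* Minimal sketch of the call path in the backtrace: MPI_Alloc_mem plus
     * an in-place ring exchange via MPI_Sendrecv_replace.  Illustrative only. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, i;
        const int n = 1 << 20;            /* 1M doubles per rank, arbitrary */
        double *buf;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Exercise MPI_Alloc_mem directly, since that is where the
         * segfault shows up in the backtrace. */
        MPI_Alloc_mem((MPI_Aint)(n * sizeof(double)), MPI_INFO_NULL, &buf);
        for (i = 0; i < n; i++)
            buf[i] = (double)rank;

        /* Cyclic transfer, similar in spirit to the mpi_cyclic_transfer_
         * frame in the backtrace: pass the buffer to the next rank. */
        MPI_Sendrecv_replace(buf, n, MPI_DOUBLE,
                             (rank + 1) % size, 0,
                             (rank - 1 + size) % size, 0,
                             MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        printf("rank %d: buf[0] = %g after exchange\n", rank, buf[0]);

        MPI_Free_mem(buf);
        MPI_Finalize();
        return 0;
    }

Building that with the HPC8.2 mpicc and running it on the same nodes would at least tell us whether this minimal path is enough to trigger the crash, or whether the user's code does something more specific.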

-DON

On 07/15/10 09:56, Scott Atchley wrote:
Lydia,

Which interconnect is this running over?

Scott

On Jul 15, 2010, at 5:19 AM, Lydia Heck wrote:

We are running Sun's build of Open MPI 1.3.3r21324-ct8.2-b09b-r31
(HPC8.2), and one code that runs perfectly fine under
HPC8.1 (Open MPI 1.3r19845-ct8.1-b06b-r21) and earlier fails with



[oberon:08454] *** Process received signal ***
[oberon:08454] Signal: Segmentation Fault (11)
[oberon:08454] Signal code: Address not mapped (1)
[oberon:08454] Failing at address: 0
/opt/SUNWhpc/HPC8.2/sun/lib/amd64/libopen-pal.so.0.0.0:0x4b89e
/lib/amd64/libc.so.1:0xd0f36
/lib/amd64/libc.so.1:0xc5a72
0x0 [ Signal 11 (SEGV)]
/opt/SUNWhpc/HPC8.2/sun/lib/amd64/libmpi.so.0.0.0:MPI_Alloc_mem+0x7f
/opt/SUNWhpc/HPC8.2/sun/lib/amd64/libmpi.so.0.0.0:MPI_Sendrecv_replace+0x31e
/opt/SUNWhpc/HPC8.2/sun/lib/amd64/libmpi_f77.so.0.0.0:PMPI_SENDRECV_REPLACE+0x94
/home/arj/code_devel/ic_gen_2lpt_v3.5/comp_disp.x:mpi_cyclic_transfer_+0xd9
/home/arj/code_devel/ic_gen_2lpt_v3.5/comp_disp.x:cycle_particles_and_interpolate_+0x94b
/home/arj/code_devel/ic_gen_2lpt_v3.5/comp_disp.x:interpolate_field_+0xc30
/home/arj/code_devel/ic_gen_2lpt_v3.5/comp_disp.x:MAIN_+0xe68
/home/arj/code_devel/ic_gen_2lpt_v3.5/comp_disp.x:main+0x3d
/home/arj/code_devel/ic_gen_2lpt_v3.5/comp_disp.x:0x62ac
[oberon:08454] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 8454 on node oberon exited on 
signal 11 (Segmentation Fault).



I have not tried to get and build a newer Open MPI, so I do not know whether the
problem persists in more recent versions.


If the developers are interested, I could ask the user to prepare the code for
you to have a look at the problem, which appears to be in MPI_Alloc_mem.

Best wishes,
Lydia Heck

------------------------------------------
Dr E L  Heck

University of Durham Institute for Computational Cosmology
Ogden Centre
Department of Physics, South Road

DURHAM, DH1 3LE, United Kingdom

e-mail: lydia.h...@durham.ac.uk

Tel.: + 44 191 - 334 3628
Fax.: + 44 191 - 334 3645
___________________________________________

_______________________________________________
users mailing list
us...@open-mpi.org
http://www.open-mpi.org/mailman/listinfo.cgi/users

