Re: [OMPI users] MPI-I/O issues

2014-08-06 Thread Rob Latham
On 08/06/2014 11:50 AM, Mohamad Chaarawi wrote: To replicate, run the program with 2 or more procs: mpirun -np 2 ./hindexed_io mpi_test_file [jam:15566] *** Process received signal *** [jam:15566] Signal: Segmentation fault (11) [jam:15566] Signal code: Address not mapped (1) [jam:15566] Fai

Re: [OMPI users] MPI-I/O issues

2014-08-06 Thread Mohamad Chaarawi
Hi Rob, On 8/6/2014 1:32 PM, Rob Latham wrote: On 08/06/2014 11:50 AM, Mohamad Chaarawi wrote: If I use mpich 3.1.2, I don't see those issues. What Mohamad forgot to tell you is that he doesn't see those issues because I patched them on Monday. If I understood right, your patch was to

Re: [OMPI users] MPI-I/O issues

2014-08-06 Thread Rob Latham
On 08/06/2014 11:50 AM, Mohamad Chaarawi wrote: If I use mpich 3.1.2, I don't see those issues. What Mohamad forgot to tell you is that he doesn't see those issues because I patched them on Monday. ROMIO has some HINDEXED_BLOCK fixes that OMPI should pick up on the next romio resync. Y

[OMPI users] MPI-I/O issues

2014-08-06 Thread Mohamad Chaarawi
Hi all, I'm seeing some problems with derived datatype construction and I/O with OpenMPI 1.8.1. I have replicated them in the attached program. The first issue is that MPI_Type_create_hindexed_block() always segfaults. Usage of this routine is commented out in the program. (I have a separat
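
[Editor's note: the attached program is not reproduced in this archive. Below is a minimal sketch of the call path being discussed: building a hindexed-block filetype with MPI_Type_create_hindexed_block() and using it to set a file view, matching the reported run "mpirun -np 2 ./hindexed_io mpi_test_file". Block counts and displacements are illustrative assumptions, not the poster's code.]

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);

        int rank;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* File name from argv, as in the reported run; fallback is
         * illustrative. */
        const char *fname = (argc > 1) ? argv[1] : "mpi_test_file";

        /* Two blocks of 4 ints each per rank; byte displacements are
         * illustrative and chosen so ranks do not overlap. */
        MPI_Aint displs[2] = { (MPI_Aint)(rank * 64),
                               (MPI_Aint)(rank * 64 + 32) };
        MPI_Datatype ftype;
        MPI_Type_create_hindexed_block(2, 4, displs, MPI_INT, &ftype);
        MPI_Type_commit(&ftype);

        MPI_File fh;
        MPI_File_open(MPI_COMM_WORLD, fname,
                      MPI_MODE_CREATE | MPI_MODE_RDWR, MPI_INFO_NULL, &fh);
        MPI_File_set_view(fh, 0, MPI_INT, ftype, "native", MPI_INFO_NULL);

        /* Each rank writes 8 ints through the view (2 blocks x 4 ints). */
        int buf[8] = { 0 };
        MPI_File_write_all(fh, buf, 8, MPI_INT, MPI_STATUS_IGNORE);

        MPI_File_close(&fh);
        MPI_Type_free(&ftype);
        MPI_Finalize();
        return 0;
    }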

Re: [OMPI users] openmpi 1.8.1 error with gfortran

2014-08-06 Thread Syed Ahsan Ali
Issue resolved. On Wed, Aug 6, 2014 at 2:48 PM, Syed Ahsan Ali wrote: > I have the following error while compiling > > > *** Fortran compiler > checking whether we are using the GNU Fortran compiler... yes > checking whether /opt/gcc-4.9.1/bin/gfortran accepts -g... yes > configure: WARNING: Open M