Hi, Gus,

Automatic allocation (and reallocation) on assignment is among the nifty
features of Fortran 2003. In this case "conc" is automatically allocated to
match the shape of its initialiser array "[ xx, yy ]". Note that "xx" and
"yy" are not allocatable, though their derived type has an allocatable
component.
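
Here is a minimal standalone sketch of the feature (no MPI involved; the
program name and values are mine, just for illustration):

PROGRAM auto_alloc
    IMPLICIT NONE
    REAL, ALLOCATABLE :: conc(:)

    ! No explicit ALLOCATE is needed: the assignment allocates
    ! "conc" to match the shape of the right-hand side.
    conc = [ 1.0, 2.0, 3.0 ]
    WRITE(*,*) SIZE(conc)   ! prints 3

    ! Assigning an array of a different shape reallocates it.
    conc = [ 4.0, 5.0 ]
    WRITE(*,*) SIZE(conc)   ! prints 2
END PROGRAM auto_alloc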

Kind regards,
Hristo Iliev

> -----Original Message-----
> From: users-boun...@open-mpi.org [mailto:users-boun...@open-mpi.org]
> On Behalf Of Gus Correa
> Sent: Friday, January 11, 2013 7:19 PM
> To: Open MPI Users
> Subject: Re: [OMPI users] Initializing OMPI with invoking the array
> constructor on Fortran derived types causes the executable to crash
> 
> Hi Stefan
> 
> Don't you need to allocate xx, yy, and conc before you use them?
> In the short program below, they are declared as allocatable, but not
> actually allocated.
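> For instance (a sketch only; the extents are arbitrary), one could insert
> before the assignment in the program below:
> 
>     ALLOCATE( xx%a(10) )
>     ALLOCATE( yy%a(10) )
>     ALLOCATE( conc(2) )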
> 
> I hope this helps,
> Gus Correa
> 
> On 01/11/2013 09:58 AM, Stefan Mauerberger wrote:
> > Dear Paul!
> >
> > Thanks for your reply. This problem seems to get complicated.
> >
> > Unfortunately, I cannot reproduce what you are describing. I tried
> > several GCC versions: 4.7.1, 4.7.2 and 4.8.0 (20121008). As you
> > suggested, I replaced the MPI_Init and MPI_Finalize calls with
> > WRITE(*,*) "foooo" and commented out "use mpi"; everything is just
> > fine: no segfault, no core dump, just the result I expect (I put a
> > write(*,*) size(conc) in, which must print 2). I simply compiled with
> > a bare mpif90 ... and executed by typing mpirun -np 1 ./a.out .
> > I also tried on three different architectures - all 64-bit - and, as
> > soon as MPI_Init is invoked, the program dumps core.
> >
> > I also tried with IBM's MPI implementation, the only difference being
> > that I used include 'mpif.h' instead of use mpi. Everything is fine
> > and the result matches the serial runs.
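> > For completeness, that one change was (a sketch; note the ordering
> > Fortran requires, with INCLUDE after IMPLICIT NONE):
> >
> >     ! USE mpi           ! module interface, removed
> >     IMPLICIT NONE
> >     INCLUDE 'mpif.h'    ! legacy include file used instead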
> >
> > Well, it's not surprising that 4.4.x has its problems. For modern
> > Fortran such as F2003, GCC 4.7.x or newer is simply mandatory.
> >
> > Cheers,
> > Stefan
> >
> >
> >
> > On Fri, 2013-01-11 at 14:26 +0100, Paul Kapinos wrote:
> >> This is hardly an Open MPI issue:
> >>
> >> replace the calls to MPI_Init and MPI_Finalize with
> >> WRITE(*,*) "foooo"
> >> comment out 'USE mpi' ... and see your error (SIGSEGV) again, now
> >> without any MPI part in the program.
> >> So my suspicion is that this is a bug in your GCC version, especially
> >> because there is no SIGSEGV using GCC 4.7.2 (whereas it crashes using
> >> 4.4.6).
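> >>
> >> A minimal sketch of that MPI-free variant (your derived type kept,
> >> MPI calls swapped for WRITE; this is just the test I mean):
> >>
> >> PROGRAM main
> >>     IMPLICIT NONE
> >>
> >>     TYPE :: test_typ
> >>         REAL, ALLOCATABLE :: a(:)
> >>     END TYPE
> >>
> >>     TYPE(test_typ) :: xx, yy
> >>     TYPE(test_typ), ALLOCATABLE :: conc(:)
> >>
> >>     WRITE(*,*) "foooo"
> >>     ! With GCC 4.4.6 this assignment SIGSEGVs; with 4.7.2 it is fine.
> >>     conc = [ xx, yy ]
> >>     WRITE(*,*) "foooo"
> >> END PROGRAM main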
> >>
> >> ==>  Update your compilers!
> >>
> >>
> >> On 01/11/13 14:01, Stefan Mauerberger wrote:
> >>> Hi There!
> >>>
> >>> First of all, this is my first post here. In case I am doing
> >>> something inappropriate, please be gentle with me. On top of that, I
> >>> am not quite sure whether this issue is related to Open MPI or GCC.
> >>>
> >>> Regarding my problem: well, it is a little bulky, see below. I could
> >>> figure out that the actual crash is caused by invoking Fortran's
> >>> array constructor [ xx, yy ] on the derived-type variables xx and
> >>> yy. The one key factor is that their type has an allocatable member
> >>> variable, which points toward blaming gfortran. However, the crash
> >>> does not occur if MPI_Init is not called beforehand. Compiled as a
> >>> serial program, everything works perfectly fine. I am pretty sure
> >>> the lines I wrote are valid F2003 code.
> >>>
> >>> Here is a minimal working example:
> >>> PROGRAM main
> >>>       USE mpi
> >>>
> >>>       IMPLICIT NONE
> >>>
> >>>       INTEGER :: ierr
> >>>
> >>>       TYPE :: test_typ
> >>>           REAL, ALLOCATABLE :: a(:)
> >>>       END TYPE
> >>>
> >>>       TYPE(test_typ) :: xx, yy
> >>>       TYPE(test_typ), ALLOCATABLE :: conc(:)
> >>>
> >>>       CALL mpi_init( ierr )
> >>>
> >>>       conc = [ xx, yy ]   ! crashes here when MPI_Init was called
> >>>
> >>>       CALL mpi_finalize( ierr )
> >>>
> >>> END PROGRAM main
> >>> Compiling with a bare mpif90 ... and executing leads to:
> >>>> *** glibc detected *** ./a.out: free(): invalid pointer:
> >>>> 0x00007fefd2a147f8 *** ======= Backtrace: =========
> >>>> /lib/x86_64-linux-gnu/libc.so.6(+0x7eb96)[0x7fefd26dab96]
> >>>> ./a.out[0x400fdb]
> >>>> ./a.out(main+0x34)[0x401132]
> >>>> /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xed)[0x7fefd267d
> >>>> 76d]
> >>>> ./a.out[0x400ad9]
> >>> With 'CALL MPI_Init' and 'MPI_Finalize' commented out, everything
> >>> seems to be fine.
> >>>
> >>> What do you think: is this an OMPI-related or a GCC-related bug?
> >>>
> >>> Cheers,
> >>> Stefan
> >>>
> >>>
> >>
> >>
> >
> >
> 
> _______________________________________________
> users mailing list
> us...@open-mpi.org
> http://www.open-mpi.org/mailman/listinfo.cgi/users
--
Hristo Iliev, Ph.D. -- High Performance Computing
RWTH Aachen University, Center for Computing and Communication
Rechen- und Kommunikationszentrum der RWTH Aachen
Seffenter Weg 23, D-52074 Aachen (Germany)
