Just to throw some $0.002 into this overall discussion...
Not knowing this was going to be happening, I was actually about to
propose moving the opal/util/arch.c code back to the ompi layer. The
original move had caused quite a bit of angst due to the fortran
stuff. Originally, I had needed…
Thanks, Jeff!
On Monday 01 June 2009 04:53:19 pm Jeff Squyres wrote:
> Per the MPI_Flogical issue -- I think Rainer just exposed some old
> ugliness. We've apparently had MPI_Flogical defined in
> ompi_config.h.in for a long, long time -- we used it in some places
> and used ompi_fortran_logical…
Per the MPI_Flogical issue -- I think Rainer just exposed some old
ugliness. We've apparently had MPI_Flogical defined in
ompi_config.h.in for a long, long time -- we used it in some places
and used ompi_fortran_logical_t in other places.
Even though I *may* be responsible for this particular…
Turned out to be a faulty svn update. Getting a clean svn checkout
fixed the problem.
On Jun 1, 2009, at 10:04 AM, Rainer Keller wrote:
Hi Ralph,
of course, at first I was afraid that these were related to
pulling the OMPI_ALIGNMENT (and friends) configure to the OPAL layer…
Sounds like a real simple s/OPAL/OMPI/ fix, so I'll give it a go tonight.
On Mon, Jun 1, 2009 at 2:17 PM, Jeff Squyres wrote:
> I think a patch was put back to v1.3 that wasn't quite right -- I see
> pml_ob1_recvreq.h:183 and 223 have OPAL_HAVE_THREAD_SUPPORT. But
> OPAL_HAVE_THREAD_SUPPORT is…
I think a patch was put back to v1.3 that wasn't quite right -- I see
pml_ob1_recvreq.h:183 and 223 have OPAL_HAVE_THREAD_SUPPORT. But
OPAL_HAVE_THREAD_SUPPORT isn't defined on the trunk -- only
OMPI_HAVE_THREAD_SUPPORT is defined.
Can someone fix?
Thanks...
--
Jeff Squyres
Cisco Systems
Well, this may just be another sign that the push of the DDT to OPAL is a
bad idea. That's been my opinion from the start, so I'm biased. But OPAL
was intended to be single-process system portability, not MPI crud.
Brian
On Mon, 1 Jun 2009, Rainer Keller wrote:
Hmm, OK, I see.
However, I do see a potential problem with getting the DDT work onto the OPAL
layer when we have a fortran compiler with different alignment requirements
for the same-sized basic types...
As far as I understand, the OPAL layer is meant to abstract away the underlying
system portability…
The opal/util/arch.c stuff also had this concern - but we couldn't figure
out a solution. One thing we talked about was separating the fortran arch
stuff away from the rest, as the only things I needed in opal were the
non-fortran ones, but we deferred that to later.
Adding all the rest of the fortran…
I have to agree with Jeff's concerns.
Brian
On Mon, 1 Jun 2009, Jeff Squyres wrote:
Hmm. I'm not sure that I like this commit.
George, Brian, and I specifically kept Fortran out of (the non-
generated code in) opal because the MPI layer is the *only* layer that
uses Fortran. There was one or two minor abstraction breaks (you
cited opal/util/arch.c), but now we have Fortran…
My fault - I copied the Makefile.am over from another place and didn't
notice that line. Sorry for the problem...
On Mon, Jun 1, 2009 at 8:07 AM, wrote:
> Author: jsquyres
> Date: 2009-06-01 10:07:08 EDT (Mon, 01 Jun 2009)
> New Revision: 21340
> URL: https://svn.open-mpi.org/trac/ompi/changes
Hi Ralph,
of course, at first I was afraid that these were related to
pulling the OMPI_ALIGNMENT (and friends) configure to the OPAL layer (r21330),
but the failures I've seen in MTT are related to windows (r21334).
Well, AM_CONDITIONAL(WANT_PERUSE...) is in ./config/ompi_configure_optio…
Hi folks
I'm getting the following build failures this morning - looks like something
crept in over the weekend?
ompi/peruse/Makefile.am:19: WANT_PERUSE does not appear in AM_CONDITIONAL
ompi/Makefile.am:155: `ompi/peruse/Makefile.am' included from here
ompi/Makefile.am: installing `config/depc…
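For context, automake requires every conditional used in a Makefile.am to be declared via AM_CONDITIONAL in the configure sources it scans, and the declaration must run unconditionally. A sketch of the shape involved (the shell test expression is an assumption; the real one lives in the Open MPI configury):

```m4
dnl In configure.ac (or an m4 fragment it pulls in):
dnl AM_CONDITIONAL must execute on every configure run -- if it sits
dnl inside a shell conditional, automake reports exactly the error above:
dnl "WANT_PERUSE does not appear in AM_CONDITIONAL".
AM_CONDITIONAL([WANT_PERUSE], [test "$ompi_want_peruse" = "1"])
```

The matching Makefile.am then uses `if WANT_PERUSE` / `endif` around the peruse sources; moving options between the OMPI and OPAL configure layers can break this linkage if a fragment stops being included.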