Yeah, I recall it was quite clean when I did the upgrade on the trunk. I
may take a pass at it and see if anything breaks, since it is now so easy
to do. :-)



On Mon, Dec 15, 2014 at 8:17 AM, Brice Goglin <brice.gog...@inria.fr> wrote:
>
> > On 15/12/2014 16:39, Jeff Squyres (jsquyres) wrote:
> > The only real question is: will upgrading hwloc break anything else
> inside the v1.8 tree?  E.g., did new hwloc abstractions/APIs come in after
> v1.7 that we've adapted to on the trunk, but didn't adapt to on the v1.8
> branch?
>
> I wouldn't expect any such problem when upgrading from hwloc 1.7 to 1.9.
>
> Brice
>
>
> >
> >
> >
> > On Dec 15, 2014, at 10:35 AM, Ralph Castain <r...@open-mpi.org> wrote:
> >
> >> Sorry, I should have been clearer - that was indeed what I was
> expecting to see. I guess that raises the question - should we just update
> to something like 1.9 so Brice doesn't have to worry about backporting
> future fixes this far back?
> >>
> >>
> >>
> >> On Mon, Dec 15, 2014 at 7:22 AM, Jeff Squyres (jsquyres) <
> jsquy...@cisco.com> wrote:
> >> FWIW, if it would be easier, we can just pull a new hwloc tarball --
> that's how we've done it in the past (vs. trying to pull individual
> patches).  It's also easier to pull a release tarball, because then we can
> say "hwloc vX.Y.Z is in OMPI vA.B.C", rather than having to
> examine/explain exactly what level of hwloc is in OMPI (based on patches,
> etc.).
> >>
> >>
> >> On Dec 15, 2014, at 4:39 AM, Brice Goglin <brice.gog...@inria.fr>
> wrote:
> >>
> >>> On 15/12/2014 10:35, Jorge D'Elia wrote:
> >>>> Hi Brice,
> >>>>
> >>>> ----- Original Message -----
> >>>>> From: "Brice Goglin" <brice.gog...@inria.fr>
> >>>>> CC: "Open MPI Users" <us...@open-mpi.org>
> >>>>> Sent: Thursday, December 11, 2014, 19:46:44
> >>>>> Subject: Re: [OMPI users] OpenMPI 1.8.4 and hwloc in Fedora 14 using
> a beta gcc 5.0 compiler.
> >>>>>
> >>>>> This problem was fixed in hwloc upstream recently.
> >>>>>
> >>>>>
> https://github.com/open-mpi/hwloc/commit/790aa2e1e62be6b4f37622959de9ce3766ebc57e
> >>>> Great! However, yesterday I downloaded versions 1.8.3 (stable) and
> >>>> 1.8.4rc3 of OpenMPI and tried to use their standard configuration.
> >>>> It was OK on ia64 (as before) but failed again on ia32.  Once again,
> >>>> I had to use an external installation of hwloc in order to fix it.
> >>>>
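
[A minimal sketch of the "external hwloc" workaround mentioned above, assuming
a separately built hwloc is already installed under /usr/local (that prefix is
illustrative, not from the thread). Open MPI can be pointed at an external
hwloc installation at configure time instead of building its bundled copy:

    # Configure Open MPI against an external hwloc install; the prefix is an assumption
    ./configure --with-hwloc=/usr/local
    make all install
]
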
> >>> It's fixed in "upstream hwloc", not in OMPI yet. I have prepared a long
> >>> branch of hwloc fixes that OMPI should pull, but it will take some
> time.
> >>> Thanks
> >>> Brice
> >>>
> >>
> >> --
> >> Jeff Squyres
> >> jsquy...@cisco.com
> >> For corporate legal information go to:
> http://www.cisco.com/web/about/doing_business/legal/cri/
> >>
> >
>
>
