Thanks, Nathan.
There’s no mpi.h available on the PR builder hosts, so somehow it works out.
I haven’t thought through that path, however.
Brian
> On Jun 22, 2017, at 6:04 PM, Nathan Hjelm wrote:
>
> I have a fix I am working on. Will open a PR tomorrow morning.
>
> -Nathan
>
>> On Jun 22, 2017, at 6:11 PM, r...@open-mpi.org wrote:
I have a fix I am working on. Will open a PR tomorrow morning.
-Nathan
> On Jun 22, 2017, at 6:11 PM, r...@open-mpi.org wrote:
>
> Here’s something even weirder. You cannot build that file unless mpi.h
> already exists, which it won’t until you build the MPI layer. So apparently
> what is happening is that we somehow pick up a pre-existing version of
> mpi.h and use that to build the file?
Here’s something even weirder. You cannot build that file unless mpi.h already
exists, which it won’t until you build the MPI layer. So apparently what is
happening is that we somehow pick up a pre-existing version of mpi.h and use
that to build the file?
Checking around, I find that all my avai…
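One way to confirm which mpi.h a compile actually resolves (a diagnostic
sketch, not a command from the thread; the -I path is a placeholder for
whatever the build really passes):

    # -H makes gcc/clang print every header they open, so a stale
    # mpi.h left over from an earlier install shows up immediately.
    cc -H -I/path/used/by/the/build -c opal/util/info.c 2>&1 | grep 'mpi\.h'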
It apparently did come in that way. We just never test -no-ompi and so it
wasn’t discovered until a downstream project tried to update. Then...boom.
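For anyone who wants to reproduce the -no-ompi configuration Ralph mentions,
the build is set up roughly like this (a sketch, assuming the autogen.pl
--no-ompi option of that era; the prefix is illustrative):

    # Regenerate the build system without the OMPI project: only the
    # opal and orte layers get built, so mpi.h is never generated.
    ./autogen.pl --no-ompi
    ./configure --prefix=$HOME/opt/orte
    make all install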
> On Jun 22, 2017, at 4:07 PM, Barrett, Brian via devel
> wrote:
>
> I’m confused; looking at history, there’s never been a time when
> opal/util/info.c hasn’t included mpi.h. That seems odd, but so does info
> being in opal.
I’m confused; looking at history, there’s never been a time when
opal/util/info.c hasn’t included mpi.h. That seems odd, but so does info being
in opal.
Brian
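A quick way to audit that history (a generic git sketch, not necessarily the
command Brian ran):

    # List every commit that added or removed the string "mpi.h"
    # in opal/util/info.c, oldest first.
    git log --reverse --oneline -S'mpi.h' -- opal/util/info.c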
> On Jun 22, 2017, at 3:46 PM, r...@open-mpi.org wrote:
>
> I don’t understand what someone was thinking, but you CANNOT #include “mpi.h”
> in opal/util/info.c. It has broken pretty much every downstream project.
> Please fix this!
I don’t understand what someone was thinking, but you CANNOT #include “mpi.h”
in opal/util/info.c. It has broken pretty much every downstream project.
Please fix this!
Ralph
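For anyone chasing the same layering violation, a simple check for OPAL-layer
sources that reach up into the MPI layer (illustrative only, not the eventual
fix):

    # OPAL sits below the MPI layer, so nothing under opal/ should
    # ever pull in the MPI API header.
    grep -rnE '#include +["<]mpi\.h[">]' opal/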
Hi Folks,
I'm trying to do some experiments with clang/llvm and its OpenMP runtime.
To add to this mix, the application I want to use for testing is written in
F08, so I also have to use flang:
https://github.com/flang-compiler/flang
Now when I try to build Open MPI, as long as I disabl…
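For reference, a typical configure line for this toolchain (hedged: the
message is truncated, so exactly which component the poster had to disable is
unknown and none is shown here):

    # Build Open MPI with clang/clang++ for C/C++ and flang for
    # Fortran, which provides the F08 bindings the application needs.
    ./configure CC=clang CXX=clang++ FC=flang \
        --prefix=$HOME/opt/openmpi-clang
    make -j 8 && make install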
+1 Jenkins really is the best of the worst. Definitely not fun to maintain.
On Thu, Jun 22, 2017 at 10:28 AM, Barrett, Brian via devel
<devel@lists.open-mpi.org> wrote:
> As a fellow Jenkins maintainer, thanks for all the work :).
>
> Brian
>
> On Jun 22, 2017, at 7:35 AM, Joshua Ladd wrote:
>
As a fellow Jenkins maintainer, thanks for all the work :).
Brian
On Jun 22, 2017, at 7:35 AM, Joshua Ladd <jladd.m...@gmail.com> wrote:
Update - Mellanox Jenkins is back to normal. All previously failing PRs have
been retriggered. Thanks for your patience.
Best,
Josh Ladd
On Wed, Jun 21, 2017 at 8:25 PM, Artem Polyakov wrote:
Hi Chris
Please go ahead and open a PR for master and I'll open corresponding ones
for the release branches.
Howard
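For context, the release-branch PRs are typically cherry-picks of the merged
master commit (a generic sketch, not Howard's actual commands; the branch
names and SHA are placeholders):

    # Once the master PR merges, carry the same commit to a release
    # branch and open a PR from the pushed branch.
    git fetch origin
    git checkout -b orte-clean-fix-v2.x origin/v2.x
    git cherry-pick -x <sha-of-master-commit>
    git push <your-fork> orte-clean-fix-v2.x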
Christoph Niethammer wrote on Thu., Jun 22, 2017 at 01:10:
> Hi Howard,
>
> Sorry, missed the new license policy. I added a Sign-off now.
> Shall I open a pull request?
>
> Best
Update - Mellanox Jenkins is back to normal. All previously failing PRs
have been retriggered. Thanks for your patience.
Best,
Josh Ladd
On Wed, Jun 21, 2017 at 8:25 PM, Artem Polyakov wrote:
> Brian, I'm going to push for the fix tonight. If it doesn't work, we will
> do as you advised.
>
> 2017-06…
Hi Howard,
Sorry, missed the new license policy. I added a Sign-off now.
Shall I open a pull request?
Best
Christoph
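For anyone else who trips over the same sign-off check, the usual way to add
it after the fact (a generic git sketch, not Christoph's exact commands):

    # Append a Signed-off-by trailer to the latest commit, then
    # update the branch behind the pull request.
    git commit --amend --signoff
    git push --force-with-lease origin <your-branch>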
----- Original Message -----
From: "Howard Pritchard"
To: "Open MPI Developers"
Sent: Wednesday, June 21, 2017 5:57:05 PM
Subject: Re: [OMPI devel] orte-clean not cleaning left…