On MacOS with gcc 7.3
> On Sep 11, 2018, at 3:02 PM, Jeff Squyres (jsquyres) via devel wrote:
>
> Ralph --
>
> What OS / compiler are you using?
>
> I just compiled on MacOS (first time in a while) and filed a PR and a few
> issues about the warnings I found, but I cannot replicate these warnings.
Ralph --
What OS / compiler are you using?
I just compiled on MacOS (first time in a while) and filed a PR and a few
issues about the warnings I found, but I cannot replicate these warnings. I
also built with gcc 7.3.0 on RHEL; couldn't replicate the warnings.
On MacOS, I'm using the default
I believe the problem is actually a little different from what you described. The
issue occurs whenever the number of procs multiplied by the PE value exceeds the
number of cores on a node. It is caused by the fact that we aren’t considering the
PE number when mapping processes - we only appear to be looking at it when
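(To make the scenario concrete - this is my own sketch, assuming a hypothetical
16-core node and a made-up binary name, not a command taken from the thread:

    mpirun -np 8 --map-by socket:PE=4 ./a.out
    # 8 procs x 4 PEs each = 32 cores requested, but this hypothetical node has only 16

If the PE value were considered during mapping, the job could presumably be spread
across more nodes or flagged as oversubscribed up front, rather than erroring only
when the span modifier is omitted.)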
Works for me.
> On Sep 11, 2018, at 12:35 PM, Ralph H Castain wrote:
>
> Hi folks
>
> Per today’s telecon, I have moved the Perl MTT client into its own
> repository: https://github.com/open-mpi/mtt-legacy. All the Python client
> code has been removed from that repo.
>
> The original MTT repo remains at https://github.com/open-mpi/mtt.
On Sep 11, 2018, at 2:17 PM, Jeff Squyres (jsquyres) via devel wrote:
>
>> diff --git a/VERSION b/VERSION
>> index 6fadf03..a9706a3 100644
>> --- a/VERSION
>> +++ b/VERSION
>
>> +libmpi_mpifh_so_version=61:0:21
>
> Just curious: any reason this one is 60 and all the others are 61?
Er -- I sai
On Sep 9, 2018, at 4:29 PM, Gitdub wrote:
>
> diff --git a/VERSION b/VERSION
> index 6fadf03..a9706a3 100644
> --- a/VERSION
> +++ b/VERSION
> +libmpi_mpifh_so_version=61:0:21
Geoff --
Just curious: any reason this one is 60 and all the others are 61?
--
Jeff Squyres
jsquy...@cisco.com
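(As context for the question above - my reading, not something confirmed in the
thread: the so_version values in Open MPI's VERSION file are libtool
current:revision:age triples, so the line in the diff is of the form:

    libmpi_mpifh_so_version=61:0:21   # current:revision:age, as passed to libtool -version-info

The annotation is mine; only the value itself comes from the diff.)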
Hi folks
Per today’s telecon, I have moved the Perl MTT client into its own repository:
https://github.com/open-mpi/mtt-legacy. All the Python client code has been
removed from that repo.
The original MTT repo remains at https://github.com/open-mpi/mtt. I have a PR
to remove all the Perl client
Here's the xml output from lstopo. Thank you for taking a look!
David
From: devel on behalf of Ralph H Castain
Sent: Monday, September 10, 2018 5:12 PM
To: OpenMPI Devel
Subject: Re: [OMPI devel] mpirun error when not using span
Could you please send the output
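(For anyone else reproducing this: one way to capture the topology as XML, assuming
hwloc's lstopo is installed and with "node.xml" as a placeholder filename of my own:

    lstopo node.xml
    # hwloc picks the XML output format from the .xml extension

The resulting file is what gets attached to threads like this one.)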
I notice from your configure log that you're building Mellanox MXM support.
Does that pull in libibverbs as a dependent library?
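(A quick way to test that - just a sketch using the standard Open MPI configure
switches, with a hypothetical install prefix:

    ./configure --without-verbs --without-mxm --prefix=$HOME/ompi-install
    # if the build succeeds once MXM is also disabled, MXM was what pulled libibverbs back in

)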
> On Sep 11, 2018, at 7:23 AM, Mijakovic, Robert wrote:
>
> Hi guys,
>
> I have configured OpenMPI to build with --without-verbs, but the build fails with an
> error sa