Fixed on master
> On Jul 13, 2016, at 12:47 PM, Jeff Squyres (jsquyres)
> wrote:
>
> I literally just noticed that this morning (that singleton was broken on
> master), but hadn't gotten to bisecting / reporting it yet...
>
> I also haven't tested 2.0.0. I really hope singletons aren't broken then...
Paul,
Thanks for testing the workaround.
/* I was on a trip and could not do it myself */
At first glance, I agree with Jeff: the root cause seems to be a CMake
bug.
/* I cannot find any rationale for automatically including directories
that were not requested by the user */
Eric,
Open MPI 2.0.0 has been released, so the fix should land in the v2.x
branch shortly.
If I understand correctly, your script downloads/compiles Open MPI and
then downloads/compiles PETSc.
If this is correct, then for the time being, feel free to patch Open
MPI v2.x before compiling it; the
Hi,
FYI: I've tested SHA e28951e, from a git clone launched around 01h19:
http://www.giref.ulaval.ca/~cmpgiref/dernier_ompi/2016.07.13.01h19m30s_config.log
Eric
On 13/07/16 04:01 PM, Pritchard Jr., Howard wrote:
Jeff,
I think this was fixed in PR 1227 on v2.x
Howard
--
Howard Pritchard
HPC-DES
Los Alamos National Laboratory
On 7/13/16, 1:47 PM, "devel on behalf of Jeff Squyres (jsquyres)"
wrote:
>I literally just noticed that this morning (that singleton was broken on
>master), but hadn't gotten to bisecting / reporting it yet...
I literally just noticed that this morning (that singleton was broken on
master), but hadn't gotten to bisecting / reporting it yet...
I also haven't tested 2.0.0. I really hope singletons aren't broken then...
/me goes to test 2.0.0...
Whew -- 2.0.0 singletons are fine. :-)
> On Jul 13, 20
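For reference, a minimal sketch of such a singleton check (a singleton
is an MPI program started directly, without mpirun; the file name and
build line below are my own, not from the thread):

    /* singleton.c -- start without mpirun to exercise singleton init */
    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int size;
        MPI_Init(&argc, &argv);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        printf("singleton ok, world size = %d\n", size); /* expect 1 */
        MPI_Finalize();
        return 0;
    }

Build it with mpicc and run it as plain ./singleton, with no mpirun:

    mpicc singleton.c -o singleton && ./singleton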
Hmmm…I see where the singleton on master might be broken - will check later
today
> On Jul 13, 2016, at 11:37 AM, Eric Chamberland
> wrote:
>
> Hi Howard,
>
> ok, I will wait for 2.0.1rcX... ;)
>
> I've put in place a script to download/compile OpenMPI+PETSc(3.7.2) and our
> code from the g
Hi Howard,
ok, I will wait for 2.0.1rcX... ;)
I've put in place a script to download/compile OpenMPI+PETSc(3.7.2) and
our code from the git repos.
Now I am in a somewhat uncomfortable situation where neither the
ompi-release.git nor the ompi.git repos are working for me.
The first gives me the
Hi,
Is there any option in mpirun that enables us to switch dynamically
from shmem mode to knem/xpmem mode beyond a specifiable message size?
This is because, according to my tests, knem performs better than shmem
only at large message sizes.
--
Abhishek
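A hedged pointer rather than a definitive answer: in Open MPI 2.x the
vader BTL already switches to its single-copy mechanism (knem, xpmem
or CMA) only above the eager limit, so the crossover point can be
moved with MCA parameters, along these lines:

    mpirun --mca btl vader,self \
           --mca btl_vader_single_copy_mechanism knem \
           --mca btl_vader_eager_limit 32768 ./a.out

The parameter names above are from the 2.x vader BTL; check what your
build actually exposes with ompi_info --param btl vader.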
Thanks for the report. I don't know much about OSHMEM, but I'm guessing the
files were laid out that way for a reason (e.g., maybe the OSHMEM spec calls
for both of those files to exist?).
I've filed an issue here to track it:
https://github.com/open-mpi/ompi/issues/1868
Additionally, hav
Thanks Ben. Rainer Keller just filed a PR for this -- we'll get it in v2.0.1:
https://github.com/open-mpi/ompi/pull/1867
> On Jul 12, 2016, at 12:08 AM, Ben Menadue wrote:
>
> Hi,
>
> Looks like there's a #include missing from
> oshmem/shmem/fortran/shmem_put_nb_f.c. It's causing MCA_SPM
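For context, the class of fix involved here is a one-line include; the
exact header below is an assumption on my part (it is where the
MCA_SPML_CALL dispatch macro is declared), not copied from the PR:

    /* hypothetical fix sketch for shmem_put_nb_f.c: declare the
       MCA_SPML_CALL macro before it is used (header path assumed) */
    #include "oshmem/mca/spml/spml.h"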
Hi Eric,
Thanks very much for finding this problem. We decided that, in order
to have a reasonably timely release, we'd triage issues and turn
around a new RC only if something drastic appeared. We want to fix
this issue (and it will be fixed), but we've decided to defer the fix
for this issue to a 2
Hi Gilles,
On 07/13/16 01:10, Gilles Gouaillardet wrote:
Paul,
The two header files in include/mpp simply include the file with the same name
in the upper directory.
Yessir!
(and CMake does not care about the upper directory, and so builds an
infinite include loop)
A simple workaround is to replace these tw
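To make the loop concrete, a sketch of the layout as Gilles describes
it (the wrapper contents are assumed from his description, not copied
from the tree):

    /* include/shmem.h -- the real OpenSHMEM header (contents elided) */

    /* include/mpp/shmem.h -- a one-line compatibility wrapper: */
    #include "../shmem.h"

CMake's implicit dependency scanner apparently resolves that relative
include back to mpp/shmem.h itself rather than to the upper directory,
so the scan recurses forever.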