Hi all,
Sorry for the noise. Was intended as a private reply.
Cheers,
Marc-Andre
On 04.02.20 08:10, Marc-André Hermanns wrote:
> Hi Jeff, Hi Martin,
>
> argh ... sorry for being so late on this (see my other mail), but
> could you take the two calls
>
> MPI_T_pvar_handle_alloc
> MPI_T_cvar_han
Hi Jeff, Hi Martin,
argh ... sorry for being so late on this (see my other mail), but
could you take the two calls
MPI_T_pvar_handle_alloc
MPI_T_cvar_handle_alloc
out of the embiggening? The MPI_Count for the count argument does not
really help in practice, so we (the tools WG) decided we would
The Embiggenment WG is ready for a formal reading at the Feb 2020 Portland MPI
Forum physical meeting.
Issue: https://github.com/mpi-forum/mpi-issues/issues/137
PR: https://github.com/mpi-forum/mpi-standard/pull/132
PDF: attached to Feb 3, 2020 comments on the issue/PR
(this PDF inclu
Thank you to all Chapter Chairs/Committees who completed reviews. The
Pythonization effort is nearly complete.
Note that this is essentially a "ticket 0" PR. Based on the feedback from the
ABQ Forum meeting and discussions afterwards, we strove to make Pythonization
as much of a no-op as po
We’ll be moving from our existing Jenkins server to GitHub Actions. The benefit
is that this runs on GitHub’s servers and is easy to update when we need to
update the environment (e.g. requiring a newer version of Python like we do
now). I’ve been able to test this in a side repository and it wo
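For reference, a GitHub Actions workflow of the kind described might look like the sketch below. The file name, job layout, Python version, and build command are my assumptions, not the actual configuration in the mpi-forum repository:

```yaml
# .github/workflows/build.yml -- hypothetical sketch, not the real workflow
name: Build MPI Standard
on: [push, pull_request]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v1
        with:
          python-version: '3.8'   # the "newer version of Python" mentioned above
      - name: Build the document
        run: make                 # assumes the repo's existing Makefile target
```

The practical benefit is that the runners are hosted by GitHub, so updating the build environment becomes a one-line change in this file rather than maintenance work on a Jenkins server.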
Wesley,
I do not have a strong opinion on this question. However, you keep
mentioning that the testing infrastructure is broken. I wasn't at the
last meeting so I must have missed the discussion there. How would
moving to a repo restore the testing infrastructure?
Cheers
Joseph
On 2/3/20 8:5
I really want to emphasize how little work this should be for each person. For
people with lots of PRs (like you), I estimate it’ll be less than 15 minutes
before the meeting and less than 15 minutes afterward. For people with no open
pull requests, it’s 0 minutes. If people are having trouble,
Hi all,
I had to do a small bugfix in #153
Bug in MPI_NEIGHBOR_ALLTOALL with 1 or 2 processes in the cyclic Cartesian case
Therefore, the latest PDF for errata issue #153 is:
https://github.com/mpi-forum/mpi-issues/files/4149883/mpi-report-issue153-neighbor-errata-2020-01-30-annotated-corr-2020-02-03.pdf
Hi Wesley,
> This change really won’t be as painful as it sounds
Then why do I have to do a bunch of unexpected/unscheduled backup and
book-keeping work before an arbitrarily chosen deadline? I’m not sure that
springing this surprise on folks is the wisest course of action.
Cheers,
Dan.
—
Dr Dan
Hi all,
The Sessions WG would like to announce a no-no vote for changes since the last
face-to-face meeting and the 1st vote for the whole sessions proposal. If the
no-no vote fails, then I will read the whole ticket again and reset the
procedural clock.
The ticket/issue is #103: https://githu
Hello Everyone,
I'd like to announce the following tickets for a second vote at the MPI Forum
in Portland. They are unchanged from the tickets that passed first vote in
Albuquerque.
Ticket #146: https://github.com/mpi-forum/mpi-issues/issues/146 This solution
uses a single function MPI_Info_g
Hi all,
The Semantic Terms Working Group would like to request
- a re-reading of issue #96 "Semantic Terms"
- a no-no vote for the changes added since Albuquerque:
-- removing changes to MPI-3.1 that were never intended
   (correct definition of local and non-local)
Hi all,
The Hardware Topologies Working Group would like to additionally request
- a re-reading of issue #120 MPI_Cart/Dims_create_weighted
- a no-no vote for the changes added since Albuquerque:
-- more readable text
-- a significantly shorter halo example
- a full readi
Hi all,
The Hardware Topologies Working Group would like to request the first
vote for the following items:
- Hardware split, guided mode
* Issue: https://github.com/mpi-forum/mpi-issues/issues/132
* PR: https://github.com/mpi-forum/mpi-standard/pull/142
* PDF:
https://github.com/mpi-forum/mpi
Because in the meantime, our testing remains broken and we don’t have another
solution to fix it at the moment. We really want to have that testing fix.
This change really won’t be as painful as it sounds. Almost no Git
commands are required: no rebasing, merging, etc.
> On Feb 3, 2020, at 9
Why aren't we doing this once we have finished 4.0, i.e., merged all the
voted-in pull requests into the standard?
- Original Message -
> From: "Main MPI Forum mailing list"
> To: "Main MPI Forum mailing list"
> Cc: "Wesley Bland"
> Sent: Monday, February 3, 2020 4:38:48 PM
> Subject: [Mpi-fo
Hi folks,
** Look at the numbered list for your action items before 2020-02-18! **
Since we moved from Subversion to GitHub, there’s been a back and forth on
whether the MPI Standard Source code should be open source or closed source. As
you know (since most of you have access to it), we decided