Bummer - thanks for the update.  I will revert to 1.10.x for now, then.
Should I file a bug report for this on GitHub or elsewhere?  Or if there's
already an open issue for this, can you point me to it so I can keep track
of when it's fixed?  Do you have a rough estimate of when you expect a fix?

Thanks.

On Mon, Mar 13, 2017 at 10:45 AM, r...@open-mpi.org <r...@open-mpi.org> wrote:

> You should consider it a bug for now - it won’t work in the 2.0 series,
> and I don’t think it will work in the upcoming 2.1.0 release. Probably will
> be fixed after that.
>
>
> On Mar 13, 2017, at 5:17 AM, Adam Sylvester <op8...@gmail.com> wrote:
>
> As a follow-up, I tried this with Open MPI 1.10.4 and this worked as
> expected (the port formatting looks really different):
>
> $ mpirun -np 1 ./server
> Port name is 1286733824.0;tcp://10.102.16.135:43074+1286733825.0;tcp://10.102.16.135::300
> Accepted!
>
> $ mpirun -np 1 ./client "1286733824.0;tcp://10.102.16.135:43074+1286733825.0;tcp://10.102.16.135::300"
> Trying with '1286733824.0;tcp://10.102.16.135:43074+1286733825.0;tcp://10.102.16.135::300'
> Connected!
>
> I've found other posts from users asking about similar issues with the 2.x
> series - is this a bug?
>
> On Sun, Mar 12, 2017 at 9:38 PM, Adam Sylvester <op8...@gmail.com> wrote:
>
>> I'm using Open MPI 2.0.2 on RHEL 7.  I'm trying to use MPI_Open_port() /
>> MPI_Comm_accept() / MPI_Comm_connect().  My use case is that I'll have two
>> processes running on two machines that don't initially know about each
>> other (i.e. I can't do the typical mpirun with a list of IPs); eventually I
>> think I may need to use ompi-server to accomplish what I want but for now
>> I'm trying to test this out running two processes on the same machine with
>> some toy programs.
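>>
>> (For the eventual cross-machine case, my understanding is that ompi-server
>> acts as a standalone rendezvous point and both mpirun invocations get
>> pointed at it - roughly the sketch below, though I haven't verified the
>> exact option names or the URI file path against 2.0.2, so treat it as a
>> guess:
>>
>> $ ompi-server --report-uri /tmp/ompi-server.uri
>> $ mpirun --ompi-server file:/tmp/ompi-server.uri -np 1 ./server
>> $ mpirun --ompi-server file:/tmp/ompi-server.uri -np 1 ./client "<port name>"
>>
>> That's for later, though; everything below is two processes on one machine.)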
>>
>> server.cpp creates the port, prints it, and waits for a client to accept
>> using it:
>>
>> #include <mpi.h>
>> #include <iostream>
>>
>> int main(int argc, char** argv)
>> {
>>     MPI_Init(NULL, NULL);
>>
>>     char myport[MPI_MAX_PORT_NAME];
>>     MPI_Comm intercomm;
>>
>>     MPI_Open_port(MPI_INFO_NULL, myport);
>>     std::cout << "Port name is " << myport << std::endl;
>>
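>>     // Blocks until a client calls MPI_Comm_connect() with this port name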
>>     MPI_Comm_accept(myport, MPI_INFO_NULL, 0, MPI_COMM_SELF, &intercomm);
>>
>>     std::cout << "Accepted!" << std::endl;
>>
>>     MPI_Finalize();
>>     return 0;
>> }
>>
>> client.cpp takes in this port on the command line and tries to connect to
>> it:
>>
>> #include <mpi.h>
>> #include <iostream>
>> #include <string>
>>
>> int main(int argc, char** argv)
>> {
>>     MPI_Init(NULL, NULL);
>>
>>     MPI_Comm intercomm;
>>
>>     const std::string name(argv[1]);
>>     std::cout << "Trying with '" << name << "'" << std::endl;
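>>     // Blocks until the server accepts the connection (or the attempt times out)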
>>     MPI_Comm_connect(name.c_str(), MPI_INFO_NULL, 0, MPI_COMM_SELF,
>> &intercomm);
>>
>>     std::cout << "Connected!" << std::endl;
>>
>>     MPI_Finalize();
>>     return 0;
>> }
>>
>> I run the server first:
>> $ mpirun ./server
>> Port name is 2720137217.0:595361386
>>
>> Then a second later I run the client:
>> $ mpirun ./client 2720137217.0:595361386
>> Trying with '2720137217.0:595361386'
>>
>> Both programs hang for a while and then eventually time out.  I have a
>> feeling I'm misunderstanding something and doing something dumb, but from
>> all the examples I've seen online it seems like this should work.
>>
>> Thanks for the help.
>> -Adam
>>
>
_______________________________________________
users mailing list
users@lists.open-mpi.org
https://rfd.newmexicoconsortium.org/mailman/listinfo/users
