Hi Gilles,

I'm using both Open MPI and Intel MPI, and with both I have problems with
the communicators. Therefore, I tried to get some information about them.

Thanks a lot for your help.
Have a nice day.

On 06/24/2022 02:14 PM, Gilles Gouaillardet via users wrote:
> Guillaume,
> 
> MPI_Comm is an opaque handle that should not be interpreted by an end user.
> 
> Open MPI chose to implement it as an opaque pointer, and MPICH chose to
> implement it as a 32-bit unsigned integer.
> The 44000000 value strongly suggests you are using MPICH, and hence you
> are posting to the wrong mailing list.
> 
> 
> Cheers,
> 
> Gilles
> 
> On Fri, Jun 24, 2022 at 9:06 PM Guillaume De Nayer via users
> <users@lists.open-mpi.org> wrote:
> 
>     Hi Gilles,
> 
>     MPI_COMM_WORLD is positive (44000000).
> 
>     In a short program I wrote, I have something like this:
> 
>     MPI_Comm_dup(MPI_COMM_WORLD, &world);
>     cout << "intra-communicator: " << "world" << "---" << hex << world
>     << endl;
> 
>     It prints "84000006" (in hex).
> 
>     Later I have:
> 
>     MPI_Comm_accept(port_name, MPI_INFO_NULL, 0, world, &interClient);
>     cout << "intercommunicator interClient=" << interClient << endl;
> 
>     After a connection from a third-party client it prints "c4000003"
>     (in hex).
> 
>     Both 84000006 and c4000003 are negative integers when read as signed
>     32-bit values in decimal.
> 
>     I don't know if this is "normal". Therefore I'm looking for the rules
>     on communicators and intercommunicators.
> 
>     Regards,
>     Guillaume
> 
> 
>     On 06/24/2022 11:56 AM, Gilles Gouaillardet via users wrote:
>     > Guillaume,
>     >
>     > what do you mean by "the intercommunicators are all negative"?
>     >
>     >
>     > Cheers,
>     >
>     > Gilles
>     >
>     > On Fri, Jun 24, 2022 at 4:23 PM Guillaume De Nayer via users
>     > <users@lists.open-mpi.org> wrote:
>     >
>     >     Hi,
>     >
>     >     I am new to this list. Let me introduce myself briefly: I am a
>     >     researcher in fluid mechanics. In this context I am using
>     >     software based on MPI.
>     >
>     >     I am facing a problem:
>     >     - Three programs form a computational framework. Soft1 is a
>     >     coupling program, i.e., it opens an MPI port at the beginning.
>     >     Soft2 and Soft3 are clients, which connect to the coupling
>     >     program using MPI_Comm_connect.
>     >     - After the start and the connections of Soft2 and Soft3 with
>     >     Soft1, it hangs.
>     >
>     >     I started to debug this issue and, as usual, I found another
>     >     issue (or perhaps it is not an issue):
>     >     - The intercommunicators I get between Soft1-Soft2 and
>     >     Soft1-Soft3 are all negative (running on CentOS 7 with the
>     >     InfiniBand Mellanox OFED driver).
>     >     - Is there some standard about communicators? I can't find
>     >     anything about this topic.
>     >     - What is a valid communicator or intercommunicator?
>     >
>     >     Thanks a lot.
>     >     Regards,
>     >     Guillaume
>     >
> 
> 

