There's been a lot of turnover in this exact portion of the code base
on the SVN trunk in the last week or three.
Ralph -- can you comment on where we are?
On Apr 26, 2008, at 2:07 PM, Alberto Giannetti wrote:
Doesn't seem to work. This is the appfile I'm using:
# Application context files specify each sub-application in the
# parallel job, one per line.
# Server
-np 2 server
# Client
-np 1 client 0.1.0:2001
And the output:
mpirun --app ./appfile
Processor 0 (3659, Receiver) initialized
Processor 1 (
This scenario is known to be buggy in some versions of Open MPI. It is
now fixed in the SVN trunk and will be part of the 1.3 release.
As a quick fix for your application, you'll need to spawn both
applications with the same mpirun, using MPMD syntax. However, this
will have the adverse effect of hav
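For reference, the MPMD workaround suggested above would look something like the following command line, reusing the executable names and the client argument from the appfile earlier in the thread (a sketch of the colon-separated MPMD syntax, not a tested invocation):

```shell
# Launch both programs under one mpirun so they start in the same
# MPI universe; each colon-separated group is one sub-application.
mpirun -np 2 server : -np 1 client 0.1.0:2001
```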
I want to connect two MPI programs through the MPI_Comm_connect/
MPI_Comm_Accept API.
This is my server app:
int main(int argc, char* argv[])
{
  int rank, count;
  int i;
  float data[100];
  char myport[MPI_MAX_PORT_NAME];
  MPI_Status status;
  MPI_Comm intercomm;

  MPI_Init(&argc, &argv);
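The quoted program is cut off after MPI_Init. For readers following the thread, a typical continuation of this server-side pattern (a sketch of the standard MPI_Open_port/MPI_Comm_accept sequence, not Alberto's actual code) would be:

```c
/* Hypothetical continuation of the server shown above -- a sketch
 * of the usual port-based accept pattern, not the poster's code. */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char* argv[])
{
  int rank, count;
  float data[100];
  char myport[MPI_MAX_PORT_NAME];
  MPI_Status status;
  MPI_Comm intercomm;

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);

  /* Open a port and print it so the client can be started with it */
  MPI_Open_port(MPI_INFO_NULL, myport);
  printf("Processor %d initialized, port: %s\n", rank, myport);

  /* Block until a client calls MPI_Comm_connect on this port */
  MPI_Comm_accept(myport, MPI_INFO_NULL, 0, MPI_COMM_WORLD, &intercomm);

  /* Receive one message from the connected client */
  MPI_Recv(data, 100, MPI_FLOAT, MPI_ANY_SOURCE, MPI_ANY_TAG,
           intercomm, &status);
  MPI_Get_count(&status, MPI_FLOAT, &count);
  printf("Received %d floats\n", count);

  MPI_Comm_disconnect(&intercomm);
  MPI_Close_port(myport);
  MPI_Finalize();
  return 0;
}
```

The matching client would call MPI_Comm_connect with the port name (or, as in the appfile above, a hard-coded address passed on the command line).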