[OMPI users] orted seg fault when using MPI_Comm_spawn on more than one host

2015-01-26 Thread Evan
…entries in the hostfile are 'localhost'. However, if I add a host that isn't local I get a segmentation fault from the orted process. In any case, I distilled my example down as small as I could. I've attached the C code of the master and the hostfile I'm using. He…

Re: [OMPI users] orted seg fault when using MPI_Comm_spawn on more than one host

2015-02-03 Thread Evan Samanas
…al use I'm launching something that calls MPI_Init. Evan

Re: [OMPI users] orted seg fault when using MPI_Comm_spawn on more than one host

2015-02-03 Thread Evan Samanas
Setting these environment variables did indeed change the way mpirun maps things, and I didn't have to specify a hostfile. However, setting these for my MPI_Comm_spawn code still resulted in the same segmentation fault. Evan On Tue, Feb 3, 2015 at 10:09 AM, Ralph Castain wrote: > If…

Re: [OMPI users] orted seg fault when using MPI_Comm_spawn on more than one host

2015-02-03 Thread Evan Samanas
…it > should resolve that problem. > > I agree that providing either hostfile or host as an Info key will cause > the program to segfault - I'm working on that issue. > > > On Tue, Feb 3, 2015 at 3:46 PM, Evan Samanas > wrote: > >> Setting these environment variable…

Re: [OMPI users] orted seg fault when using MPI_Comm_spawn on more than one host

2015-02-04 Thread Evan Samanas
…with localhost and 5 remote hosts, here's the output: evan@lasarti:~/devel/toy_progs/mpi_spawn$ mpicc simple_spawn.c -o simple_spawn evan@lasarti:~/devel/toy_progs/mpi_spawn$ ./simple_spawn [pid 5703] starting up! 0 completed MPI_Init Parent [pid 5703] about to spawn! [lasarti:05703] [[14661…
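The transcript above suggests a self-spawning test program: the parent prints a startup line, calls MPI_Init, and then spawns a child copy of itself. A minimal sketch in that spirit is below; this is an assumption about the shape of simple_spawn.c, not the exact program posted to the list.

```c
/* Hedged sketch of a self-spawning MPI test program, modeled on the
 * output shown in the thread. Not the original simple_spawn.c. */
#include <mpi.h>
#include <stdio.h>
#include <unistd.h>

int main(int argc, char **argv) {
    MPI_Comm parent, intercomm;
    int rank;

    printf("[pid %ld] starting up!\n", (long)getpid());
    MPI_Init(&argc, &argv);
    MPI_Comm_get_parent(&parent);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (parent == MPI_COMM_NULL) {
        /* Parent side: launch one child copy of this same binary. */
        printf("%d completed MPI_Init\n", rank);
        printf("Parent [pid %ld] about to spawn!\n", (long)getpid());
        MPI_Comm_spawn(argv[0], MPI_ARGV_NULL, 1, MPI_INFO_NULL,
                       0, MPI_COMM_WORLD, &intercomm, MPI_ERRCODES_IGNORE);
        MPI_Comm_disconnect(&intercomm);
    } else {
        /* Child side: report in and disconnect from the parent. */
        printf("Child [pid %ld] spawned.\n", (long)getpid());
        MPI_Comm_disconnect(&parent);
    }

    MPI_Finalize();
    return 0;
}
```

Per the thread, the crash was triggered when an MPI_Info object carrying a "host" or "hostfile" key was passed in place of MPI_INFO_NULL (via MPI_Info_create and MPI_Info_set) so that the children land on remote hosts; with MPI_INFO_NULL and only localhost, the spawn succeeded.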

Re: [OMPI users] orted seg fault when using MPI_Comm_spawn on more than one host

2015-02-06 Thread Evan Samanas
…ies if I'm misguided. Evan On Thu, Feb 5, 2015 at 9:51 AM, Ralph Castain wrote: > Okay, I tracked this down - thanks for your patience! I have a fix pending > review. You can track it here: > > https://github.com/open-mpi/ompi-release/pull/179 > > > On Feb 4, 2015, at 5:14…

[OMPI users] Problems Using PVFS2 with OpenMPI

2010-01-12 Thread Evan Smyth
I am unable to use PVFS2 with OpenMPI in a simple test program. My configuration is given below. I'm running on RHEL5 with GigE (probably not important). OpenMPI 1.4 (had the same issue with 1.3.3) is configured with ./configure --prefix=/work/rd/evan/archives/openmpi/openmpi/1.4/enable…

Re: [OMPI users] Problems Using PVFS2 with OpenMPI

2010-01-14 Thread Evan Smyth
Build pvfs2 correctly (I get conflicting info on whether the "--with-mpi=..." flag is needed, but FWIW, this is how I built it, and it installs into /usr/local, which is its default location)... cd setenv CFLAGS -fPIC ./configure --with-mpi=/work/rd/evan/archives/openmpi/openmpi/1.4/ena…
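The build steps described in the excerpt (a csh session, truncated in the archive) can be sketched roughly as follows. The PVFS2 source directory name is a placeholder; the Open MPI install prefix is taken from the post, and the thread itself disputes whether --with-mpi is required at all.

```shell
# Hedged sketch of the PVFS2 build described above; directory names
# are assumptions, and the post used csh 'setenv' rather than sh.
cd pvfs2-src                 # placeholder for the unpacked PVFS2 source tree
export CFLAGS=-fPIC          # position-independent code, as in the post
./configure --with-mpi=/work/rd/evan/archives/openmpi/openmpi/1.4  # possibly optional
make
make install                 # installs under /usr/local by default
```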

[OMPI users] openmpi equivalent to mpich serv_p4 daemon

2007-01-19 Thread Evan Smyth
…a trivial problem, but I'm hoping someone can give me an openmpi equivalent usage example. Thanks in advance, Evan -- Evan Smyth e...@dreamworks.com Dreamworks Animation 818.695.4105, Riverside 146