In general, Open MPI doesn't have anything to do with X forwarding. However, if you're using ssh to start up your processes, ssh may configure X forwarding for you (depending on your local system setup). But OMPI closes down the ssh channels once the applications have launched (there's no need to keep them open), so any X forwarding that may have been set up will be closed down.

The *easiest* way to set up X forwarding is simply to allow X connections to your local host from the node(s) that will be running your application. E.g., use the "xhost" command to add the target nodes to the access list, and then have mpirun export a suitable DISPLAY variable, such as:

export DISPLAY=my_hostname:0
mpirun -x DISPLAY ...

The "-x DISPLAY" option tells Open MPI to export the value of the DISPLAY variable to all nodes when running your application.
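Putting the pieces together, a sketch of a full session might look like the following (the node names nodeA/nodeB are hypothetical; substitute your own hostnames and hostfile):

```shell
# On your workstation: allow the compute nodes to connect to your X server
# (nodeA and nodeB are placeholder names)
xhost +nodeA +nodeB

# Point DISPLAY at your workstation's X server (display :0)
export DISPLAY=my_hostname:0

# -x DISPLAY tells mpirun to export the variable to every rank on every node
mpirun -x DISPLAY -hostfile myhostfile -np 2 ./DistributedData
```

Note that xhost host-based access is convenient but unauthenticated, so it's reasonable on a trusted cluster network but not something to leave enabled on an open one.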

Hope this helps.


On May 30, 2008, at 1:24 PM, Cally K wrote:

Hi, I have a problem running DistributedData.cxx (it is a VTK file); I need to be able to see the rendering from my computer.

However, I have a problem running the executable. I loaded the executable onto 2 machines,

and I am accessing it from my computer (DHCP enabled).

After running the following command (I use Open MPI):

mpirun -hostfile myhostfile -np 2 -bynode ./DistributedData

I keep getting these errors:

ERROR: In /home/kalpanak/Installation_Files/VTKProject/VTK/Rendering/vtkXOpenGLRenderWindow.cxx, line 326
vtkXOpenGLRenderWindow (0x8664438): bad X server connection.


ERROR: In /home/kalpanak/Installation_Files/VTKProject/VTK/Rendering/vtkXOpenGLRenderWindow.cxx, line 169
vtkXOpenGLRenderWindow (0x8664438): bad X server connection.


[vrc1:27394] *** Process received signal ***
[vrc1:27394] Signal: Segmentation fault (11)
[vrc1:27394] Signal code: Address not mapped (1)
[vrc1:27394] Failing at address: 0x84
[vrc1:27394] [ 0] [0xffffe440]
[vrc1:27394] [ 1] ./DistributedData(_ZN22vtkXOpenGLRenderWindow20GetDesiredVisualInfoEv+0x229) [0x8227e7d]
[vrc1:27394] [ 2] ./DistributedData(_ZN22vtkXOpenGLRenderWindow16WindowInitializeEv+0x340) [0x8226812]
[vrc1:27394] [ 3] ./DistributedData(_ZN22vtkXOpenGLRenderWindow10InitializeEv+0x29) [0x82234f9]
[vrc1:27394] [ 4] ./DistributedData(_ZN22vtkXOpenGLRenderWindow5StartEv+0x29) [0x82235eb]
[vrc1:27394] [ 5] ./DistributedData(_ZN15vtkRenderWindow14DoStereoRenderEv+0x1a) [0x82342ac]
[vrc1:27394] [ 6] ./DistributedData(_ZN15vtkRenderWindow10DoFDRenderEv+0x427) [0x8234757]
[vrc1:27394] [ 7] ./DistributedData(_ZN15vtkRenderWindow10DoAARenderEv+0x5b7) [0x8234d19]
[vrc1:27394] [ 8] ./DistributedData(_ZN15vtkRenderWindow6RenderEv+0x690) [0x82353b4]
[vrc1:27394] [ 9] ./DistributedData(_ZN22vtkXOpenGLRenderWindow6RenderEv+0x52) [0x82245e2]
[vrc1:27394] [10] ./DistributedData [0x819e355]
[vrc1:27394] [11] ./DistributedData(_ZN16vtkMPIController19SingleMethodExecuteEv+0x1ab) [0x837a447]
[vrc1:27394] [12] ./DistributedData(main+0x180) [0x819de78]
[vrc1:27394] [13] /lib/libc.so.6(__libc_start_main+0xe0) [0xb79c0fe0]
[vrc1:27394] [14] ./DistributedData [0x819dc21]
[vrc1:27394] *** End of error message ***
mpirun noticed that job rank 0 with PID 27394 on node .... exited on signal 11 (Segmentation fault).


Maybe I am not doing the X forwarding properly, but has anyone ever encountered the same problem? It works fine on one PC. I read the mailing list, but I don't know if my problem is similar to theirs; I even tried changing the DISPLAY environment variable.


This is what I want to do

My mpirun should run on 2 machines (A and B) and I should be able to view the output on my PC.
Are there any specific commands to use?


_______________________________________________
users mailing list
us...@open-mpi.org
http://www.open-mpi.org/mailman/listinfo.cgi/users


--
Jeff Squyres
Cisco Systems
