> are you sure you don't mean that only printf/std::cout from rank 0 is 
> visible?
I also thought that it might be a visibility issue, so I opened a file with 
std::ofstream on each rank, with the rank id encoded in the filename. Only one 
file ever gets created, though, and it is the one with “0” in the name.
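
For reference, this is roughly what that check looked like (a minimal sketch;
it assumes ParaView has already initialized MPI, and the helper name is just a
placeholder):

#include <mpi.h>
#include <fstream>
#include <string>

void WriteRankMarker()
{
  int rank = 0;
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);

  // One marker file per rank, e.g. rank_marker_0.txt, rank_marker_1.txt, ...
  std::ofstream out("rank_marker_" + std::to_string(rank) + ".txt");
  out << "RequestData reached on rank " << rank << "\n";
}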

> but in actual fact the other pvservers are fine. Create a sphere and check 
> if it has N pieces.
I did that and colored the sphere by vtkProcessId. The number of distinct ids 
indeed matches the number of ranks, so I guess nothing fundamental is wrong 
with the MPI setup within ParaView. I just can’t fathom why the reader plugin 
does not run in parallel. Just to be sure, I added a call 

MPI_Barrier(MPI_COMM_WORLD);

in RequestData and indeed, ParaView gets stuck there, as apparently the 
collective call is never issued from any rank != 0.
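
To see which ranks actually execute RequestData without deadlocking on a
barrier, one could also log the local process id via VTK’s global controller.
This is only a sketch (the helper name is made up), using API that should be
available to a ParaView plugin:

#include <vtkMultiProcessController.h>
#include <iostream>

void LogRequestDataEntry()
{
  vtkMultiProcessController* ctrl =
    vtkMultiProcessController::GetGlobalController();
  int rank = ctrl ? ctrl->GetLocalProcessId() : 0;
  int nproc = ctrl ? ctrl->GetNumberOfProcesses() : 1;

  // With the symptom described above, only "rank 0 of N" would ever appear.
  std::cerr << "RequestData entered on rank " << rank
            << " of " << nproc << std::endl;
}

If terminal output from ranks other than 0 is not visible, the same
information can of course be written to per-rank files as above.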

Michael
