Hi Ufuk,

If you create a vtkCPPythonScriptPipeline and initialize it with the script file name (which has to be done on each process), everything is taken care of with respect to broadcasting the file contents from process 0 to the other processes. We aren't sophisticated enough to parse the Python script to see whether it imports other scripts that are not part of ParaView (e.g. paraview.simple) or Python (e.g. sys). That is why I recommended the first approach rather than the second one above. Depending on the compute platform and how many MPI processes are in the run, the difference may be negligible, but having 100K or more processes trying to access the same file can seriously slow down an HPC machine.
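For anyone curious what "process 0 reads, everyone else receives" looks like, here is a minimal sketch of the pattern. Note that vtkCPPythonScriptPipeline already does this internally, so you do not write this yourself; the `FakeComm` class below is a hypothetical stand-in that mimics an mpi4py-style `bcast` so the idea can be shown without an actual MPI run (in a real adaptor you would use the simulation's communicator):

```python
import os
import tempfile

class FakeComm:
    """Hypothetical stand-in for an MPI communicator (mimics mpi4py's bcast)."""
    def __init__(self, size):
        self.size = size
        self._buf = None  # holds the root's value between "ranks" in this simulation

    def bcast(self, obj, root=0):
        # In real MPI every rank calls bcast collectively and the root's value
        # wins; here we call ranks sequentially, so we just cache the root's value.
        if obj is not None:
            self._buf = obj
        return self._buf

def read_and_broadcast(comm, rank, path):
    """Only rank 0 touches the filesystem; every other rank receives the contents."""
    contents = None
    if rank == 0:
        with open(path) as f:
            contents = f.read()
    return comm.bcast(contents, root=0)

# Demonstration: write a tiny stand-in "Catalyst script", then fetch it on
# every simulated rank without any rank other than 0 opening the file.
fd, path = tempfile.mkstemp(suffix=".py")
with os.fdopen(fd, "w") as f:
    f.write("print('hello from catalyst script')\n")

comm = FakeComm(size=4)
scripts = [read_and_broadcast(comm, rank, path) for rank in range(comm.size)]
os.remove(path)
```

The point of the pattern is exactly the scaling concern above: one filesystem read instead of 100K simultaneous ones.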
Cheers,
Andy

On Tue, May 16, 2017 at 8:24 AM, Ufuk Utku Turuncoglu (BE) <u.utku.turunco...@be.itu.edu.tr> wrote:

> Thanks Andy. That is exactly what I am looking for. The broadcasting
> mechanism is not clear to me yet. Do I need to broadcast only the file
> names? Anyway, I will try to implement it and see what is going on there.
>
> Thanks again,
> Regards,
>
> --ufuk
>
> On 16/05/2017 14:58, Andy Bauer wrote:
>
>> Hi Ufuk,
>>
>> Unless I'm misunderstanding your question, I think you can get what you
>> want by adding multiple vtkCPPythonScriptPipelines to your
>> vtkCPProcessor object in your adaptor. Alternatively, if you want a
>> single master Catalyst script that handles other Catalyst scripts, you
>> can do something like the following:
>>
>> ================
>> import script_a
>> import script_b
>> import script_c
>>
>> def RequestDataDescription(datadescription):
>>     script_a.RequestDataDescription(datadescription)
>>     script_b.RequestDataDescription(datadescription)
>>     script_c.RequestDataDescription(datadescription)
>>
>> def DoCoProcessing(datadescription):
>>     script_a.DoCoProcessing(datadescription)
>>     script_b.DoCoProcessing(datadescription)
>>     script_c.DoCoProcessing(datadescription)
>> ===================
>>
>> The first way is the recommended one, though, since it should be more
>> efficient: process 0 reads each script and broadcasts its contents to
>> the other processes. The second method only does that for the master
>> script.
>>
>> Please let me know if this doesn't answer your question.
>>
>> Cheers,
>> Andy
>>
>> On Tue, May 16, 2017 at 5:46 AM, Ufuk Utku Turuncoglu (BE)
>> <u.utku.turunco...@be.itu.edu.tr> wrote:
>>
>>> Hi All,
>>>
>>> I just wonder whether it is possible to trigger multiple visualization
>>> pipelines at the same time with co-processing.
>>> The co-processing script generator plugin outputs only a single
>>> pipeline at a time, and that is fine, but what about combining multiple
>>> Python scripts (generated by the plugin) with a higher-level Python
>>> script to trigger multiple pipelines? I think this would be a much more
>>> efficient way to look at different parts of the data without writing to
>>> disk. I am not sure, but somebody else might have done this before.
>>>
>>> Regards,
>>>
>>> --ufuk
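P.S. The master-script pattern quoted above can be exercised outside of Catalyst if you want to see the fan-out behavior. In this sketch, `script_a`/`script_b`/`script_c` are hypothetical stand-ins (built with `types.SimpleNamespace`) for the scripts the co-processing plugin would export, and the data description is just a string; in a real run each would be a plugin-generated module and Catalyst would pass a vtkCPDataDescription:

```python
from types import SimpleNamespace

calls = []  # records (script name, entry point, data description) per call

def make_script(name):
    # Stand-in for a Catalyst script module exported by the plugin; a real
    # one would define RequestDataDescription and DoCoProcessing itself.
    return SimpleNamespace(
        RequestDataDescription=lambda dd, name=name: calls.append((name, "request", dd)),
        DoCoProcessing=lambda dd, name=name: calls.append((name, "coprocess", dd)),
    )

scripts = [make_script(n) for n in ("script_a", "script_b", "script_c")]

# The master script simply fans each Catalyst entry point out to every child.
def RequestDataDescription(datadescription):
    for s in scripts:
        s.RequestDataDescription(datadescription)

def DoCoProcessing(datadescription):
    for s in scripts:
        s.DoCoProcessing(datadescription)

# One co-processing step: Catalyst would invoke these two entry points per step.
RequestDataDescription("dd0")
DoCoProcessing("dd0")
```

Keeping the fan-out in a loop over a list (rather than three explicit calls) also makes it trivial to add a fourth pipeline later.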
_______________________________________________
Powered by www.kitware.com

Visit other Kitware open-source projects at http://www.kitware.com/opensource/opensource.html

Please keep messages on-topic and check the ParaView Wiki at: http://paraview.org/Wiki/ParaView

Search the list archives at: http://markmail.org/search/?q=ParaView

Follow this link to subscribe/unsubscribe:
http://public.kitware.com/mailman/listinfo/paraview