I built the ParaView server with MPI on an HPC cluster with OSMesa support, since the
cluster does not have graphics hardware. The MPI compiler used to build
ParaView is mpich2/gnu412x64/1.4-shared. The ParaView server built successfully on
the cluster without any errors. I can successfully connect to the
driver is up to date with the GeForce GTX 580 graphics card.
Thanks for any ideas on what could cause this problem!
Hong
From: paraview-boun...@paraview.org [paraview-boun...@paraview.org] on behalf
of Hong Yi [hon...@renci.org]
Sent: Friday, February 22, 2013 12:27
From: paraview-boun...@paraview.org [mailto:paraview-boun...@paraview.org] On
Behalf Of Hong Yi
Sent: Wednesday, February 27, 2013 12:26 PM
To: paraview@paraview.org
Subject: Re: [Paraview] floating point exception error when doing slice filter
(only happens when running pvserver remotely)
The problem is solved after I
disabled FPE. Thanks again for your helpful information!
Best,
Hong
From: Burlen Loring [mailto:blor...@lbl.gov]
Sent: Wednesday, February 27, 2013 3:39 PM
To: Hong Yi
Cc: paraview@paraview.org
Subject: Re: [Paraview] floating point exception error when doing slice filter
(only
Hello,
I built the ParaView server, version 3.14.1, on an HPC cluster with all
co-processing-related flags turned on, including enabling co-processing, all
adaptors, plugins, etc., and built the ParaView client/server with no MPI on a
local Linux machine with co-processing enabled as well. Then I
. At that point, I might have more specific
questions.
Thanks again,
Hong
From: Andy Bauer [mailto:andy.ba...@kitware.com]
Sent: Thursday, March 21, 2013 1:59 PM
To: Hong Yi
Cc: paraview@paraview.org
Subject: Re: [Paraview] Live data source for co-processing in Paraview version
3.14.1
Hi Hong,
The live
to the Catalyst Users Guide to be out soon!
Thanks,
Hong
From: Andy Bauer [mailto:andy.ba...@kitware.com]
Sent: Thursday, March 21, 2013 3:25 PM
To: Hong Yi
Cc: paraview@paraview.org
Subject: Re: [Paraview] Live data source for co-processing in Paraview version
3.14.1
I'm working on a Catalyst
Bauer [mailto:andy.ba...@kitware.com]
Sent: Friday, April 05, 2013 2:21 PM
To: Hong Yi
Cc: paraview@paraview.org
Subject: Re: [Paraview] Live data source for co-processing in Paraview version
3.14.1
Hi Hong,
That Catalyst Users Guide is still a couple of weeks away.
To clear up your confusion
appreciated!
Thanks again,
Hong
From: Andy Bauer [mailto:andy.ba...@kitware.com]
Sent: Saturday, April 06, 2013 12:15 PM
To: Hong Yi
Cc: paraview@paraview.org
Subject: Re: [Paraview] Live data source for co-processing in Paraview version
3.14.1
Hi,
I inlined the responses.
Andy
On Fri, Apr 5
Hello,
I am working to build an in-situ visualization for a phasta simulation. I am able to
build phasta linked to the ParaView v3.98.1 coprocessing library and run the phasta
simulation linked to the coprocessing library to process a Python pipeline in situ
for each time step iteration. In the pipeline, there are
libs. It looks like the only
way to figure out which libraries are missing is through CMake, so I will give
it a try.
Thanks,
Hong
From: Andy Bauer [mailto:andy.ba...@kitware.com]
Sent: Monday, April 22, 2013 3:22 PM
To: Hong Yi
Cc: paraview@paraview.org
Subject: Re: [Paraview] WriteAllImages
Bauer [mailto:andy.ba...@kitware.com]
Sent: Wednesday, April 24, 2013 5:15 PM
To: Hong Yi; paraview@paraview.org
Subject: Re: [Paraview] WriteAllImages fails when doing phasta in-situ viz
linked with ParaView v3.98.1 coprocessing lib
Hi Hong,
Please keep the discussion on the mailing list so
I am running a simulation in situ, linked with the ParaView CoProcessing library,
with a pipeline of Slice -> Integrate Variables
Filter -> ParallelUnstructuredGridWriter, so that the integrated velocity value for a
slice at each time step can be written out while the simulation runs
in situ.
exports one file for each time step
regardless of the number of processors it runs on.
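For parallel VTK XML writers, the usual convention is one summary file per time step plus one piece file per rank that the summary references. A minimal sketch of that layout (the base name, directory, and exact naming pattern here are assumptions for illustration; only the summary-plus-pieces convention reflects typical parallel-writer behavior):

```python
# Sketch of the file layout a parallel VTK XML writer typically produces:
# one .pvtu summary per time step plus one .vtu piece per MPI rank.
# Names are hypothetical; real writers choose their own patterns.
def expected_files(base, timestep, num_ranks):
    """Return the per-time-step summary file and its per-rank pieces."""
    summary = f"{base}_{timestep}.pvtu"         # one summary per time step
    pieces = [f"{base}_{timestep}_{r}.vtu"      # one piece per MPI rank
              for r in range(num_ranks)]
    return summary, pieces

summary, pieces = expected_files("slice", 10, 4)
```

So "one file per time step" typically refers to the summary; the per-rank data lives in the referenced piece files.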
Thanks for any more information you can provide!
Hong
From: Andy Bauer [mailto:andy.ba...@kitware.com]
Sent: Thursday, April 25, 2013 7:43 PM
To: Hong Yi
Cc: paraview@paraview.org
Subject: Re: [Paraview
To: Hong Yi
Cc: paraview@paraview.org
Subject: Re: [Paraview] Output of ParallelUnstructuredGridWriter running on
multiple nodes
I can think of two options for this, the first is modifying the co-processing
script to get the data you want directly from the output without writing it to
disk. Look
so that the simulation code can determine
when it has reached a steady state.
Best regards,
Hong
From: Andy Bauer [mailto:andy.ba...@kitware.com]
Sent: Tuesday, April 30, 2013 10:36 PM
To: Hong Yi
Cc: paraview@paraview.org
Subject: Re: [Paraview] Output of ParallelUnstructuredGridWriter running
I observed that when I build a pipeline that includes two slice views and export it
for coprocessing with both views output, the coprocessing run with such a pipeline
produces output images in which the legends and axes of the two slice views are
overlapped. See the two attached images for details. The slices are not
I found that the bubble contours do not appear in the output image in
coprocessing, although the slices appear as expected in the output image. I see
the bubble contours when setting up the pipeline in the ParaView client, so I know
the bubbles are there, but the bubble contours do not appear in the output image
image in coprocessing in version 3.98.1?
Many thanks for looking into this!
Hong
From: Andy Bauer [mailto:andy.ba...@kitware.com]
Sent: Tuesday, May 14, 2013 10:00 AM
To: Hong Yi
Cc: paraview@paraview.org
Subject: Re: [Paraview] Overlapped legend and axis in coprocessing when two
views
I've been working on building ParaView on Oak Ridge's HPC system Titan (a Cray machine)
so that its coprocessing/Catalyst libraries can be linked to the PHASTA simulation code
running on Titan, for in-situ coprocessing as well as live monitoring via
pvserver, which are already built and can run successfully on
Burlen and David,
Many thanks for all the useful information. I will try the suggestions and
report back if I run into further issues.
Thanks again,
Hong
From: David E DeMarle [mailto:dave.dema...@kitware.com]
Sent: Friday, June 07, 2013 1:38 PM
To: Burlen Loring
Cc: Hong Yi; paraview
Sorry to spam the list, but I have not received a single email from this list
for about a week and a half, which is really unusual. This is just a test email
to see whether I receive it, to make sure nothing has quietly gone wrong on
my side.
Thanks,
Hong
and really appreciate any suggestions and comments
you can provide.
Thanks,
Hong
From: paraview-boun...@paraview.org [paraview-boun...@paraview.org] on behalf
of Hong Yi [hon...@renci.org]
Sent: Friday, June 07, 2013 1:58 PM
To: David E DeMarle; Burlen Loring
Cc
I have finally got ParaView built on Titan with the PGI compiler. However, when
I link our simulation code (also built with the PGI compiler, and running
successfully on Titan) with the ParaView coprocessing libraries, I get the
following linking error:
When building our simulation code linked to ParaView 3.98 (built with
coprocessing enabled) with CMake, I got a "cannot find
-lvtkPVPythonCatalyst" error in the final linking stage. I followed the Catalyst User
Guide and added the following to CMakeLists.txt to handle coprocessing:
-related static libs) rather than link to shared lib
via -lPhastaAdaptor?
Thanks and best regards,
Hong
From: Andy Bauer [andy.ba...@kitware.com]
Sent: Wednesday, August 28, 2013 12:10 PM
To: Hong Yi
Cc: paraview@paraview.org
Subject: Re: [Paraview] cannot find
it.
Thanks,
Hong
From: paraview-boun...@paraview.org [mailto:paraview-boun...@paraview.org] On
Behalf Of Hong Yi
Sent: Wednesday, August 28, 2013 12:44 PM
To: Andy Bauer
Cc: paraview@paraview.org
Subject: Re: [Paraview] cannot find -lvtkPVPythonCatalyst error when doing
coprocessing
Hi Andy
Hi David,
I just started to try the superbuild on Titan as well. I don't see ENABLE_MPI
set to true in your configure script. Could you confirm whether ENABLE_MPI needs
to be set to TRUE in order for ParaView to run in parallel on Titan? My
purpose is to link our simulation code (already
have developed.
Any idea on what could cause the linking error?
Thanks,
Hong
From: David E DeMarle [dave.dema...@kitware.com]
Sent: Thursday, August 29, 2013 4:08 PM
To: Vanmoer, Mark W
Cc: Hong Yi; paraview@paraview.org
Subject: Re: [Paraview] Building on Titan
can
do it by directly changing CMakeCache.txt under paraview/src/paraview-build to
force the corresponding flags to be on...
Thanks,
Hong
From: Vanmoer, Mark W [mailto:mvanm...@illinois.edu]
Sent: Friday, August 30, 2013 5:34 PM
To: Hong Yi; David E DeMarle
Cc: paraview@paraview.org
Subject: RE
From: paraview-boun...@paraview.org [paraview-boun...@paraview.org] on behalf
of Hong Yi [hon...@renci.org]
Sent: Friday, August 30, 2013 5:43 PM
To: Vanmoer, Mark W; David E DeMarle
Cc: paraview@paraview.org
Subject: Re: [Paraview] Building on Titan using ParaViewSuperbuild
Thanks for the info
again for all the useful information!
Best regards,
Hong
From: David E DeMarle [dave.dema...@kitware.com]
Sent: Thursday, September 05, 2013 12:29 PM
To: Hong Yi
Cc: Vanmoer, Mark W; paraview@paraview.org
Subject: Re: [Paraview] Building on Titan using
When linking our simulation code (a variant of phasta) to Catalyst in ParaView
version 4.0.1 built on Titan with superbuild, I got the following linking
errors:
-
../../lib/libincompressible.a(itrdrv.f.o): In function `itrdrv_':
itrdrv.f:(.text+0x2add): undefined reference to
: Saturday, September 07, 2013 4:55 AM
To: paraview@paraview.org
Subject: Re: [Paraview] Errors when linking catalyst in ParaView version 4.0.1
to, simulation code on Titan (Hong Yi)
Hi Hong Yi,
I got a similar error. One way is to find the paths for the MPI
libraries on Titan directly - do module
Hi David,
I found FortranCInterface was not built even in the first TOOLS pass. Here are
the errors I am getting from CMakeError.log:
-
/opt/gcc/4.7.2/bin/gfortran CMakeFiles/FortranCInterface.dir/main.F.o
CMakeFiles/FortranCInterface.dir/call_sub.f.o
within CMake so that it can build FortranCInterface and set corresponding
flags correctly. Let me know if anybody has some pointers on how to do that
within CMake.
Thanks,
Hong
From: Vanmoer, Mark W [mailto:mvanm...@illinois.edu]
Sent: Tuesday, September 10, 2013 2:47 PM
To: Hong Yi; David E
,
Hong
From: Vanmoer, Mark W [mailto:mvanm...@illinois.edu]
Sent: Tuesday, September 10, 2013 6:43 PM
To: Hong Yi; Vanmoer, Mark W; David E DeMarle
Cc: paraview@paraview.org
Subject: RE: [Paraview] Building on Titan using ParaViewSuperbuild
Hi Hong and David,
I was able to get FortranCInterface
...@kitware.com]
Sent: Thursday, September 12, 2013 5:03 PM
To: Hong Yi
Cc: paraview@paraview.org
Subject: Re: [Paraview] Statically linking catalyst to fortran simulation code
on Titan
I preset these config flags to make sure I got the static MPI libs.
list(APPEND PARAVIEW_OPTIONS -DBUILD_SHARED_LIBS:BOOL
I am able to successfully build our Fortran simulation code, as well as the Catalyst
Fortran example code, linked to Catalyst (built with the superbuild), but the
resulting executable is not static; it has dependencies on shared
libraries such as libmpichf90_gnu_47.so.1, libxpmem.so.1, etc., which
From: Andy Bauer [andy.ba...@kitware.com]
Sent: Thursday, September 12, 2013 6:07 PM
To: David E DeMarle
Cc: Hong Yi; paraview@paraview.org
Subject: Re: [Paraview] Statically linking catalyst to fortran simulation code
on Titan
If I remember correctly, FindMPI.cmake is trying to set the language
, September 13, 2013 4:08 PM
To: Hong Yi
Cc: Andy Bauer; paraview@paraview.org
Subject: Re: [Paraview] Statically linking catalyst to fortran simulation code
on Titan
Great news on getting it to compile and link!
Pretty sure that the statically built Python and ParaView Python modules need
Just a quick note that I have got coprocessing working with our simulation code
on Titan with expected image output. Thanks again for all your help!
From: paraview-boun...@paraview.org [mailto:paraview-boun...@paraview.org] On
Behalf Of Hong Yi
Sent: Monday, September 16, 2013 12:07 PM
Hello,
I set up a pipeline that uses a custom filter I developed and exported it as a
Python script for coprocessing. I have made sure the static library for my
custom filter is built and available in the paraview-build/lib directory, and I
have also linked this custom filter static library along
From: David E DeMarle [mailto:dave.dema...@kitware.com]
Sent: Thursday, September 26, 2013 1:28 PM
To: Andy Bauer
Cc: Hong Yi; paraview@paraview.org
Subject: Re: [Paraview] My own filter as part of pipeline for coprocessing
raises name not defined error
Utkarsh pointed out that you still need
, and did make clean followed by make to rebuild
the PhastaAdaptor lib. Is there something obvious that I did not do correctly?
Thanks for any input!
Hong
From: Andy Bauer [andy.ba...@kitware.com]
Sent: Friday, September 27, 2013 3:00 PM
To: Hong Yi
Cc: David E DeMarle
, 2013 2:19 PM
To: Hong Yi
Cc: David E DeMarle; paraview@paraview.org
Subject: Re: [Paraview] My own filter as part of pipeline for coprocessing
raises name not defined error
Hi Hong,
I don't think you want to do VTK Python wrapping of your class. What you want
is to have the proxy to the filter
this issue.
Thanks,
Hong
From: Andy Bauer [mailto:andy.ba...@kitware.com]
Sent: Thursday, October 03, 2013 10:19 AM
To: Hong Yi
Cc: David E DeMarle; paraview@paraview.org
Subject: Re: [Paraview] My own filter as part of pipeline for coprocessing
raises name not defined error
Hi Hong,
There won't
Hi Andy,
A quick related question to confirm with you: for the Catalyst live mode to
work, is a pvserver process needed in between for the ParaView client to communicate
with the Catalyst-enabled simulation? In other words, do I need to run two jobs on
the HPC system, one the Catalyst-enabled simulation and the other
I have done several simulation runs linked with ParaView Catalyst for in-situ
visualization on Titan with 18k cores, and have the following
observations/questions on which I hope to get input from this list.
1. It appears IceT-based image compositing for 18k cores takes such a
long time that it
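For context on why compositing cost grows with the core count: binary-swap compositing, one of the strategies IceT implements, requires on the order of log2(N) communication rounds across N processes. A rough back-of-the-envelope sketch (round count only; message sizes and network topology also matter, and 18432 is just a stand-in for "18k cores"):

```python
import math

def binary_swap_rounds(num_procs):
    """Approximate number of communication rounds in binary-swap
    image compositing across num_procs processes (log2, rounded up
    for non-power-of-two counts)."""
    return math.ceil(math.log2(num_procs))

rounds_18k = binary_swap_rounds(18432)  # stand-in for ~18k cores
```

The round count grows only logarithmically, but each round moves image fragments over the interconnect, which is where the time at large scale tends to go.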
...@kitware.com]
Sent: Tuesday, November 26, 2013 10:57 AM
To: Hong Yi
Cc: paraview@paraview.org
Subject: Re: [Paraview] In-situ file/image output on Titan with 18k cores
Hi Hong,
Can you describe the type of view you're trying to output? If I remember
correctly it was volume rendering of an image data
, Kenneth [kmo...@sandia.gov]
Sent: Wednesday, November 27, 2013 12:06 PM
To: Berk Geveci; Hong Yi
Cc: paraview@paraview.org
Subject: Re: [Paraview] In-situ file/image output on Titan with 18k cores
I also wonder if you are trying to render anything transparent (volume
rendering, opacity 1
be great if you could give
me some pointers on this such as some instrumentation tools I could leverage,
etc.
Best regards,
Hong
From: Berk Geveci [berk.gev...@kitware.com]
Sent: Wednesday, November 27, 2013 3:01 PM
To: Hong Yi
Cc: paraview@paraview.org
Subject
. Now ParaView built
successfully in the updated software environment on Titan.
Regards,
Hong
From: ParaView [mailto:paraview-boun...@paraview.org] On Behalf Of Hong Yi
Sent: Tuesday, February 25, 2014 5:40 PM
To: paraview@paraview.org
Subject: [Paraview] undefined reference to `gzopen64' when
In our in-situ run linking our simulation to ParaView Catalyst, I found that an
empty line was always printed from each core during Catalyst initialization (in
the coprocessorinitializewithpython() call). Although this is not a big deal, it
becomes annoying when running on a large number of cores. I
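Until a library-side fix lands, the usual pattern for avoiding one line of output per core in one's own code is to gate printing on the MPI rank so only rank 0 writes. A minimal stand-in sketch (the `rank` argument and the `out` list are stand-ins for a real MPI rank query, e.g. mpi4py's `COMM_WORLD.Get_rank()`, and for `print()`):

```python
def rank0_print(rank, message, out):
    """Emit the message only on rank 0, so N ranks produce
    one line instead of N."""
    if rank == 0:
        out.append(message)  # stand-in for print(message)

log = []
for rank in range(4):  # simulate 4 MPI ranks calling the same code path
    rank0_print(rank, "Catalyst initialized", log)
```

This only helps for output under the application's control; a print inside the library itself would need the same guard applied upstream.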
Hi Andy,
Done, the issue ID is 0014781. I did not find a field to assign it to you, but
I indicated it in the description. I did assign the project to be Catalyst.
Thanks,
Hong
From: Andy Bauer [mailto:andy.ba...@kitware.com]
Sent: Thursday, June 05, 2014 2:44 PM
To: Hong Yi
Cc: paraview