[Paraview] crashing on megapoint data

2012-10-31 Thread Kashiwa, Bucky
I'm running the vanilla website release of version 3.14.1 on this OS:

RedHat Enterprise Linux Server release 5.5 (Tikanga), kernel 2.6.18-108chaos, on
an x86_64.

This code will display my ~120,000,000 points using cosmo format input, but 
only with a constant color (gray).  If I try to add color, paraview sucks up 
all of the available memory (~22GB) and crashes.  This happens with either 
PointSprites or with EyeDomeLighting View.  (Not sure if it happens with plain 
points or not...)
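For a sense of scale, here is a rough back-of-envelope estimate of the raw per-point footprint. The layout is an illustrative assumption only (float32 positions, one float32 scalar to color by, 4-byte mapped RGBA), not ParaView's actual internal storage:

```python
# Rough memory estimate for ~120,000,000 points (assumed layout, for
# scale only): 3 x float32 position + 1 x float32 scalar + 4-byte RGBA.
n_points = 120_000_000
bytes_per_point = 3 * 4 + 4 + 4          # 20 bytes per point
total_gb = n_points * bytes_per_point / 1e9
print(f"{total_gb:.1f} GB")              # 2.4 GB
```

Even several working copies of that would fit in ~22 GB, which hints that the coloring path is duplicating or inflating the data rather than merely storing it.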

If you have any suggestions on how to mitigate the crash, I will gladly give 
them a try.

Thanks very much.  Bucky Kashiwa

=
Bucky Kashiwa PhD, PE   Post: MS B216, Los Alamos, NM  87545   
=




___
Powered by www.kitware.com

Visit other Kitware open-source projects at 
http://www.kitware.com/opensource/opensource.html

Please keep messages on-topic and check the ParaView Wiki at: 
http://paraview.org/Wiki/ParaView

Follow this link to subscribe/unsubscribe:
http://www.paraview.org/mailman/listinfo/paraview


[Paraview] PointSprites don't appear from Mac OS X pvserver

2012-12-07 Thread Kashiwa, Bucky
I've been using both 3.14.1 and 3.98.0 with the following behavior on Mac OS 
10.6.8:

Paraview in Mac-client-only mode (locally) displays Point and PointSprite data 
just fine.

Paraview in Mac-client+Unix-server mode displays Point data, but not 
PointSprite data (the PointSprites are invisible).

Paraview in Unix-client displayed via X11 on the Mac displays Point data, but 
not PointSprite data (invisible).

Paraview in Unix-client displayed via X11 on a Unix box displays both Point and 
PointSprite just fine.

Paraview in Unix-client+Unix-server mode displayed on a Unix box displays both 
Point and PointSprite just fine.

It appears that remote PointSprites are invisible on Mac OSX, only local ones 
seem to show up on the display.

Any ideas?

Thanks much.
=
Bucky Kashiwa PhD, PE   Post: MS B216, Los Alamos, NM  87545   
  Ofc: TA3-SM123-RM276 Email: b...@lanl.gov, kash...@qwest.net  
Voice: 505-667-8812  Fax: 505-665-5926   
  Home: 505-988-7332   Cell: 505-795-5581  
=






[Paraview] Material Interface Filter

2013-04-09 Thread Kashiwa, Bucky
So I'm trying to use the Material Interface Filter:

http://www.paraview.org/Wiki/ParaView/Users_Guide/List_of_filters#Material_Interface_Filter

and have carefully created XML datafiles using vtkNonOverlappingAMR (*.vth) 
format.  Unfortunately the v3.98.1 gui does not furnish the Properties option 
'Down Convert Volume Fractions' when reading in the datafiles in *.vth format.  
 (In fact the Properties menu has only the 'Default Number of Levels' slider, 
and no cell data selector, and no 'Down Convert..' selector.)  Consequently the 
Material Interface Filter has an empty choice for 'Select Material Fraction 
Arrays', and so the filter does not know what to do.  Alas, the only format 
that seems to generate the 'Down Convert..' option is *.spct, which, of course, 
we do not have in this case.

Is there another filter that I need to run prior to doing 'Material Interface 
Filter', or am I just plain out of luck here?

Thanks for your very capable help.



[Paraview] unix GUI build with 3.98.0 or 3.98.1

2013-04-11 Thread Kashiwa, Bucky
Dear ParaView Friends:

I'm trying to build the ParaView GUI with no success (unix OS).  The make fails 
in pqComponents with a message (copied below with system info, and ccmake 
command):

'No rule to make target `Qt/Components/../../../../../usr/projects/pv_dev/ParaView-3.98.1-git/Qt/Components/Resources/XML/Placeholder.xml''

which has me miffed because the file Placeholder.xml is present at that 
location.  I have tried both 3.98.0 and 3.98.1 sources downloaded from your 
website, as well as 3.98.1 obtained via git (users space).
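The target path in the error is relative to the build directory, and its run of `../` segments only lands on the real file at a particular build-tree depth. A small illustration of that sensitivity (the `/a/b/c` build locations below are hypothetical):

```python
import os.path

# The make target climbs five levels up from Qt/Components before
# descending into /usr/projects/...; whether it reaches the real file
# depends on how deep the build directory sits.
rel = ("Qt/Components/../../../../../usr/projects/pv_dev/"
       "ParaView-3.98.1-git/Qt/Components/Resources/XML/Placeholder.xml")

print(os.path.normpath("/a/b/c/" + rel))    # resolves under /usr/projects/...
print(os.path.normpath("/a/b/c/d/" + rel))  # resolves under /a/usr/... -- wrong
```

So one thing worth checking is whether the build tree sits at a different depth than the CMake-generated relative path assumes.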

Any thoughts or repairs to my procedure would make my day.  Thanks very much.


---


Make error  sys info:

[ 81%] Built target pqWidgets
[ 84%] Built target pqCore
make[2]: *** No rule to make target `Qt/Components/../../../../../usr/projects/pv_dev/ParaView-3.98.1-git/Qt/Components/Resources/XML/Placeholder.xml', needed by `Qt/Components/qrc_pqExtraResources.cxx'.  Stop.
make[1]: *** [Qt/Components/CMakeFiles/pqComponents.dir/all] Error 2
make: *** [all] Error 2
ml-fey{585}$ gcc --version
gcc (GCC) 4.7.0
Copyright (C) 2012 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

ml-fey{586}$ /usr/projects/pv_dev/Qt-4.8.4/bin/qm
qmake* qmlplugindump* qmlviewer*
ml-fey{586}$ /usr/projects/pv_dev/Qt-4.8.4/bin/qmake --version
QMake version 2.01a
Using Qt version 4.8.4 in /usr/projects/pv_dev/Qt-4.8.4/lib
ml-fey{587}$
ml-fey{587}$
ml-fey{587}$ uname -a
Linux ml-fey.lanl.gov 2.6.32-220.23.1.1chaos.ch5.x86_64 #1 SMP Tue Jun 19 
17:16:17 PDT 2012 x86_64 x86_64 x86_64 GNU/Linux
ml-fey{588}$
ml-fey{588}$
ml-fey{588}$ ccmake2.8 --version
ccmake version 2.8.10.2
ml-fey{589}$
ml-fey{589}$
ml-fey{589}$

---



Ccmake command:

module load gcc (gets v 4.7.0 and correct LD_LIBRARY_PATH)

ccmake \
-D BUILD_SHARED_LIBS:BOOL=ON \
-D BUILD_TESTING:BOOL=OFF \
-D CMAKE_BUILD_TYPE:STRING=Release \
-D CMAKE_C_COMPILER:FILEPATH=/usr/projects/hpcsoft/moonlight/gcc/4.7.0/bin/gcc \
-D CMAKE_CXX_COMPILER:FILEPATH=/usr/projects/hpcsoft/moonlight/gcc/4.7.0/bin/g++ \
-D CMAKE_C_FLAGS:STRING=-fPIC \
-D CMAKE_CXX_FLAGS:STRING=-fPIC \
-D CMAKE_INSTALL_PREFIX:PATH=/usr/projects/pv_dev/PV-3.98.1-Client \
-D OPENGL_gl_LIBRARY:FILEPATH=/usr/lib64/libGL.so \
-D OPENGL_glu_LIBRARY:FILEPATH=/usr/lib64/libGLU.so \
-D OPENGL_INCLUDE_DIR:PATH=/usr/include \
-D PARAVIEW_BUILD_QT_GUI:BOOL=ON \
-D PARAVIEW_ENABLE_COPROCESSING:BOOL=ON \
-D QT_QMAKE_EXECUTABLE:FILEPATH=/usr/projects/pv_dev/Qt-4.8.4/bin/qmake \
-D VTK_MPIRUN_EXE:FILEPATH=/usr/projects/hpcsoft/moonlight/openmpi/1.4.5-gcc-4.7/bin/mpiexec \
-D VTK_USE_X:BOOL=ON \
 /usr/projects/pv_dev/ParaView-3.98.1-git





[Paraview] Qt-4.7.4 vs Qt-4.8.4

2013-04-17 Thread Kashiwa, Bucky
FYI it appears that building the v3.98.1 client fails with Qt-4.7.4 in the file

./Qt/Components/Resources/UI/pqApplicationOptions.ui

which uses the 'alignment' attribute in two locations:

   <item row="8" column="3" alignment="Qt::AlignVCenter">

   <item row="8" column="1" alignment="Qt::AlignLeft">

Qt-4.8.4 does not complain, so it might be better for ccmake to mention this 
Qt-version sensitivity.  (Not sure if it matters, but this is with OSMESA 
turned on…)




[Paraview] reading classic *.cosmo files in versions greater than 3

2015-06-30 Thread Kashiwa, Bucky
We have been very happily reading classical *.cosmo formatted datafiles
using version 3.98.1, but when reading the same files in subsequent
versions (4.0, 4.1, 4.2, 4.3) the data get read, but wrongly interpreted.
I did a diff of PCosmoReader.h between v3.98.1 and v4.3.1, which
suggested to me that the last number in the cosmo data (called 'tag') is
now int64_t rather than a 32-bit integer.  So if we write the cosmo data
with type MPI_2INTEGER (~ Fortran kind=SELECTED_INT_KIND(15)) then the data
are read correctly by the v4.3.1 cosmo reader (without byte swap).
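For concreteness, here is a sketch of the two record layouts. The field order (x, vx, y, vy, z, vz, mass, then the tag) follows the classic cosmo convention; treat this as an illustrative assumption, not the reader's authoritative spec:

```python
import struct

# Classic cosmo record: seven little-endian float32 fields followed by the
# particle tag.  With a 4-byte tag the record is 32 bytes; with the 8-byte
# tag the newer reader expects, it is 36 bytes.
fields = (1.0, 0.5, 2.0, 0.5, 3.0, 0.5, 10.0)   # x, vx, y, vy, z, vz, mass

rec32 = struct.pack("<7fi", *fields, 12345)      # 32-bit tag
rec64 = struct.pack("<7fq", *fields, 12345)      # 64-bit tag
print(len(rec32), len(rec64))                    # 32 36

# A 32-bit-tag file read with the 64-bit layout shears every record by
# 4 bytes, which is consistent with "read, but wrongly interpreted".
```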

If I select the 'Adaptive cosmo reader' from the reader list, it displays
a button for the 'Tag Size' which can be 32-bit or 64-bit.  This button
does not appear on the 'Cosmology Files' choice in the reader list.

Could it be that the build system is somehow failing to place the 'Tag
Size' button on the 'Cosmology Files' Properties panel?  If so, it would
be wonderful to get it put there so that we can have the choice of Tag
Size on the very useful .cosmo datafiles.  If not, please let us know if
we are stuck with a 64-bit Tag Size.

Thanks very much.




[Paraview] how to build 4.3.1 with CosmoTools

2015-07-28 Thread Kashiwa, Bucky
I'm trying to build the ParaView-v4.3.1-source standard release, with
cosmotools enabled.  The directory

ParaView-v4.3.1/ParaViewCore/VTKExtensions/CosmoTools

is present and contains useful-looking source code, but it seems to go
unrecognized.  Pasted below is my ccmake command, followed by the ccmake
error message.  Any help would be very welcome.  Thanks much, b.


ccmake2.8 \
-D BUILD_SHARED_LIBS:BOOL=ON \
-D BUILD_TESTING:BOOL=OFF \
-D CMAKE_BUILD_TYPE:STRING=Release \
-D CMAKE_C_COMPILER:FILEPATH=/usr/projects/hpcsoft/toss2/common/gcc/4.7.2/bin/gcc \
-D CMAKE_CXX_COMPILER:FILEPATH=/usr/projects/hpcsoft/toss2/common/gcc/4.7.2/bin/g++ \
-D CMAKE_C_FLAGS:STRING=-fPIC \
-D CMAKE_CXX_FLAGS:STRING=-fPIC \
-D CMAKE_Fortran_COMPILER:FILEPATH=/usr/projects/hpcsoft/toss2/common/gcc/4.7.2/bin/gfortran \
-D COSMOTOOLS_INCLUDE_DIR:PATH=../ParaView-v4.3.1-source/ParaViewCore/VTKExtensions/CosmoTools \
-D FFMPEG_INCLUDE_DIR:PATH=/usr/projects/pv_dev/FFmpeg-1.2/include \
-D FFMPEG_avcodec_LIBRARY:FILEPATH=/usr/projects/pv_dev/FFmpeg-1.2/lib/libavcodec.so \
-D FFMPEG_avformat_LIBRARY:FILEPATH=/usr/projects/pv_dev/FFmpeg-1.2/lib/libavformat.so \
-D FFMPEG_avutil_LIBRARY:FILEPATH=/usr/projects/pv_dev/FFmpeg-1.2/lib/libavutil.so \
-D FFMPEG_dc1394_LIBRARY:FILEPATH=/usr/projects/pv_dev/FFmpeg-1.2/lib/libavdevice.so \
-D FFMPEG_dts_LIBRARY:FILEPATH=/usr/projects/pv_dev/FFmpeg-1.2/lib/libavformat.so \
-D FFMPEG_gsm_LIBRARY:FILEPATH=/usr/projects/pv_dev/FFmpeg-1.2/lib/libavcodec.so \
-D FFMPEG_swscale_LIBRARY:FILEPATH=/usr/projects/pv_dev/FFmpeg-1.2/lib/libswscale.so \
-D GENERIC_IO_INCLUDE_DIR:PATH=/usr/include/gio-unix-2.0 \
-D GENERIC_IO_LIBRARIES:FILEPATH=/usr/lib64/libgio-2.0.so \
-D OPENGL_gl_LIBRARY:FILEPATH=/usr/projects/pv_dev/OSMesa-9.0.1/lib/libGL.so \
-D OPENGL_glu_LIBRARY:FILEPATH= \
-D OPENGL_INCLUDE_DIR:PATH=/usr/projects/pv_dev/OSMesa-9.0.1/include \
-D OSMESA_INCLUDE_DIR:PATH=/usr/projects/pv_dev/OSMesa-9.0.1/include \
-D OSMESA_LIBRARY:FILEPATH=/usr/projects/pv_dev/OSMesa-9.0.1/lib/libOSMesa.so \
-D PARAVIEW_BUILD_QT_GUI:BOOL=ON \
-D PARAVIEW_BUILD_CATALYST_ADAPTORS:BOOL=ON \
-D PARAVIEW_ENABLE_COPROCESSING:BOOL=ON \
-D PARAVIEW_ENABLE_COSMOTOOLS:BOOL=ON \
-D PARAVIEW_ENABLE_FFMPEG:BOOL=ON \
-D PARAVIEW_ENABLE_PYTHON:BOOL=ON \
-D PARAVIEW_USE_MPI:BOOL=ON \
-D PYTHON_EXECUTABLE:FILEPATH=/usr/projects/pv_dev/Python-2.7.4/bin/python \
-D PYTHON_INCLUDE_DIR:PATH=/usr/projects/pv_dev/Python-2.7.4/include/python2.7 \
-D PYTHON_LIBRARY:FILEPATH=/usr/projects/pv_dev/Python-2.7.4/lib/libpython2.7.so \
-D QT_QMAKE_EXECUTABLE:FILEPATH=/usr/projects/pv_dev/Qt-4.8.4/bin/qmake \
-D VTK_MPIRUN_EXE:FILEPATH=/usr/projects/hpcsoft/lightshow/openmpi/1.6.5-gcc-4.7/bin/mpiexec \
-D VTK_OPENGL_HAS_OSMESA:BOOL=ON \
-D VTK_USE_X:BOOL=ON \
-D CMAKE_INSTALL_PREFIX:PATH=/usr/projects/pv_dev/PV-4.3.1-FX \
 ../ParaView-v4.3.1-source







 CMake Error at VTK/CMake/FindPackageHandleStandardArgs.cmake:97 (MESSAGE):
   Could NOT find CosmoTools (missing: COSMOTOOLS_LIBRARIES)
 Call Stack (most recent call first):
   VTK/CMake/FindPackageHandleStandardArgs.cmake:288 (_FPHSA_FAILURE_MESSAGE)
   CMake/FindCosmoTools.cmake:21 (find_package_handle_standard_args)
   ParaViewCore/VTKExtensions/CosmoTools/CMakeLists.txt:6 (find_package)
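Per the error, FindCosmoTools.cmake is failing on the library half of the search: the in-tree VTKExtensions/CosmoTools directory holds ParaView's wrappers, not the CosmoTools library itself, which is built separately. A hedged sketch of the missing cache entries (every path below is an illustrative placeholder for wherever a separately built CosmoTools actually lives):

```shell
# FindCosmoTools.cmake needs both COSMOTOOLS_INCLUDE_DIR and
# COSMOTOOLS_LIBRARIES; pointing the include dir at the in-source
# VTKExtensions directory leaves the library unset.  Placeholder paths:
ccmake2.8 \
  -D PARAVIEW_ENABLE_COSMOTOOLS:BOOL=ON \
  -D COSMOTOOLS_INCLUDE_DIR:PATH=/usr/projects/pv_dev/cosmotools/include \
  -D COSMOTOOLS_LIBRARIES:FILEPATH=/usr/projects/pv_dev/cosmotools/lib/libcosmotools.a \
  ../ParaView-v4.3.1-source
```

Separately, the GENERIC_IO_* entries in the command above point at GLib's gio (gio-unix-2.0, libgio-2.0.so), which is likely not the GenericIO cosmology I/O library CosmoTools expects; that may be worth double-checking too.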













Re: [Paraview] ... not yet supported for more than 2147483647 bytes.

2016-12-19 Thread Kashiwa, Bucky
Andy, Ashton:  I will bring up v5.2 to see if it works for me.  Thanks, b.


From: Andy Bauer <andy.ba...@kitware.com>
Date: Monday, December 19, 2016 at 3:17 PM
To: andrealphus <andrealp...@gmail.com>
Cc: Bucky Kashiwa <b...@lanl.gov>, "ParaView@ParaView.org" <ParaView@paraview.org>
Subject: Re: [Paraview] ... not yet supported for more than 2147483647 bytes.

There are two parts to this issue. The first is that the vtkMPICommunicator 
for PV 4.3.1 won't communicate data that is over 2^31 bytes. This is 
fixed in PV 5.2. The other issue is due to MPI having a limit of 2^31 objects 
to be communicated in a single shot. This is MPI's API, in that the count for 
objects that are typically sent/received is an int. See 
http://www.mpich.org/static/docs/v3.1/www3/MPI_Send.html for example.
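The usual workaround for a count-is-an-int API is to split one logical transfer into several sends, each below 2^31 - 1 elements. Just the splitting arithmetic, sketched without any actual MPI calls:

```python
INT_MAX = 2**31 - 1  # MPI_Send's count parameter is an int

def chunk_counts(total, limit=INT_MAX):
    """Split `total` elements into per-message counts, each <= limit."""
    counts = []
    while total > 0:
        n = min(total, limit)
        counts.append(n)
        total -= n
    return counts

# 5 * 2**30 elements would need three messages under the int cap:
print(chunk_counts(5 * 2**30))  # [2147483647, 2147483647, 1073741826]
```

This is essentially what a communicator layer has to do internally (or it can send fewer elements of a larger contiguous datatype) to move more than 2^31 objects.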

On Mon, Dec 19, 2016 at 4:59 PM, andrealphus <andrealp...@gmail.com> wrote:
That is a 32 bit error, from trying to index something with more than
(2^32)/2 elements or indices. Are you using any custom
libraries/packages/modules which might not be 64 bit compliant? Are
you sure you built a 64 bit version (check your gcc -v).

-ashton

On Mon, Dec 19, 2016 at 1:32 PM, Kashiwa, Bucky <b...@lanl.gov> wrote:
> On Linux using ParaView version 4.3.1 built with OSMesa-9.0.1, OpenMPI,
> etc.  Running pvserver with 12 PEs, client-server mode, Standard Release
> ParaView-4.3.1-Linux-64bit client.  With large point data (>2Gb) we get
> this error, upon trying to display a large number of points:
>
> Generic Warning: In
> ../ParaView-v4.3.1-source/VTK/Parallel/MPI/vtkMPICommunicator.cxx,
> line 194
>
> This operation not yet supported for more than 2147483647 bytes.
>
> Our CMakeCache.txt is attached, in case it may provide helpful clues.
>
> Thanks much.  B. Kashiwa
>
>
>


Re: [Paraview] [EXTERNAL] Re: ... not yet supported for more than 2147483647 bytes.

2017-06-30 Thread Kashiwa, Bucky
Hey Alan:  We originally encountered the Generic Warning in v4.3.1; Andy 
suggested trying 5.2 (see below).  Subsequently (see below) we have tried v5.2, 
v5.3, and v5.4 - all of which behave much the same with large point numbers.  I 
believe that Andy’s assessment is correct: this is an MPI issue.  However, why 
ParaView should find it necessary to pass all of the point data, all at once, 
is kinda baffling.  So maybe you have pointed to something to study: has the 
4.* bug really been fixed?  Especially in the case of point data (rather than 
grid data, which, in our case, is always considerably smaller).  Thanks for 
writing, and thinking about the issue.  Cheers, b.


From: "Scott, W Alan" <wasc...@sandia.gov>
Date: Thursday, June 29, 2017 at 11:31 AM
To: Bucky Kashiwa <b...@lanl.gov>, "Bauer, Andy (External Contacts)" <andy.ba...@kitware.com>, andrealphus <andrealp...@gmail.com>
Cc: "ParaView@ParaView.org" <ParaView@paraview.org>
Subject: RE: [EXTERNAL] Re: [Paraview] ... not yet supported for more than 2147483647 bytes.

I notice you are running ParaView 4.3.1?  We had a bug in the earlier 4.*.* 
versions where we were passing massive amounts of information from all 
processes to all processes about the state of the status bar at the bottom of 
the screen.  We stopped doing that.  I wonder if this could be an issue with 
huge numbers of points?

Try PV 5.4.0?  (Or possibly wait for 5.4.1, out Real Soon Now?)

Alan


Re: [Paraview] ... not yet supported for more than 2147483647 bytes.

2017-06-29 Thread Kashiwa, Bucky
Andy, Ashton:  We have now tried versions 5.2, 5.3, and 5.4.  With large point 
data sets we still have the same Generic Warning cited below (followed by 
ERRORS that cause PV to hang).  I reckon that we are hitting the MPI wall 
associated with 2^31 items communicated at once - as per Andy’s note below.

All of our recent work has been using the downloadable binaries from 
paraview.org, for both the client and the server.  We see the same behavior 
using the MacOS client to Linux server, and Linux client to Linux server.

This MPI shortcoming is a serious showstopper for us so we are going to have to 
find a remedy.

The first question is: why should either the client or the server think that 
more than 2^31 items need to be communicated in the first place?  On the 
surface, this seems to be unreasonable.

To be clear, there is no problem opening and reading the datafiles, and 
displaying the outline view.  The warning appears when we try to display the 
point view, when the number of points is too large.  If we take a slice that 
reduces the number of points to below some threshold, then the point view 
display is okay.  As an example, consider a .cosmo64 formatted point data 
file.  There are 36 bytes of data per point.  We can display ~28,000,000 
points and ~35,500,000 points just fine.  At 45,500,000 points (= 1.638 GB) 
the Generic Warning gets thrown.  This seems to be independent of the number 
of nodes and PEs used by the server, and of whether the view is RenderView or 
EyeDomeLighting.  We also write/read .vtm format that behaves in a similar 
fashion: slices will display okay until the slice is thick enough that there 
are too many points in the image.
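The sizes quoted above are easy to sanity-check, and they actually make the warning stranger: even the failing case is below the 2^31-byte cap, so the transfer that trips the limit must be larger than the raw point payload:

```python
# Sanity check of the .cosmo64 sizes quoted above (36 bytes per point).
bytes_per_point = 36
for n_points in (28_000_000, 35_500_000, 45_500_000):
    total = n_points * bytes_per_point
    print(f"{n_points:>11,d} points -> {total / 1e9:.3f} GB")

# 45,500,000 points is 1.638e9 bytes -- still under 2**31 (~2.147e9), so
# whatever is being communicated is evidently bigger than the points alone.
```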

I have tried doing MPI on the client, which seems to have no effect on the 
foregoing limitation on the number of displayable points.  Please let me know 
if you can think of other switches that can be thrown, that may shed some more 
light on the issue, or if you are aware of a forthcoming repair.  Thanks very 
much, b.



[Paraview] v5.4.0 and v5.4.1 client-server disconnect while splitting views

2017-12-13 Thread Kashiwa, Bucky
We are getting a very weird kind of client-server disconnect that seems to
be associated with a crash triggered while splitting the view into
two or more parts.  (This happens on either client-server OS pair,
mac-linux or linux-linux, in case that may matter.)  A backtrace from the
server window is given below.  Any suggestions are greatly appreciated.  b.



---
---
---
Client connected.


pvserver:14230 terminated with signal 11 at PC=2ab0fe68cf91 SP=7ffe6d91fc90.  Backtrace:
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkRenderingOpenGL2-pv5.4.so.1(_ZN16vtkShaderProgram11SetUniformiEPKci+0x21)[0x2ab0fe68cf91]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkPVVTKExtensionsRendering-pv5.4.so.1(_ZN20vtkIceTCompositePass27PushIceTDepthBufferToScreenEPK14vtkRenderState+0x288)[0x2ab0f5068038]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkPVClientServerCoreRendering-pv5.4.so.1(+0x164b28)[0x2ab0f2236b28]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkRenderingOpenGL2-pv5.4.so.1(_ZN27vtkDepthImageProcessingPass14RenderDelegateEPK14vtkRenderStateP26vtkOpenGLFramebufferObjectP16vtkTextureObjectS6_+0x209)[0x2ab0fe5be569]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkRenderingOpenGL2-pv5.4.so.1(_ZN13vtkEDLShading6RenderEPK14vtkRenderState+0x2e3)[0x2ab0fe5d06c3]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkRenderingOpenGL2-pv5.4.so.1(_ZN17vtkOpenGLRenderer12DeviceRenderEv+0x5f)[0x2ab0fe63f3af]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkRenderingCore-pv5.4.so.1(_ZN11vtkRenderer6RenderEv+0x823)[0x2ab0fea654d3]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkRenderingCore-pv5.4.so.1(_ZN21vtkRendererCollection6RenderEv+0xc5)[0x2ab0fea61fe5]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkRenderingCore-pv5.4.so.1(_ZN15vtkRenderWindow14DoStereoRenderEv+0xce)[0x2ab0fea6f2ce]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkRenderingCore-pv5.4.so.1(_ZN15vtkRenderWindow6RenderEv+0xf05)[0x2ab0fea715b5]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkPVClientServerCoreRendering-pv5.4.so.1(_ZN15vtkPVRenderView6RenderEbb+0x4e8)[0x2ab0f21f9198]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkPVClientServerCoreRendering-pv5.4.so.1(_ZN15vtkPVRenderView11StillRenderEv+0x3c)[0x2ab0f21f1b7c]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkPVServerManagerApplication-pv5.4.so.1(_Z22vtkPVRenderViewCommandP26vtkClientServerInterpreterP13vtkObjectBasePKcRK21vtkClientServerStreamRS5_Pv+0x19f8)[0x2ab0e7b306a8]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkClientServer-pv5.4.so.1(_ZN26vtkClientServerInterpreter19CallCommandFunctionEPKcP13vtkObjectBaseS1_RK21vtkClientServerStreamRS4_+0x1a4)[0x2ab0e9a30234]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libEyeDomeLightingView.so(_Z29vtkPVRenderViewWithEDLCommandP26vtkClientServerInterpreterP13vtkObjectBasePKcRK21vtkClientServerStreamRS5_Pv+0x35b)[0x2ab11f01d43b]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkClientServer-pv5.4.so.1(_ZN26vtkClientServerInterpreter19CallCommandFunctionEPKcP13vtkObjectBaseS1_RK21vtkClientServerStreamRS4_+0x1a4)[0x2ab0e9a30234]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkClientServer-pv5.4.so.1(_ZN26vtkClientServerInterpreter20ProcessCommandInvokeERK21vtkClientServerStreami+0x14a)[0x2ab0e9a3059a]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkClientServer-pv5.4.so.1(_ZN26vtkClientServerInterpreter17ProcessOneMessageERK21vtkClientServerStreami+0x53e)[0x2ab0e9a3110e]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkClientServer-pv5.4.so.1(_ZN26vtkClientServerInterpreter13ProcessStreamERK21vtkClientServerStream+0x1d)[0x2ab0e9a3137d]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkPVServerImplementationCore-pv5.4.so.1(_ZN16vtkPVSessionCore21ExecuteStreamInternalERK21vtkClientServerStreamb+0xf5)[0x2ab0e85c0d75]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkPVServerImplementationCore-pv5.4.so.1(_ZN16vtkPVSessionCore13ExecuteStreamEjRK21vtkClientServerStreamb+0x3b)[0x2ab0e85c0bab]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkPVServerImplementationCore-pv5.4.so.1(_ZN16vtkPVSessionBase13ExecuteStreamEjRK21vtkClientServerStreamb+0x35)[0x2ab0e85bf895]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkPVServerImplementationCore-pv5.4.so.1(_ZN18vtkPVSessionServer24OnClientServerMessageRMIEPvi+0x10f)[0x2ab0e85cc2af]
/yellow/usr/projects/pv_dev/PV-5.4.0-FX/lib/paraview-5.4/libvtkParallelCore-pv5.4.so.1(_ZN25vtkMultiProcessController10ProcessRMIEiPvii+0x143)[0x2ab0eb24c0e3]