Hi Alex,
In general, HDF5 files are very portable to many platforms and many
languages. Indeed, that is sort of the purpose behind the HDF Group.
While there are some incompatible edge cases, you sort of have to look for
them. Josh did a very good job of outlining the support for HDF5 across the
different languages and platforms.
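For instance, a file written with PyTables can be read back through a
completely different HDF5 binding. A minimal sketch (the file name is made up,
h5py is just one convenient choice of second binding, and the current
snake_case PyTables names are used):

import tables
import h5py

# Write a small array with PyTables.
with tables.open_file("exchange.h5", mode="w") as f:
    f.create_array("/", "velocities", [1.0, 2.0, 3.0])

# Read the very same file with h5py, a different HDF5 wrapper,
# to show that the container itself is portable.
with h5py.File("exchange.h5", "r") as f:
    print(f["/velocities"][:])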
Hi Alex,
Reading a PyTables file on another platform should be easy, as long as you
use a compression library that is supported on both platforms. The most
widely available is likely to be zlib, since it is included in the
pre-built binaries available from the HDF Group's website. There are C and
other-language interfaces that can read zlib-compressed files without any
extra setup.
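As a minimal sketch of what the writing side could look like (the record
layout and file name here are invented, and the current snake_case PyTables
names are used):

import tables

# Hypothetical record layout for the exchanged data.
class Particle(tables.IsDescription):
    x = tables.Float64Col()
    y = tables.Float64Col()
    z = tables.Float64Col()

# zlib ships with the stock HDF5 binaries, so files compressed this way
# stay readable from other languages and tools without extra plugins.
filters = tables.Filters(complib="zlib", complevel=5)
with tables.open_file("particles.h5", mode="w") as f:
    f.create_table("/", "particles", Particle, filters=filters)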
Dear PyTables developers,
Thanks for the great project.
I would like to suggest that a small scientific community (Lagrangian
particle tracking velocimetry and numerical simulations of turbulent flows)
start using PyTables as a common platform for exchanging large datasets
(a few gigabytes to tens of gigabytes).
Hi Uwe,
Sorry, I wrote this when I was away from my computer and so I couldn't test
it.
Our documentation is clearly wrong then. However, what you *can* do is take
the dtype from a known VideoNode table and then compare using this:
known_dtype = f.root.path_to_a_video_node.dtype
bar = [x for x in f.walkNodes('/', 'Table') if x.dtype == known_dtype]
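As a self-contained sketch of that approach (untested; the file name and node
path are placeholders, and the snake_case spelling of the same calls is used):

import tables

with tables.open_file("videos.h5", mode="r") as f:
    # Take the dtype of one table that is known to hold VideoNode records...
    known_dtype = f.root.path_to_a_video_node.dtype
    # ...and collect every table in the file whose dtype matches it.
    matches = [node for node in f.walk_nodes("/", classname="Table")
               if node.dtype == known_dtype]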
Hi Anthony,
On 05/19/2012 08:12 PM, Anthony Scopatz wrote:
> Hello Uwe,
>
> Why don't you try something like:
>
> bar = filter(x.description == VideoNode for x in f.walkNodes('/', 'Table'))
>
> or
>
> bar = filter(x.dtype == VideoNode._v_dtype for x in f.walkNodes('/',
> 'Table'))
>
> to compare the dtype / description.
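One thing to note about the snippets as quoted: filter() needs a callable
plus an iterable, so as written they raise a TypeError before any comparison
happens. A corrected spelling of the dtype-based check (still a sketch, with
known_dtype taken from a known table as above) would be:

bar = filter(lambda x: x.dtype == known_dtype, f.walkNodes('/', 'Table'))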