Hi Anthony,
> Oh OK, I think I understand a little better. What I would do would be to make
> "for i,file in enumerate(hdf5_files)" the outer most loop and then use the
> File.walkNodes() method [1] to walk each file and pick out only the data sets
> that you want to copy, skipping over all others.
Hi Andre,
Oh OK, I think I understand a little better. What I would do would be to
make "for i,file in enumerate(hdf5_files)" the outer most loop and then use
the File.walkNodes() method [1] to walk each file and pick out only the
data sets that you want to copy, skipping over all others. This sh
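The walk-and-copy strategy above can be sketched as follows. This is only an illustration, not PyTables code: the file hierarchy is simulated with nested dicts so the sketch runs standalone, and the helper names (`walk_nodes`, `collect_datasets`) are made up for this example. With real files you would open each one with PyTables and iterate its nodes with the `File.walkNodes()` method referenced above.

```python
# Sketch of the suggested strategy: make the loop over files the
# outermost loop, walk each file's node tree, and pick out only the
# datasets of interest ("re" and "im"), skipping all other nodes.
# The HDF5 hierarchy is simulated here with nested dicts so the
# example is self-contained.

def walk_nodes(tree, path=""):
    """Yield (path, value) for every leaf node in a nested-dict 'file'."""
    for name, node in tree.items():
        node_path = f"{path}/{name}"
        if isinstance(node, dict):          # a group: recurse into it
            yield from walk_nodes(node, node_path)
        else:                               # a leaf dataset
            yield node_path, node

def collect_datasets(hdf5_files, wanted=("re", "im")):
    """Outer loop over files; gather only the wanted leaf datasets."""
    collected = {name: [] for name in wanted}
    for i, file in enumerate(hdf5_files):
        for node_path, data in walk_nodes(file):
            leaf = node_path.rsplit("/", 1)[-1]
            if leaf in wanted:              # skip over all others
                collected[leaf].append(data)
    return collected

# Two toy 'files' mirroring the tree layout from the original post.
files = [
    {"top": {"sub": {"avg": {"re": [1.0], "im": [2.0]}}}},
    {"top": {"sub": {"avg": {"re": [3.0], "im": [4.0]}}}},
]
result = collect_datasets(files)
print(result["re"])   # -> [[1.0], [3.0]]
```

The point of making the file loop outermost is that each file is opened, walked, and closed once, with the selected arrays appended to a running collection as you go.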
Hi Anthony,
> I am a little confused. Let me verify. You have 400 hdf5 files (re and im)
> buried in a unix directory tree. You want to make a single file which
> concatenates this data. Is this right?
Sorry for my description - that is not quite right.
The "unix directory tree" is the gr
Hi Andre,
I am a little confused. Let me verify. You have 400 hdf5 files (re and
im) buried in a unix directory tree. You want to make a single file
which concatenates this data. Is this right?
Be Well
Anthony
On Wed, Aug 15, 2012 at 6:52 PM, Andre' Walker-Loud wrote:
> Hi All,
>
> Just a strategy question.
Hi All,
Just a strategy question.
I have many hdf5 files containing data for different measurements of the same
quantities.
My directory tree looks like
top description [ group ]
  sub description [ group ]
    avg [ group ]
      re [ numpy array shape = (96,1,2) ]
      im [ numpy array shape
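One plausible reading of "concatenate this data" is to stack the per-file arrays along a new leading measurement axis. The 400 files and the (96, 1, 2) shape for `re` come from the posts above; the target layout itself is exactly what the thread is still clarifying, so this is a sketch of one option, not the asker's intended result. The placeholder arrays stand in for data read from the real files.

```python
import numpy as np

# Stack 400 per-file (96, 1, 2) arrays along a new axis 0, giving one
# array of shape (400, 96, 1, 2).  Zero-filled placeholders stand in
# for the 're' datasets that would be read out of each HDF5 file.
n_files = 400
per_file_re = [np.zeros((96, 1, 2)) for _ in range(n_files)]

stacked = np.stack(per_file_re, axis=0)
print(stacked.shape)   # -> (400, 96, 1, 2)
```

Writing `stacked` to a single output file would then give one dataset indexed by measurement, rather than 400 scattered ones.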