Thanks David, I'll look into it now.
Regarding the allocation/deallocation times, I don't think that is an issue for
me. The chunks are generated by a Fortran routine that takes several minutes to
run (I am collecting a few thousand points before saving to disk). They are
approximately, but not exactly, the same size. I want to store them in a
convenient way for later retrieval and analysis.
Thanks, regards


-----------------------------------------
You'll probably want an EArray: call createEArray() on a new h5file, then
append to it.

http://www.pytables.org/docs/manual/ch04.html#EArrayMethodsDescr
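Something along these lines should work (an untested sketch, using the
createEArray()/Float64Atom names from the manual above; "chunks.h5" and
"points" are just placeholder names, and since your chunks vary slightly in
length this simply appends each one end-to-end along a single extendable
axis):

    import numpy as np
    import tables

    h5file = tables.openFile("chunks.h5", mode="w")
    atom = tables.Float64Atom()
    # shape=(0,) makes the first (and only) dimension the extendable one
    earr = h5file.createEArray(h5file.root, "points", atom, shape=(0,))

    chunk = np.random.rand(1234)   # stand-in for one chunk from your Fortran code
    earr.append(chunk)             # appends len(chunk) new points to the array

    h5file.close()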

If your chunks are always the same size, it might be best to do your work
in place rather than allocate a new NumPy array each time. In theory,
'del'-ing the object when you're done with it should work, but the garbage
collector may not act quickly enough for your liking, and the allocation
step may start slowing you down.

What do I mean? Well, you could clear the array when you're done with it
using foo[:] = 0 (or nan, or whatever), and when you're "building it up",
use the in-place augmented assignment operators as much as possible
(+=, -=, *=, /=, %=, etc.).
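
For example (just a sketch; chunk_len, the loop, and the random data stand in
for your real chunk size and Fortran output):

    import numpy as np

    chunk_len = 1000                       # placeholder chunk size
    buf = np.empty(chunk_len)              # allocate the work array once

    for i in range(5):                     # stand-in for your loop over chunks
        buf[:] = 0.0                       # clear in place, no new allocation
        buf += np.random.rand(chunk_len)   # build it up with in-place operators
        buf *= 2.0
        # earr.append(buf)                 # then hand the finished chunk to the EArray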

David



