I have had some similar challenges in my work, and appending numpy arrays to HDF5 files using PyTables has been the solution for me. Used in combination with LZO compression/decompression, it has led to very high read/write performance in my application with low memory consumption. You may also want to have a look at the h5py package.

Kim
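A minimal sketch of what I mean, assuming PyTables is installed and built with the LZO filter (the file name, array shape, and chunk loop below are placeholders, not from Juan's code). An EArray has one extendable dimension, so each chunk is appended to disk as it is produced and never needs to be held in memory alongside the earlier data:

```python
import numpy as np
import tables

# LZO compression filter; if LZO is not available, complib='zlib' also works.
filters = tables.Filters(complib='lzo', complevel=5)

with tables.open_file('chunks.h5', mode='w') as h5:
    # shape=(0, 100): the leading 0 marks the dimension that grows on append.
    earray = h5.create_earray(h5.root, 'data',
                              atom=tables.Float64Atom(),
                              shape=(0, 100),
                              filters=filters)
    for _ in range(10):                    # stand-in for the real producer loop
        chunk = np.random.rand(1000, 100)  # a few thousand rows per chunk
        earray.append(chunk)               # written out incrementally
```

h5py offers a similar pattern with resizable datasets (create the dataset with `maxshape=(None, ...)`, then `resize` and write each chunk), so either package should cover the partial-save requirement.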
2009/8/11 Juan Fiol <fi...@yahoo.com>:
> Hi, I am creating numpy arrays in chunks and I want to save the chunks
> while my program creates them. I tried numpy.save, but it failed because
> it is not intended to append data. I'd like to know what is, in your
> opinion, the best way to go. I will write a few thousand values each time,
> building up a file of several gigabytes, and I do not want to load all
> previous data into memory each time. Also, I cannot wait until the program
> finishes; I must save partial results periodically. Thanks, any help will
> be appreciated.
> Juan
_______________________________________________ NumPy-Discussion mailing list NumPy-Discussion@scipy.org http://mail.scipy.org/mailman/listinfo/numpy-discussion