David Worrall (on 2007-08-15 at 08:40:28 +1000) said:

> Thanks Francesc,
>
> >>> tables.print_versions()
> -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
> PyTables version:  2.0
> HDF5 version:      1.6.5
> NumPy version:     1.0.3
> Zlib version:      1.2.3
> BZIP2 version:     1.0.2 (30-Dec-2001)
> Python version:    2.4.3 (#1, Mar 30 2006, 11:02:16)
> [GCC 4.0.1 (Apple Computer, Inc. build 5250)]
> Platform:          darwin-i386
> Byte-ordering:     little
> -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
> >>>
>
> My RAM is 256 MB on OS X 10.4.10 (the latest).
>
> It may be something to do with the way the OS manages memory. I've
> heard it's pretty aggressive, so if the number of groups is large,
> parts may be cached out before being accessed again. I don't know
> what I'm talking about really :-) but if you think that it is likely,
> perhaps I could raise the issue on the pythonmac-SIG?
Umm, we happen to have a very similar computer available, so I'll try to
reproduce the failure on it and investigate further.
> Also, is there any reason why I can't/shouldn't use 2 or more .h5
> files in parallel (split the 3500+ top-level groups into two DBs)?
Opening several HDF5 files under PyTables simultaneously should be
completely OK, but mind you, there is no such thing as "mounting" or
"transparent merging" of the files. However, writing a small script to
"zip" the files together later shouldn't be difficult; see the sketch
right after this paragraph. If you encounter memory problems, you could
try setting the NODE_MAX_SLOTS parameter to the value that usually
works for you divided by the number of files accessed in parallel.
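
For instance, here is a minimal sketch of such a "zip" script using the
PyTables 2.0 API (tables.openFile and File.copyChildren). The file names
part1.h5, part2.h5 and merged.h5, and the per-file cache budget of 32
slots (a total of 64 split between two files), are just assumptions for
illustration:

    import tables

    # Assume the node cache size that works for a single file is 64;
    # split it between the two files opened in parallel (32 each).
    # (In PyTables 2.0 the tunable parameters live in
    # tables/parameters.py; double-check in your version whether a
    # run-time change is picked up by openFile.)
    tables.parameters.NODE_MAX_SLOTS = 32

    part1 = tables.openFile('part1.h5', mode='r')
    part2 = tables.openFile('part2.h5', mode='r')
    merged = tables.openFile('merged.h5', mode='w')

    # Recursively copy every top-level group of both source files
    # into the root of the merged file.
    part1.copyChildren(part1.root, merged.root, recursive=True)
    part2.copyChildren(part2.root, merged.root, recursive=True)

    for f in (part1, part2, merged):
        f.close()

Note that copyChildren() will refuse to overwrite existing nodes unless
you pass overwrite=True, so keeping the 3500+ top-level groups in
disjoint sets per file makes the merge straightforward.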
HTH,
Ivan Vilata i Balaguer >qo< http://www.carabos.com/
Cárabos Coop. V. V V Enjoy Data
""
