Hi Stefan,
On Tuesday 17 April 2007 at 00:11, Stefan Kuzminski wrote:
> The user reported another file with this problem, but this manifestation
> raises the following stack trace. ( which is better than a core dump, I
> suppose.. )
>
> HDF5-DIAG: Error detected in HDF5 library version: 1.6.5 thread 0. Back
> trace follows.
> #000: H5Dio.c line 499 in H5Dread(): can't read data
> major(15): Dataset interface
> minor(24): Read failed
> #001: H5Dio.c line 756 in H5D_read(): can't read data
> major(15): Dataset interface
> minor(24): Read failed
> #002: H5Dio.c line 1786 in H5D_chunk_read(): data type conversion failed
> major(15): Dataset interface
> minor(29): Unable to initialize object
> #003: H5T.c line 4392 in H5T_convert(): data type conversion failed
> major(18): Attribute layer
> minor(50): Unable to encode value
> #004: H5Tconv.c line 2482 in H5T_conv_vlen(): incorrect length
> major(01): Function arguments
> minor(03): Inappropriate type
> Traceback (most recent call last):
> File "zz.py", line 9, in ?
> print fp.root._meta_.load.read()
> File "/home/srk/lib/python2.4/site-packages/tables/vlarray.py", line 578, in read
>     listarr = self._readArray(start, stop, step)
> File "hdf5Extension.pyx", line 1202, in hdf5Extension.VLArray._readArray
> tables.exceptions.HDF5ExtError: VLArray._readArray: Problems reading the
> array data.
Mmmm, this looks like either an HDF5 bug or a corrupted file (although
corruption seems unlikely, since it sounds like you can consistently
generate files that reproduce the problem).
I've tried with both HDF5 1.6.5 (which consistently segfaults) and a recent
version of the HDF5 1.8 series (1.8.0-of20070404), which is a bit more
explicit:
HDF5-DIAG: Error detected in HDF5 (1.8.0-of20070404) thread 0:
#000: H5Dio.c line 526 in H5Dread(): can't read data
major: Dataset
minor: Read failed
#001: H5Dio.c line 744 in H5D_read(): can't read data
major: Dataset
minor: Read failed
#002: H5Dio.c line 1708 in H5D_chunk_read(): data type conversion failed
major: Dataset
minor: Unable to initialize object
#003: H5T.c line 4612 in H5T_convert(): data type conversion failed
major: Attribute
minor: Unable to encode value
#004: H5Tconv.c line 2855 in H5T_conv_vlen(): can't read VL data
major: Datatype
minor: Read failed
#005: H5Tvlen.c line 835 in H5T_vlen_disk_read(): Unable to read VL
information
major: Datatype
minor: Read failed
#006: H5HG.c line 1111 in H5HG_read(): unable to load heap
major: Heap
minor: Unable to load metadata into cache
#007: H5AC.c line 1935 in H5AC_protect(): H5C_protect() failed.
major: Object cache
minor: Unable to protect metadata
#008: H5C.c line 5476 in H5C_protect(): can't load entry
major: Object cache
minor: Unable to load metadata into cache
#009: H5C.c line 9401 in H5C_load_entry(): unable to load entry
major: Object cache
minor: Unable to load metadata into cache
#010: H5HG.c line 350 in H5HG_load(): unable to read global heap collection
major: Heap
minor: Read failed
#011: H5F.c line 2986 in H5F_block_read(): file read failed
major: Low-level I/O
minor: Read failed
#012: H5FD.c line 3348 in H5FD_read(): driver read request failed
major: Virtual File Layer
minor: Read failed
#013: H5FDsec2.c line 725 in H5FD_sec2_read(): addr overflow
major: Invalid arguments to routine
minor: Address overflowed
The last error in the HDF5 stack ("addr overflow") indicates that HDF5 was
asked to read data past the end of the file, which suggests that the problem
is on the HDF5 side when reading the file (or that the file's global heap
metadata points at a bogus address).
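To make that final frame concrete, the sec2 driver's "addr overflow" check
essentially rejects any read whose start address plus length runs past the
end of the file. Here is a toy sketch of that check (illustrative Python
only, not HDF5's actual C code; the offsets and sizes are made up):

```python
def sec2_read_ok(addr, size, eof):
    """Return True when a read of `size` bytes at offset `addr`
    stays within a file of `eof` bytes (the kind of bounds check
    the sec2 driver performs before reading)."""
    return addr + size <= eof

# A global-heap address recorded in the file that points past the real
# end of file fails this check, producing the "addr overflow" error:
print(sec2_read_ok(1000, 512, 4096))   # in bounds -> True
print(sec2_read_ok(5000, 512, 4096))   # past EOF  -> False
```

So a dataset whose VL data lives in a heap collection at a bad address will
fail exactly at frame #013, as in the trace above.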
The offending dataset is '/_meta_/variables' (a VLString). Stefan, are you
able to generate a simple HDF5 file, preferably containing only this dataset,
that reproduces the problem? If so, can you send me the PyTables code,
please? If you can't, I'll send the file to the HDF5 crew to see if they can
shed more light on this.
As an aside, here is what the h5ls tool (from the HDF5 1.8.0-of20070404
version) says about it:
$ h5ls -rv TRANSACTIONS248.h5/_meta_/variables
Opened "TRANSACTIONS248.h5" with sec2 driver.
/_meta_/variables Dataset {1/Inf}
Attribute: FLAVOR scalar
Type: 7-byte null-terminated ASCII string
Data: "Object"
Attribute: CLASS scalar
Type: 8-byte null-terminated ASCII string
Data: "VLARRAY"
Attribute: TITLE scalar
Type: 1-byte null-terminated ASCII string
Data: ""
Attribute: VERSION scalar
Type: 4-byte null-terminated ASCII string
Data: "1.2"
Location: 1:2781094
Links: 1
Modified: 2007-04-13 00:43:43 CEST
Chunks: {1024} 8192 bytes
Storage: 8 logical bytes, 16384 allocated bytes, 0.05% utilization
Type: variable length of
native unsigned char
which seems fine. But when trying to read the content:
$ h5ls -rd TRANSACTIONS248.h5/_meta_/variables
/_meta_/variables Dataset {1/Inf}
Data:
Unable to print data.
However, using h5ls from HDF5 1.6.5 gives a segfault:
$ h5ls -rd TRANSACTIONS248.h5/_meta_/variables
/_meta_/variables Dataset {1/Inf}
Data:
Segmentation fault
Please see if you can find a simple way of reproducing a broken file and let
me know about your findings.
Thanks,
--
>0,0< Francesc Altet http://www.carabos.com/
V V Cárabos Coop. V. Enjoy Data
"-"
_______________________________________________
Pytables-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/pytables-users