Hi all:

I want to add that we had a similar issue with h5py: it would crash when we 
wrote certain strings. Nothing reasonable fixed it, not even upgrading to 
Windows 10. Eventually we reformatted the machine, and that solved the problem. 
My guess is that the problem had something to do with shared-library linking, 
because it also caused some other problems with PyQt. I don't know whether this 
is really related, but when I read this report I bet it would be Windows 7, and 
I won the bet. 

Cheers, 
Samer Afach 

On March 6, 2017 9:56:32 PM GMT+01:00, Barbara Jones <[email protected]> 
wrote:
>Hello Ruxandra,
>
>I saw in the Stack Overflow issue that you referenced that the user had
>contacted The HDF Group's support. So I went through old helpdesk issues
>and found that user's ticket. We never actually determined what his issue
>was, but we did determine that the problem was specific to his machine.
>
>I sent him an HDF5 C example program that creates a 100+ GB file. When he
>compiled and ran the program on his machine, it failed as well. However, he
>could run the same program successfully on another Windows machine.
>
>I still have the example program and have attached it to this message in
>case you find it helpful (h5cmpstr.c).
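>
>For reference, a minimal sketch of that kind of large-write test is shown
>below. This is not the attached h5cmpstr.c; the file name, dataset layout,
>string length, and write count are illustrative assumptions only, and you
>would raise N_WRITES to push the file past 100 GB:
>
>/* Sketch: repeatedly append a chunked fixed-length string dataset so the
> * file grows large enough to exercise H5FD_sec2_write at big offsets.   */
>#include "hdf5.h"
>#include <stdio.h>
>
>#define STR_LEN  64                /* fixed string length (assumed)      */
>#define CHUNK    4096              /* rows per chunk and per write       */
>#define N_WRITES 1000              /* raise this to grow past 100 GB     */
>
>int main(void)
>{
>    hsize_t dims[1]    = {0};
>    hsize_t maxdims[1] = {H5S_UNLIMITED};
>    hsize_t chunk[1]   = {CHUNK};
>    static char buf[CHUNK][STR_LEN];
>
>    /* Fill the write buffer with dummy strings. */
>    for (unsigned i = 0; i < CHUNK; i++)
>        snprintf(buf[i], STR_LEN, "row-%u", i);
>
>    hid_t file  = H5Fcreate("big_test.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
>    hid_t strty = H5Tcopy(H5T_C_S1);
>    H5Tset_size(strty, STR_LEN);
>
>    hid_t space = H5Screate_simple(1, dims, maxdims);
>    hid_t dcpl  = H5Pcreate(H5P_DATASET_CREATE);
>    H5Pset_chunk(dcpl, 1, chunk);
>    hid_t dset  = H5Dcreate2(file, "strings", strty, space,
>                             H5P_DEFAULT, dcpl, H5P_DEFAULT);
>
>    for (hsize_t n = 0; n < N_WRITES; n++) {
>        /* Extend the dataset by one chunk and write into the new region. */
>        hsize_t newsize[1] = {(n + 1) * CHUNK};
>        hsize_t start[1]   = {n * CHUNK};
>        hsize_t count[1]   = {CHUNK};
>        H5Dset_extent(dset, newsize);
>
>        hid_t fspace = H5Dget_space(dset);
>        H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, count, NULL);
>        hid_t mspace = H5Screate_simple(1, count, NULL);
>
>        if (H5Dwrite(dset, strty, mspace, fspace, H5P_DEFAULT, buf) < 0) {
>            fprintf(stderr, "write %llu failed\n", (unsigned long long)n);
>            break;
>        }
>        H5Sclose(mspace);
>        H5Sclose(fspace);
>    }
>
>    H5Dclose(dset);
>    H5Pclose(dcpl);
>    H5Sclose(space);
>    H5Tclose(strty);
>    H5Fclose(file);
>    return 0;
>}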
>
>-Barbara
>[email protected]
>
>From: Hdf-forum [mailto:[email protected]] On Behalf
>Of Ruxandra Marasescu
>Sent: Tuesday, February 28, 2017 9:40 AM
>To: [email protected]
>Subject: [Hdf-forum] H5FD_sec2_write failed to write error EINVAL
>
>Hello,
>
>We have a similar issue to the one mentioned here:
>http://stackoverflow.com/questions/32077579/exception-appending-dataframe-chunk-with-string-values-to-large-hdf5-file-using
>
>Was there ever a solution to it?
>
>H5FD_sec2_write
>file write failed: time = Tue Aug 18 18:26:17 2015
>, filename = 'large_file.h5', file descriptor = 4, errno = 22, error
>message = 'Invalid argument', buf = 0000000066A40018, total write size
>= 262095, bytes this sub-write = 262095, bytes actually written =
>18446744073709551615, offset = 47615949533
>
>
>The error we are having is:
>#000: C:\dev\hdf5-1.8.17\src\H5Dio.c line 271 in H5Dwrite(): can't
>prepare for writing data
>    major: Dataset
>    minor: Write failed
>#001: C:\dev\hdf5-1.8.17\src\H5Dio.c line 347 in H5D__pre_write():
>can't write chunk directly
>    major: Dataset
>    minor: Write failed
>#002: C:\dev\hdf5-1.8.17\src\H5Dchunk.c line 392 in
>H5D__chunk_direct_write(): unable to write raw data to file
>    major: Dataset
>    minor: Write failed
>#003: C:\dev\hdf5-1.8.17\src\H5Fio.c line 171 in H5F_block_write():
>write through metadata accumulator failed
>    major: Low-level I/O
>    minor: Write failed
>#004: C:\dev\hdf5-1.8.17\src\H5Faccum.c line 825 in H5F__accum_write():
>file write failed
>    major: Low-level I/O
>    minor: Write failed
>#005: C:\dev\hdf5-1.8.17\src\H5FDint.c line 260 in H5FD_write(): driver
>write request failed
>    major: Virtual File Layer
>    minor: Write failed
>#006: C:\dev\hdf5-1.8.17\src\H5FDsec2.c line 802 in H5FD_sec2_write():
>file write failed: time = Tue Feb 28 10:00:02 2017
>, filename = 'C:\hdf_file.fast5', file descriptor = 4, errno = 22,
>error message = 'Invalid argument', buf = 00000000C4F6D950, total write
>size = 41681, bytes this sub-write = 41681, bytes actually written =
>18446744073709551615, offset = 142297119656
>    major: Low-level I/O
>    minor: Write failed
>
>Thank you,
>Ruxandra

-- 
Sent from my Android device. 
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5
