Hello Ruxandra,
I saw in the Stack Overflow question you referenced that the user had
contacted The HDF Group's support, so I went through our old helpdesk issues
and found that user's ticket. We never actually determined the root cause,
but we did determine that the problem was specific to his machine.
I had sent him an HDF5 C example program that creates a 100+ GB file. When he
compiled and ran the program on his machine, it failed in the same way;
however, he could run it successfully on another Windows machine.
I still have the example program and have attached it to this message
(h5cmpstr.c) in case you find it helpful.
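One note on reading the error stacks, in case it helps: "bytes actually
written = 18446744073709551615" is simply -1 printed as an unsigned 64-bit
number; the low-level write call inside HDF5's sec2 driver failed with
errno 22 ("Invalid argument"), so the error comes from the operating system's
write rather than from HDF5's own bookkeeping. A minimal sketch, not
HDF5-specific, showing where those numbers come from:

#include <errno.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    long long bytes_wrote = -1;   /* what a failed low-level write reports */

    /* Printed as an unsigned 64-bit value, -1 becomes 18446744073709551615 */
    printf("bytes actually written = %llu\n", (unsigned long long)bytes_wrote);

    /* errno 22 is EINVAL ("Invalid argument") on both Windows and Linux */
    printf("errno = %d, error message = '%s'\n", EINVAL, strerror(EINVAL));

    return 0;
}

If you want to try the attached program, on Linux it should build with the
h5cc wrapper that ships with HDF5 (h5cc h5cmpstr.c -o h5cmpstr); on Windows
you would compile and link it against the HDF5 library in Visual Studio, with
the exact settings depending on your installation. Be aware that it writes
well over 100 GB of data.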
-Barbara
[email protected]
From: Hdf-forum [mailto:[email protected]] On Behalf Of
Ruxandra Marasescu
Sent: Tuesday, February 28, 2017 9:40 AM
To: [email protected]
Subject: [Hdf-forum] H5FD_sec2_write failed to write error EINVAL
Hello,
We have an issue similar to the one mentioned here:
http://stackoverflow.com/questions/32077579/exception-appending-dataframe-chunk-with-string-values-to-large-hdf5-file-using
Was a solution ever found for it?
H5FD_sec2_write
file write failed: time = Tue Aug 18 18:26:17 2015
, filename = 'large_file.h5', file descriptor = 4, errno = 22, error message =
'Invalid argument', buf = 0000000066A40018, total write size = 262095, bytes
this sub-write = 262095, bytes actually written = 18446744073709551615, offset
= 47615949533
The error we are having is:
#000: C:\dev\hdf5-1.8.17\src\H5Dio.c line 271 in H5Dwrite(): can't prepare for
writing data
major: Dataset
minor: Write failed
#001: C:\dev\hdf5-1.8.17\src\H5Dio.c line 347 in H5D__pre_write(): can't
write chunk directly
major: Dataset
minor: Write failed
#002: C:\dev\hdf5-1.8.17\src\H5Dchunk.c line 392 in
H5D__chunk_direct_write(): unable to write raw data to file
major: Dataset
minor: Write failed
#003: C:\dev\hdf5-1.8.17\src\H5Fio.c line 171 in H5F_block_write(): write
through metadata accumulator failed
major: Low-level I/O
minor: Write failed
#004: C:\dev\hdf5-1.8.17\src\H5Faccum.c line 825 in H5F__accum_write(): file
write failed
major: Low-level I/O
minor: Write failed
#005: C:\dev\hdf5-1.8.17\src\H5FDint.c line 260 in H5FD_write(): driver write
request failed
major: Virtual File Layer
minor: Write failed
#006: C:\dev\hdf5-1.8.17\src\H5FDsec2.c line 802 in H5FD_sec2_write(): file
write failed: time = Tue Feb 28 10:00:02 2017
, filename = 'C:\hdf_file.fast5', file descriptor = 4, errno = 22, error
message = 'Invalid argument', buf = 00000000C4F6D950, total write size = 41681,
bytes this sub-write = 41681, bytes actually written = 18446744073709551615,
offset = 142297119656
major: Low-level I/O
minor: Write failed
Thank you,
Ruxandra
/*
h5cmpstr.c - String in Compound Datatype
*/
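/*
 * Rough size check (assuming no padding in s1_t): sizeof(s1_t) = 10 + 13 = 23
 * bytes, so each full write of s1[] is 23 * 9,000,000 = 207,000,000 bytes
 * (about 207 MB), and the initial write plus 500 extensions comes to roughly
 * 501 * 207 MB, i.e. a little over 100 GB.
 */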
#include <stdio.h>
#include <string.h>
#include "hdf5.h"

#define FILENAME    "cstr.h5"
#define DATASETNAME "Compound String"
#define LENGTH      9000000
#define RANK        1

typedef struct s1_t {
    char astr[10];
    char bstr[13];
} s1_t;

/* Kept global so the ~200 MB buffer is not placed on the stack */
s1_t s1[LENGTH];

int
main(void)
{
    hid_t    file, dataset, space, atype, btype, s1_tid;
    hid_t    pid, memspace;
    herr_t   status;
    hsize_t  dim[] = {LENGTH}, maxdims[] = {H5S_UNLIMITED};
    hsize_t  chkdims[1] = {1000000};
    hsize_t  offset[1] = {0};       /* start of each appended hyperslab */
    hsize_t  count[1] = {LENGTH};
    size_t   size;
    hsize_t  newsize[1] = {LENGTH};
    long long int i, j;

    for (i = 0; i < LENGTH; i++)
    {
        strcpy (s1[i].astr, "Astronomy");      /* 9 chars + NUL fill astr[10] exactly  */
        strcpy (s1[i].bstr, "Biochemistry");   /* 12 chars + NUL fill bstr[13] exactly */
    }
    /*
     * Create the file.
     */
    file = H5Fcreate(FILENAME, H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    printf ("H5Fcreate: %i\n", file);

    /*
     * Create the memory datatype (a compound of two fixed-length strings).
     */
    atype = H5Tcopy (H5T_C_S1);
    size = 10;
    status = H5Tset_size (atype, size);
    btype = H5Tcopy (H5T_C_S1);
    size = 13;
    status = H5Tset_size (btype, size);
    s1_tid = H5Tcreate (H5T_COMPOUND, sizeof(s1_t));
    printf ("H5Tcreate: %i\n", s1_tid);
    status = H5Tinsert(s1_tid, "a_string", HOFFSET(s1_t, astr), atype);
    printf ("H5Tinsert (a_string): %i\n", status);
    status = H5Tinsert(s1_tid, "b_string", HOFFSET(s1_t, bstr), btype);
    printf ("H5Tinsert (b_string): %i\n", status);

    /*
     * Create the dataspace.
     */
    space = H5Screate_simple (RANK, dim, maxdims);
    printf ("H5Screate_simple: %i\n", space);

    /*
     * Set chunking so that an extendible (appendable) dataset can be created.
     */
    pid = H5Pcreate (H5P_DATASET_CREATE);
    status = H5Pset_chunk (pid, RANK, chkdims);
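    /*
     * Rough chunk size (an estimate, assuming no compound padding):
     * 1,000,000 elements per chunk x 23 bytes per element is about 23 MB
     * of raw data per chunk.
     */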
    /*
     * Create the dataset.
     */
    dataset = H5Dcreate(file, DATASETNAME, s1_tid, space, H5P_DEFAULT, pid,
                        H5P_DEFAULT);
    printf ("H5Dcreate: %i\n", dataset);
    status = H5Sclose (space);
    status = H5Pclose (pid);

    /*
     * Write data to the dataset.
     */
    status = H5Dwrite(dataset, s1_tid, H5S_ALL, H5S_ALL, H5P_DEFAULT, s1);
    printf ("H5Dwrite: %i\n", status);

    memspace = H5Screate_simple (RANK, dim, NULL);
    printf ("H5Screate_simple: %i\n", memspace);

    /*
     * Repeatedly extend the dataset and append another LENGTH elements,
     * so the file grows well past 100 GB.
     */
    for (j = 0; j < 500; j++) {
        offset[0] = newsize[0];
        newsize[0] = newsize[0] + LENGTH;
        status = H5Dextend (dataset, newsize);
        printf ("H5Dextend: %i\n", status);
        space = H5Dget_space (dataset);
        printf ("H5Dget_space: %i\n", space);
        status = H5Sselect_hyperslab (space, H5S_SELECT_SET, offset, NULL,
                                      count, NULL);
        printf ("H5Sselect_hyperslab: %i\n", status);
        status = H5Dwrite(dataset, s1_tid, memspace, space, H5P_DEFAULT, s1);
        printf ("H5Dwrite: %i\n", status);
        status = H5Sclose (space);
        printf ("H5Sclose returns: %i\n", status);
    }
    /*
     * Release resources.
     */
    status = H5Sclose (memspace);
    printf ("H5Sclose (memspace): %i\n", status);
    status = H5Tclose(atype);
    printf ("H5Tclose (atype): %i\n", status);
    status = H5Tclose(btype);
    printf ("H5Tclose (btype): %i\n", status);
    status = H5Tclose(s1_tid);
    printf ("H5Tclose (s1_tid): %i\n", status);
    status = H5Dclose(dataset);
    printf ("H5Dclose (dataset): %i\n", status);
    status = H5Fclose(file);
    printf ("H5Fclose (file): %i\n", status);

    return 0;
}
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5