Dear All,

Apologies for cross-posting! If anyone has come across the issue below,
please let me know.

I was trying to write a compound datatype (containing a region reference)
in parallel using collective operations. With h5dump I saw that the region
references were not created correctly. I looked into the "Collective
Calling Requirements" document and then realised that writing region
references is not supported with parallel I/O. A simple test example gives
the error "*H5Dio.c line 664 in H5D__write(): Parallel IO does not support
writing region reference datatypes yet*".

But note that this error doesn't appear if I write a compound datatype
containing a region reference. Is this allowed/possible? (i.e. first
create the region reference, store it as a member of a compound datatype,
and then write the compound datatype collectively in parallel.)
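For reference, here is a minimal serial sketch (h5py, with hypothetical file/dataset names) of what I mean by storing a region reference as a member of a compound datatype; the open question is whether the same write is valid when done collectively through the parallel C API:

```python
import numpy as np
import h5py

with h5py.File("cells.h5", "w") as f:
    # the underlying data that the region references point into
    data = f.create_dataset("data", data=np.arange(100))

    # compound type: an integer id plus a region-reference member
    dt = np.dtype([("gid", np.int64),
                   ("region", h5py.regionref_dtype)])
    cells = f.create_dataset("cells", shape=(2,), dtype=dt)

    # each element stores a reference to a slice of /data
    cells[0] = (1, data.regionref[0:10])
    cells[1] = (2, data.regionref[10:25])
```

Reading back, `f["data"][ref]` dereferences the stored region and returns just the selected elements.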

The dataset that I am creating is for neuron cells, which are very diverse
in size. Each MPI rank writes variable-length data to the dataset, and I
thought region references would be helpful, but now I have this issue when
writing them.

What are the possible alternatives? I can think of:
- Write a <count, offset> pair to emulate a region reference (but this
loses flexibility: other users will read the datasets with tools/Python
libraries, and would then have to slice with the count+offset pair
themselves).
- Write the datasets first, and then have a single rank create and write
the region references (each cell has thousands of region references, so
for large simulations this may not scale?).
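To make the first alternative concrete, here is a sketch of the <count, offset> bookkeeping, using hypothetical per-rank counts; in real parallel code the exclusive prefix sum would come from MPI_Exscan rather than a local loop:

```python
from itertools import accumulate

# hypothetical element counts, one per MPI rank, for the
# variable-length cell data each rank will write
counts = [5, 3, 8, 2]

# exclusive prefix sum gives each rank's write offset into the
# single concatenated dataset (MPI_Exscan in real parallel code)
offsets = [0] + list(accumulate(counts))[:-1]

# store an (offset, count) pair per cell instead of a region
# reference; readers then slice dataset[offset : offset + count]
pairs = list(zip(offsets, counts))
print(pairs)  # [(0, 5), (5, 3), (8, 8), (16, 2)]
```

This keeps the collective write fully supported, at the cost of readers having to do the slicing themselves instead of dereferencing a region reference.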

Any other suggestions?

Thanks,
Pramod
_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5