Hi Werner,
Yeah, the difference in the number of parameters is curious. And no, all
the hdf5-blosc plugin versions take 7 params (4 reserved, hence the four 0s)
and 3 for the actual compression parameters (complevel, shuffle, codec), so
I don't quite understand why 8 (with five 0s) works for me, but it is indeed
the required number on Unix boxes (I am using HDF5 1.8.x there, so maybe the
difference is just between HDF5 1.8.x and 1.10.x).
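For reference, that parameter layout can be sketched with a small helper that composes the spec string for h5repack's -f option (a hypothetical helper, not part of hdf5-blosc; the codec numbers follow the values used in the commands in this thread):

```python
# Hypothetical helper: build the "-f UD=..." argument for h5repack to
# apply the Blosc filter (ID 32001). The first 4 cd_values are reserved
# (the filter fills them in itself, so they are passed as 0); the last
# 3 are the user settings: complevel, shuffle, codec.

BLOSC_FILTER_ID = 32001
# Codec numbering as used in cd_values[6] by the hdf5-blosc filter
CODECS = {"blosclz": 0, "lz4": 1, "lz4hc": 2, "snappy": 3,
          "zlib": 4, "zstd": 5}

def blosc_ud_spec(complevel, shuffle, codec):
    """Return the UD=<id>,<nelmts>,<values...> spec string."""
    cd_values = [0, 0, 0, 0, complevel, shuffle, CODECS[codec]]
    return "UD=%d,%d,%s" % (BLOSC_FILTER_ID, len(cd_values),
                            ",".join(str(v) for v in cd_values))

# The 7-parameter form (complevel 4, shuffle 2, lz4):
print(blosc_ud_spec(4, 2, "lz4"))   # -> UD=32001,7,0,0,0,0,4,2,1
```

h5repack would then be invoked as `h5repack -v -f <spec> in.h5 out.h5`.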
I have just tried with 7 params on Windows, but I am having the same
problem again:
λ h5repack -v -f UD=32001,7,0,0,0,0,4,2,1 carray1-zlib.h5 carray1-repack.h5
Objects to modify layout are...
Objects to apply filter are...
User Defined 32001
Making file <carray1-repack.h5>...
-----------------------------------------
Type Filter (Compression) Name
-----------------------------------------
group /
HDF5-DIAG: Error detected in HDF5 (1.10.1) thread 0:
#000:
C:\autotest\hdf5110-StdRelease-code-10vs14\build\hdfsrc\src\H5Pocpl.c line
1037 in H5Pget_filter_by_id2(): can't find object for ID
major: Object atom
minor: Unable to find atom information (already closed?)
#001:
C:\autotest\hdf5110-StdRelease-code-10vs14\build\hdfsrc\src\H5Pint.c line
3815 in H5P_object_verify(): property list is not a member of the class
major: Property lists
minor: Unable to register new atom
#002:
C:\autotest\hdf5110-StdRelease-code-10vs14\build\hdfsrc\src\H5Pint.c line
3765 in H5P_isa_class(): not a property list
major: Invalid arguments to routine
minor: Inappropriate type
warning: could not create dataset </carray>. Applying original settings
dset /carray
<warning: could not apply the filter to /carray>
Anyway, for the record, I ended up using ptrepack to convert HDF5 files
into Blosc-compressed ones:
λ ptrepack --complib blosc:lz4 --complevel=5 --shuffle 1 carray1-zlib.h5
carray1-repack.h5 -o
λ ptdump carray1-repack.h5
/ (RootGroup) ''
/carray (CArray(200, 300), shuffle, blosc:lz4(5)) ''
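As a side note, the cd_values that h5ls reports for blosc-32001 (e.g. {2, 2, 1, 65536, 4, 2, 1} in one of the listings in this thread) can be decoded as follows. This is a sketch based on the slot layout in hdf5-blosc's blosc_filter.c; the field names are mine:

```python
# Sketch: decode the cd_values h5ls prints for "Filter-0: blosc-32001".
# The first four slots are filled in by the filter itself when the
# dataset is created; the last three are the user-supplied settings.

CODEC_NAMES = {0: "blosclz", 1: "lz4", 2: "lz4hc", 3: "snappy",
               4: "zlib", 5: "zstd"}

def decode_blosc_cd_values(cd_values):
    v = list(cd_values)
    return {
        "filter_revision": v[0],        # revision of the filter format
        "blosc_version_format": v[1],   # Blosc format version
        "typesize": v[2],               # element size in bytes
        "chunksize": v[3],              # uncompressed chunk size in bytes
        "complevel": v[4],
        "shuffle": v[5],
        "codec": CODEC_NAMES.get(v[6], "unknown"),
    }

info = decode_blosc_cd_values([2, 2, 1, 65536, 4, 2, 1])
print(info["codec"], info["complevel"])   # -> lz4 4
```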
Thanks again for all the help,
Francesc
2017-07-07 16:18 GMT+02:00 Werner Benger <[email protected]>:
> Hi Francesc,
>
> I got it to run with 1.10.1 and GCC 7.1 under MSYS2, kind of, but passing
> 8 parameters makes it hang forever; it only works for me when giving 7
> parameters to h5repack:
>
> /bin/arch-x86_64-w64-mingw32-OptDeb/h5repack -v -f UD=32001,7,0,0,0,0,4,2,1
> myRGBA.f5 myRGBA-repack.f5
> Objects to modify layout are...
> Objects to apply filter are...
> User Defined 32001
> Making file <myRGBA-repack.f5>...
> -----------------------------------------
> Type Filter (Compression) Name
> ...
>
> dset UD (1.582:1)
> /t=000000000.0000000000/RGBA/Points/StandardCartesianChart3D/Height/Fragment[0x0x0]
>
> ...
>
>
> The file looks all fine:
>
>
> ./bin/arch-x86_64-w64-mingw32-OptDeb/h5ls.exe -rv
> myRGBA-repack.f5/t=000000000.0000000000/RGBA/Points/
> StandardCartesianChart3D/ReturnNumber/Fragment[0x0x0]
> Opened "myRGBA-repack.f5" with sec2 driver.
> t=000000000.0000000000/RGBA/Points/StandardCartesianChart3D/ReturnNumber/Fragment[0x0x0]
> Dataset {1/1, 256/256, 256/256}
> Data: Contiguous
> Location: 1:4283257
> Links: 1
> Chunks: {1, 256, 256} 65536 bytes
> Storage: 65536 logical bytes, 21807 allocated bytes, 300.53% utilization
> Filter-0: blosc-32001 {2, 2, 1, 65536, 4, 2, 1}
> Type: native unsigned char
>
>
> whereas the setting with 8 parameters hangs forever:
>
> ./bin/arch-x86_64-w64-mingw32-OptDeb/h5repack -v -f
> UD=32001,8,0,0,0,0,0,4,2,1 myRGBA.f5 myRGBA-repack.f5
>
>
> If you want to try my configuration, try this in MSYS2:
>
> svn co https://subversion.assembla.com/svn/FiberHDF5/hdf5; cd hdf5;
> mingw32-make -j12
>
>
> Maybe I am using an older version of blosc that doesn't like 8 parameters?
>
> Werner
>
>
> On 04.07.2017 15:50, Werner Benger wrote:
>
> Hi,
>
> I have not tried HDF5 1.10.1 with MinGW yet, but I wanted to give it a try
> in the next few days anyway and can let you know how it goes. I'm not using
> cmake but my own makefiles (I think you have access to them anyway), so for
> now I am on the 1.8.17 version. I'll let you know once I get to check with
> 1.10.1.
>
> Werner
>
> On 04.07.2017 14:35, Francesc Alted wrote:
>
> Hi,
>
> Today I spent some more time on this issue, and I recompiled everything
> (the Blosc library and its plugin) against HDF5 1.10.1 (the binaries from
> hdfgroup.org) in 64 bit mode. Again, I can read Blosc-compressed HDF5
> files with h5ls and h5dump (so I am pretty sure that the plugin is
> correctly compiled), but h5repack is still failing:
>
> """
> C:\tmp
>
> λ h5repack -v -f UD=32001,8,0,0,0,0,0,9,1,2 carray1.h5 carray1-repack.h5
>
> Objects to modify layout are...
>
> Objects to apply filter are...
>
> User Defined 32001
>
> Making file <carray1-repack.h5>...
>
> -----------------------------------------
>
> Type Filter (Compression) Name
>
> -----------------------------------------
>
> group /
>
> HDF5-DIAG: Error detected in HDF5 (1.10.1) thread 0:
>
> #000: C:\autotest\hdf5110-StdRelease-code-10vs14\build\hdfsrc\src\H5Pocpl.c
> line 1037 in H5Pget_filter_by_id2(): can't
> find object for ID
>
> major: Object atom
>
> minor: Unable to find atom information (already closed?)
>
> #001: C:\autotest\hdf5110-StdRelease-code-10vs14\build\hdfsrc\src\H5Pint.c
> line 3815 in H5P_object_verify(): property
> list is not a member of the class
>
> major: Property lists
>
> minor: Unable to register new atom
>
> #002: C:\autotest\hdf5110-StdRelease-code-10vs14\build\hdfsrc\src\H5Pint.c
> line 3765 in H5P_isa_class(): not a property list
>
> major: Invalid arguments to routine
>
> minor: Inappropriate type
>
> warning: could not create dataset </carray>. Applying original settings
>
> dset /carray
>
> <warning: could not apply the filter to /carray>
>
>
>
> """
>
> At this point, I don't know what to do. I am using VS2017 Community
> Edition; could it be that I need VS2015 to compile the plugin?
>
> Also, I have tried to use MinGW-w64 to compile the plugin, but cmake has
> trouble finding the HDF5 installation:
>
> """
>
> faltet@FRANCESCALT476E MINGW64 /c/Users/faltet/hdf5-blosc/build
>
> $ HDF5_ROOT="/c/Program Files/HDF_Group/HDF5/1.10.1" cmake ..
>
> System is unknown to cmake, create:
>
> Platform/MINGW64_NT-10.0 to use this system, please send your config file
> to [email protected] so it can be added to cmake
>
> Your CMakeCache.txt file was copied to CopyOfCMakeCache.txt. Please send
> that file to [email protected].
>
> BLOSC_PREFIX='/c/Users/faltet/hdf5-blosc/build/blosc'
>
> BLOSC_INSTALL_DIR='/c/Users/faltet/hdf5-blosc/build/blosc'
>
> BLOSC_CMAKE_ARGS='-DCMAKE_INSTALL_PREFIX=/c/Users/
> faltet/hdf5-blosc/build/blosc'
>
> GIT_EXECUTABLE='/usr/bin/git.exe'
>
> CMake Error at /usr/share/cmake-3.6.2/Modules/FindPackageHandleStandardArgs.cmake:148 (message):
>
> Could NOT find HDF5 (missing: HDF5_LIBRARIES HDF5_HL_LIBRARIES) (found
>
> version "1.10.1")
>
> Call Stack (most recent call first):
>
> /usr/share/cmake-3.6.2/Modules/FindPackageHandleStandardArgs.cmake:388
> (_FPHSA_FAILURE_MESSAGE)
>
> /usr/share/cmake-3.6.2/Modules/FindHDF5.cmake:820
> (find_package_handle_standard_args)
>
> CMakeLists.txt:48 (find_package)
>
>
>
> -- Configuring incomplete, errors occurred!
>
> See also "/c/Users/faltet/hdf5-blosc/build/CMakeFiles/CMakeOutput.log".
> """
>
> Thanks!
>
>
> 2017-06-28 22:25 GMT+02:00 Werner Benger <[email protected]>:
>
>> Francesc, I am getting this message, which is correct since I don't have
>> it compiled with zstd.
>>
>> h5repack -f UD=32001,8,0,0,0,0,0,9,1,5 myRGBA.f5 myRGBA-out.f5
>> HDF5-DIAG: Error detected in HDF5 (1.8.17) thread 10592:
>> #000: C:/msys64/home/Wrk/vish/fish/fiber/FiberHDF5/hdf5-1.8.17/H5D.c line
>> 426 in H5Dclose(): can't decrement count on dataset ID
>> major: Dataset
>> minor: Unable to decrement reference count
>> #001: C:/msys64/home/Wrk/vish/fish/fiber/FiberHDF5/hdf5-1.8.17/H5I.c line
>> 1549 in H5I_dec_app_ref_always_close(): can't decrement ID ref count
>> major: Object atom
>> minor: Unable to decrement reference count
>> #002: C:/msys64/home/Wrk/vish/fish/fiber/FiberHDF5/hdf5-1.8.17/H5I.c line
>> 1491 in H5I_dec_app_ref(): can't decrement ID ref count
>> major: Object atom
>> minor: Unable to decrement reference count
>> #003: C:/msys64/home/Wrk/vish/fish/fiber/FiberHDF5/hdf5-1.8.17/H5Dint.c
>> line 1544 in H5D_close(): unable to destroy chunk cache
>> major: Dataset
>> minor: Unable to release object
>> #004: C:/msys64/home/Wrk/vish/fish/fiber/FiberHDF5/hdf5-1.8.17/H5Dchunk.c
>> line 5164 in H5D__chunk_dest(): unable to flush one or more raw data chunks
>> major: Low-level I/O
>> minor: Unable to flush data from cache
>> #005: C:/msys64/home/Wrk/vish/fish/fiber/FiberHDF5/hdf5-1.8.17/H5Dchunk.c
>> line 2641 in H5D__chunk_cache_evict(): cannot flush indexed storage buffer
>> major: Low-level I/O
>> minor: Write failed
>> #006: C:/msys64/home/Wrk/vish/fish/fiber/FiberHDF5/hdf5-1.8.17/H5Dchunk.c
>> line 2528 in H5D__chunk_flush_entry(): output pipeline failed
>> major: Data filters
>> minor: Filter operation failed
>> #007: C:/msys64/home/Wrk/vish/fish/fiber/FiberHDF5/hdf5-1.8.17/H5Z.c line
>> 1412 in H5Z_pipeline(): filter returned failure
>> major: Data filters
>> minor: Write failed
>> #008: C:/msys64/home/Wrk/vish/fish/fiber/FiberHDF5/h5zblosc/blosc_filter.c
>> line 190 in blosc_filter(): this Blosc library does not have support for the
>> 'zstd' compressor, but only for: blosclz,lz4,lz4hc,zlib
>> major: Data filters
>> minor: Callback failed
>> #009: C:/msys64/home/Wrk/vish/fish/fiber/FiberHDF5/hdf5-1.8.17/H5Dint.c
>> line 1507 in H5D_close(): unable to flush cached dataset info
>> major: Dataset
>> minor: Write failed
>> #010: C:/msys64/home/Wrk/vish/fish/fiber/FiberHDF5/hdf5-1.8.17/H5Dint.c
>> line 2438 in H5D__flush_real(): unable to flush raw data
>> major: Dataset
>> minor: Unable to flush data from cache
>> #011: C:/msys64/home/Wrk/vish/fish/fiber/FiberHDF5/hdf5-1.8.17/H5Dchunk.c
>> line 2112 in H5D__chunk_flush(): unable to flush one or more raw data chunks
>> major: Dataset
>> minor: Unable to flush data from cache
>> #012: C:/msys64/home/Wrk/vish/fish/fiber/FiberHDF5/hdf5-1.8.17/H5Dchunk.c
>> line 2528 in H5D__chunk_flush_entry(): output pipeline failed
>> major: Data filters
>> minor: Filter operation failed
>> #013: C:/msys64/home/Wrk/vish/fish/fiber/FiberHDF5/hdf5-1.8.17/H5Z.c line
>> 1412 in H5Z_pipeline(): filter returned failure
>> major: Data filters
>> minor: Write failed
>> #014: C:/msys64/home/Wrk/vish/fish/fiber/FiberHDF5/h5zblosc/blosc_filter.c
>> line 190 in blosc_filter(): this Blosc library does not have support for the
>> 'zstd' compressor, but only for: blosclz,lz4,lz4hc,zlib
>> major: Data filters
>> minor: Callback failed
>> h5repack error: <myRGBA.f5>: Could not copy data to: myRGBA-out.f5
>>
>> But when using LZ4HC as the codec it works fine:
>>
>> Wrk $ h5repack -f UD=32001,8,0,0,0,0,0,9,1,2 myRGBA.f5 myRGBA-out.f5
>>
>> Wrk $ ll
>>
>> -rw-r--r-- 1 Wrk None 12M Jan 25 16:43 myRGBA.f5
>>
>> -rw-r--r-- 1 Wrk None 5.0M Jun 28 22:18 myRGBA-out.f5
>>
>> I'm using 1.8.17, evidently.
>>
>> Btw, the restriction to the lib prefix has the advantage that the HDF5
>> plugin loader does not look at DLLs such as "blosc.dll"; it would not even
>> try to load them as plugins. Something better than "lib", independent of
>> platform and compiler, would probably be preferable, though. I guess your
>> version even tries to load blosc.dll itself as a plugin, which could lead
>> to issues if such a DLL contained startup code like global C++ constructors
>> or code in DllMain that does "something". That's unlikely to be related to
>> your problem, though; I don't see from the output what the problem could
>> be, and it works for me.
>>
>> Werner
>>
>>
>> On 28.06.2017 19:01, Francesc Alted wrote:
>>
>> Werner, I think the lib- prefix does not apply to my case because I am
>> using VS2015 to compile the libraries. But agreed that HDF5 using
>> different conventions depending on whether the build is MSVC- or GCC-based
>> is a bit confusing.
>>
>> BTW, although h5dump can read Blosc files with the plugin, h5repack
>> cannot create them on Windows:
>>
>> C:\Users\faltet>h5repack -f UD=32001,8,0,0,0,0,0,9,1,5 carray1-blosc.h5
>> carray1-repack.h5
>> HDF5-DIAG: Error detected in HDF5 (1.8.19) thread 0:
>> #000:
>> C:\Users\buildbot\Buildbot\hdf518-StdRelease-code-vs14\build\hdfsrc\src\H5Pocpl.c
>> line 1100 in H5Pget_filter_by_id2(): can't find object for ID
>> major: Object atom
>> minor: Unable to find atom information (already closed?)
>> #001:
>> C:\Users\buildbot\Buildbot\hdf518-StdRelease-code-vs14\build\hdfsrc\src\H5Pint.c
>> line 3379 in H5P_object_verify(): property list is not a member of the class
>> major: Property lists
>> minor: Unable to register new atom
>> #002:
>> C:\Users\buildbot\Buildbot\hdf518-StdRelease-code-vs14\build\hdfsrc\src\H5Pint.c
>> line 3329 in H5P_isa_class(): not a property list
>> major: Invalid arguments to routine
>> minor: Inappropriate type
>> HDF5-DIAG: Error detected in HDF5 (1.8.19) thread 0:
>> #000:
>> C:\Users\buildbot\Buildbot\hdf518-StdRelease-code-vs14\build\hdfsrc\src\H5Pocpl.c
>> line 1100 in H5Pget_filter_by_id2(): can't find object for ID
>> major: Object atom
>> minor: Unable to find atom information (already closed?)
>> #001:
>> C:\Users\buildbot\Buildbot\hdf518-StdRelease-code-vs14\build\hdfsrc\src\H5Pint.c
>> line 3379 in H5P_object_verify(): property list is not a member of the class
>> major: Property lists
>> minor: Unable to register new atom
>> #002:
>> C:\Users\buildbot\Buildbot\hdf518-StdRelease-code-vs14\build\hdfsrc\src\H5Pint.c
>> line 3329 in H5P_isa_class(): not a property list
>> major: Invalid arguments to routine
>> minor: Inappropriate type
>> h5repack error: <carray1-blosc.h5>: Could not copy data to:
>> carray1-repack.h5
>>
>> I have tested with both 1.10.1 and 1.8.19 (above).
>>
>> It works perfectly well on a Mac box, though:
>>
>> Francescs-MacBook-Pro:PyTables faltet$ h5repack -f
>> UD=32001,8,0,0,0,0,0,9,1,5 carray1-blosc.h5 carray1-repack.h5
>> Francescs-MacBook-Pro:PyTables faltet$ h5ls -v carray1-repack.h5
>> Opened "carray1-repack.h5" with sec2 driver.
>> carray Dataset {200/218, 300/300}
>> Location: 1:800
>> Links: 1
>> Chunks: {218, 300} 65400 bytes
>> Storage: 60000 logical bytes, 58 allocated bytes, 103448.28%
>> utilization
>> Filter-0: blosc-32001 {2, 2, 1, 65400, 9, 1, 5, 5}
>> Type: native unsigned char
>>
>> Any hints?
>>
>> 2017-06-28 15:15 GMT+02:00 Werner Benger <[email protected]>:
>>
>>> Hi Francesc,
>>>
>>> it took me a while to figure out this detail too; I had blosc.dll in
>>> the PATH at first, and it seems to work under some circumstances but not
>>> others... it's not entirely clear to me when it needs to be in the PATH or
>>> in the calling DLL's directory outside of the PATH.
>>>
>>> There's some code in H5PL.c that checks for a lib prefix:
>>>
>>> #ifndef __CYGWIN__
>>> if(!HDstrncmp(dp->d_name, "lib", (size_t)3) &&
>>> (HDstrstr(dp->d_name, ".so") || HDstrstr(dp->d_name,
>>> ".dylib"))) {
>>> #else
>>> ...
>>>
>>> #endif
>>>
>>> So I guess in your case that prefix check did not apply, whereas for me,
>>> using gcc under Windows, it did. Thus it seems best for compatibility to
>>> keep the plugins named with a lib prefix under Windows too; once compiled,
>>> it should not matter whether the C code was produced by GCC or MSVC.
>>> Otherwise yes, the name does not matter indeed.
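The prefix check quoted above can be mimicked in a few lines (a sketch only; the real H5PL.c logic differs between HDF5 versions, and a Windows/MSVC build scans for *.dll instead):

```python
# Sketch of the plugin-filename check from H5PL.c quoted above: on
# POSIX-style builds only lib*.so / lib*.dylib files are considered
# plugin candidates; a Windows/MSVC build instead scans for *.dll.
# Exact behavior varies between HDF5 versions.

def looks_like_hdf5_plugin(filename, windows=False):
    if windows:
        return filename.lower().endswith(".dll")
    return filename.startswith("lib") and (".so" in filename
                                           or filename.endswith(".dylib"))

print(looks_like_hdf5_plugin("libH5Zblosc.so"))           # -> True
print(looks_like_hdf5_plugin("blosc.dll"))                # -> False (POSIX rule)
print(looks_like_hdf5_plugin("blosc.dll", windows=True))  # -> True
```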
>>>
>>> Werner
>>>
>>> On 28.06.2017 15:00, Francesc Alted wrote:
>>>
>>> Hi Werner,
>>>
>>> Right, what I was missing was basically moving the blosc.dll library
>>> into the HDF5 plugin directory. I thought that putting blosc.dll in a
>>> directory in the %PATH% would be enough, but apparently this is not the
>>> case (I suppose this is some Windows trickery).
>>>
>>> For what it's worth, the name of the plugin DLL is not that important, as
>>> I have tried libHDF5blosc.dll, libH5Zblosc.dll, H5Zblosc.dll and even
>>> blosc_plugin.dll, and all of them work. Apparently the HDF5 library tries
>>> all the DLLs in the plugin directory and uses the one that registers the
>>> necessary filter ID; pretty smart.
>>>
>>> Thanks so much!
>>>
>>> 2017-06-28 14:31 GMT+02:00 Werner Benger <[email protected]>:
>>>
>>>> Just to add:
>>>>
>>>> under Windows, the plugins need to start with lib as a prefix. It's
>>>> technically not required, but the HDF5 plugin loader expects it. So the
>>>> liblz4.dll in my plugins dir is the LZ4 plugin and libHDF5blosc.dll the
>>>> Blosc filter; blosc.dll is used by the Blosc filter and needs to be in
>>>> the same directory as well if compiled as a dynamic library.
>>>>
>>>> Werner
>>>>
>>>> On 28.06.2017 14:24, Werner Benger wrote:
>>>>
>>>> Hi,
>>>>
>>>> I am using HDF5 with Blosc dynamic plugin loading regularly under
>>>> Windows, and these are the files needed in the plugin directory that
>>>> HDF5 looks for:
>>>>
>>>> HDF5Plugins $ ll
>>>> total 1.2M
>>>> -rwxr-xr-x 1 Wrk None 382K Jun 26 23:30 blosc.dll*
>>>> -rwxr-xr-x 1 Wrk None 330K Jun 26 23:30 libHDF5blosc.dll*
>>>> -rwxr-xr-x 1 Wrk None 410K Jun 26 23:30 liblz4.dll*
>>>>
>>>> The blosc.dll is the blosc library, libHDF5blosc.dll is the HDF5 plugin
>>>> using that library.
>>>>
>>>> I'm using Mingw64, which is gcc 6.3 under MSYS2 .
>>>>
>>>> Werner
>>>>
>>>> On 28.06.2017 13:44, Francesc Alted wrote:
>>>>
>>>> Hi,
>>>>
>>>> I have been trying to get HDF5 on Windows to dynamically load the
>>>> Blosc plugin (https://github.com/Blosc/hdf5-blosc), but with no
>>>> success so far.
>>>>
>>>> I basically have compiled the plugin with:
>>>>
>>>> > cl /I"C:\Program Files (x86)\blosc\include"
>>>> /I"C:\HDF_Group\HDF5\1.8.12\include" libH5Zblosc.c blosc_filter.c
>>>> /OUT:libH5Zblosc.dll /LD /link /LIBPATH:"C:\Program Files (x86)\blosc\lib"
>>>> /LIBPATH:"C:\HDF_Group\HDF5\1.8.12\lib" blosc.lib hdf5.lib
>>>>
>>>> and then copied the libH5Zblosc.dll to
>>>> "%ALLUSERSPROFILE%/hdf5/lib/plugin",
>>>> but when I try to use h5dump to get the data of an HDF5 file with some
>>>> datasets compressed with Blosc, I am getting:
>>>>
>>>> >h5dump \tmp\carray1-blosc.h5
>>>> HDF5 "\tmp\carray1-blosc.h5" {
>>>> GROUP "/" {
>>>> DATASET "carray" {
>>>> DATATYPE H5T_STD_U8LE
>>>> DATASPACE SIMPLE { ( 200, 300 ) / ( 218, 300 ) }
>>>> DATA {h5dump error: unable to print data
>>>>
>>>> }
>>>> }
>>>> }
>>>> }
>>>>
>>>> When asking for more errors, I am getting a more informative message:
>>>>
>>>> >h5dump --enable-error-stack \tmp\carray1-blosc.h5
>>>> HDF5 "\tmp\carray1-blosc.h5" {
>>>> GROUP "/" {
>>>> DATASET "carray" {
>>>> DATATYPE H5T_STD_U8LE
>>>> DATASPACE SIMPLE { ( 200, 300 ) / ( 218, 300 ) }
>>>> DATA {HDF5-DIAG: Error detected in HDF5 (1.10.1) thread 0:
>>>> #000: C:\autotest\hdf5110-StdRelease-code-10vs14\build\hdfsrc\src\H5Dio.c
>>>> line 171 in H5Dread(): can't read data
>>>> major: Dataset
>>>> minor: Read failed
>>>> #001: C:\autotest\hdf5110-StdRelease-code-10vs14\build\hdfsrc\src\H5Dio.c
>>>> line 544 in H5D__read(): can't read data
>>>> major: Dataset
>>>> minor: Read failed
>>>> #002:
>>>> C:\autotest\hdf5110-StdRelease-code-10vs14\build\hdfsrc\src\H5Dchunk.c
>>>> line 2050 in H5D__chunk_read(): unable to read raw data chunk
>>>> major: Low-level I/O
>>>> minor: Read failed
>>>> #003:
>>>> C:\autotest\hdf5110-StdRelease-code-10vs14\build\hdfsrc\src\H5Dchunk.c
>>>> line 3405 in H5D__chunk_lock(): data pipeline read failed
>>>> major: Data filters
>>>> minor: Filter operation failed
>>>> #004: C:\autotest\hdf5110-StdRelease-code-10vs14\build\hdfsrc\src\H5Z.c
>>>> line 1349 in H5Z_pipeline(): required filter 'blosc' is not registered
>>>> major: Data filters
>>>> minor: Read failed
>>>> #005: C:\autotest\hdf5110-StdRelease-code-10vs14\build\hdfsrc\src\H5PL.c
>>>> line 389 in H5PL_load(): search in paths failed
>>>> major: Plugin for dynamically loaded library
>>>> minor: Can't get value
>>>> #006: C:\autotest\hdf5110-StdRelease-code-10vs14\build\hdfsrc\src\H5PL.c
>>>> line 812 in H5PL__find(): can't open directory
>>>> major: Plugin for dynamically loaded library
>>>> minor: Can't open directory or file
>>>> h5dump error: unable to print data
>>>>
>>>> }
>>>> }
>>>> }
>>>> }
>>>> H5tools-DIAG: Error detected in HDF5:tools (1.10.1) thread 0:
>>>> #000:
>>>> C:\autotest\hdf5110-StdRelease-code-10vs14\build\hdfsrc\tools\lib\h5tools_dump.c
>>>> line 1639 in h5tools_dump_simple_dset(): H5Dread failed
>>>> major: Failure in tools library
>>>> minor: error in function
>>>>
>>>> I suppose the meaningful part is here:
>>>>
>>>> 10vs14\build\hdfsrc\src\H5PL.c line 812 in H5PL__find(): can't open
>>>> directory
>>>>
>>>> but I have no clue about 1) which directory the dynamic plugin machinery
>>>> in HDF5 is looking at and 2) which name the DLL should have (as per the
>>>> manual, I am using libH5Zblosc.dll, but I have also tried H5Zblosc.dll
>>>> and blosc.dll with no success).
>>>>
>>>> I have also tried to use the HDF5_PLUGIN_PATH environment variable to
>>>> direct the plugin machinery in HDF5 to look into the specific
>>>> %ALLUSERSPROFILE%/hdf5/lib/plugin place, but no luck.
>>>>
>>>> I tried with HDF5 1.8.12 and 1.10.1, with the same results. Finally, if
>>>> it is of any help, I am using Windows 10 64-bit.
>>>>
>>>> As a suggestion, it would help in debugging situations like this if the
>>>> HDF5 backtrace of the plugin machinery provided info about 1) which
>>>> directory it is looking in to find the HDF5 plugin and 2) the name of
>>>> the DLL it is trying to load.
>>>>
>>>> Thanks in advance,
>>>>
>>>> --
>>>> Francesc Alted
>>>>
>>>>
>>>> _______________________________________________
>>>> Hdf-forum is for HDF software users discussion.
>>>> [email protected]
>>>> http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
>>>> Twitter: https://twitter.com/hdf5
>>>>
>>>>
>>>> --
>>>> ___________________________________________________________________________
>>>> Dr. Werner Benger Visualization Research
>>>> Center for Computation & Technology at Louisiana State University (CCT/LSU)
>>>> 2019 Digital Media Center, Baton Rouge, Louisiana 70803
>>>> Tel.: +1 225 578 4809    Fax.: +1 225 578 5362
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>>
>>> --
>>> Francesc Alted
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>
>>
>>
>> --
>> Francesc Alted
>>
>>
>>
>>
>>
>>
>>
>
>
>
> --
> Francesc Alted
>
>
>
>
>
>
>
>
>
>
>
>
--
Francesc Alted