Hi Mark,
Thanks, and sorry for the confusion. The problem is that when I try to read
var_a, which is stored as H5T_STD_I64BE (not H5T_STD_U64BE as I stated
earlier), I get numbers that do not match the h5dump output. However, I am
able to read var_b (H5T_STD_U16BE) correctly, matching h5dump, by using
H5T_NATIVE_INTEGER in the h5dread_f call. Both variables (var_a and var_b)
are read into arrays declared as INTEGER in my Fortran program, compiled on
a big-endian machine. If I declare var_a as INTEGER*8 I get:

Error: There is no specific subroutine for the generic 'h5dread_f'
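
A hedged, untested sketch of one possible workaround, assuming an HDF5 build
of 1.8.8 or later with the Fortran 2003 interface enabled (which adds an
h5dread_f variant that takes a TYPE(C_PTR) buffer, plus h5kind_to_type); the
classic interface in 1.8.7 appears to provide specific h5dread_f routines
only for default INTEGER, which would explain the "no specific subroutine"
error with INTEGER*8. Variable names follow the messages below:

          USE HDF5
          USE ISO_C_BINDING
          ...
          INTEGER(KIND=8), DIMENSION(96,12), TARGET :: var_a  ! 64-bit buffer
          INTEGER(HID_T) :: mem_type
          TYPE(C_PTR)    :: f_ptr
          ...
          ! memory datatype matching the kind-8 buffer (F2003 interface only)
          mem_type = h5kind_to_type(8, H5_INTEGER_KIND)
          f_ptr = C_LOC(var_a(1,1))
          CALL h5dread_f(dset_id, mem_type, f_ptr, error)

Note also that BeamTime values such as 1816819231684734 do not fit in a
32-bit default INTEGER, so reading them into a 4-byte buffer would by itself
produce numbers that differ from the h5dump output.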
Gompie


On Wed, Aug 5, 2015 at 12:24 PM, Miller, Mark C. <[email protected]> wrote:

> Have a look at these type definitions. . .
>
> https://www.hdfgroup.org/HDF5/doc/RM/PredefDTypes.html
>
> and this description of type conversion, in particular section 6.9
>
> https://www.hdfgroup.org/HDF5/doc/UG/UG_frame11Datatypes.html#Dtransfer
>
> I suspect the problem may have to do with signed vs. UNsigned types here.
>
> Also, regarding array dimension ordering, see this. . .
>
> https://www.hdfgroup.org/HDF5/doc/fortran/index.html#FortranUserNotes
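>
> As a small illustration of what those notes describe (my own sketch,
> assuming the usual C-versus-Fortran ordering), a dataspace that h5dump
> reports as ( 12, 96 ) is declared with the dimensions reversed in Fortran:
>
>           INTEGER, DIMENSION(96,12) :: data_out
>           INTEGER(HSIZE_T), DIMENSION(2) :: data_dims
>           data_dims(1) = 96
>           data_dims(2) = 12
>
> So a shape reported as (12,22,26) by h5dump corresponds to a (26,22,12)
> array in Fortran.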
>
> *Mark*
>
> From: Hdf-forum <[email protected]> on behalf of Richa
> Mathur <[email protected]>
> Reply-To: HDF Users Discussion List <[email protected]>
> Date: Tuesday, August 4, 2015 7:43 PM
>
> To: HDF Users Discussion List <[email protected]>
> Subject: Re: [Hdf-forum] h5dopen returns error
>
> Thanks again Mark, I no longer get a core dump.
>    For H5T_STD_U16BE I am able to read correctly, and the values I read
> match the h5dump output. However, for H5T_STD_U64BE, when I read using
> H5T_NATIVE_INTEGER my output does not match h5dump.
> I am using hdf5-1.8.7. If I use hdf5-1.8.13 it conflicts with the
> netCDF/gfortran reader that is also part of my code.
>
> Is the array shape (12,22,26) reported by h5dump the same as in Fortran,
> or does it get inverted to (26,22,12)?
> Gompie
>
> On Tue, Aug 4, 2015 at 9:58 PM, Miller, Mark C. <[email protected]> wrote:
>
>> HDF5 will handle the 'conversion' from whatever type is used to store the
>> data in the file to whatever type you are using to store the data in
>> memory. So, for the mem_type arg to h5dread_f, all you need to do is tell
>> HDF5 the datatype of the buffer (data_out) you are reading into. You have
>> declared data_out as INTEGER in your program, so I suspect the right thing
>> to do is. . .
>>
>>           CALL h5dread_f(dset_id,H5T_NATIVE_INTEGER, data_out, data_dims,
>> error)
>>
>> That tells HDF5 that data_out is the caller's *native* integer type and
>> it will convert from whatever is in the file, H5T_STD_I64BE (a 64-bit
>> big-endian integer), to that type automatically when it reads it.
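>>
>> If it helps, a hedged sketch for confirming what is actually stored in the
>> file (only h5dget_type_f / h5tget_size_f / h5tclose_f are used; names are
>> carried over from your snippet):
>>
>>           INTEGER(HID_T)  :: dtype_id
>>           INTEGER(SIZE_T) :: type_size
>>           ...
>>           CALL h5dget_type_f(dset_id, dtype_id, error)    ! datatype as stored in the file
>>           CALL h5tget_size_f(dtype_id, type_size, error)  ! size in bytes (8 for H5T_STD_I64BE)
>>           print *, "file datatype size (bytes):", type_size
>>           CALL h5tclose_f(dtype_id, error)
>>
>> That makes it easy to check whether the in-memory integer is wide enough
>> for the stored values.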
>>
>> Mark
>>
>>
>>
>>
>> From: Hdf-forum <[email protected]> on behalf of
>> Richa Mathur <[email protected]>
>> Reply-To: HDF Users Discussion List <[email protected]>
>> Date: Tuesday, August 4, 2015 6:43 PM
>>
>> To: HDF Users Discussion List <[email protected]>
>> Subject: Re: [Hdf-forum] h5dopen returns error
>>
>> Hi Marc,
>> Thanks again. That link is helpful. I was able to read the data directly
>> using the full path in h5dread_f. I use
>>
>>           INTEGER, DIMENSION(12,96) :: data_out
>>           data_dims(1) = 12
>>           data_dims(2) = 96
>>           CALL h5dread_f(dset_id, H5T_STD_I64BE, data_out, data_dims, error)
>>
>> since the h5dump indicates the DATATYPE as H5T_STD_I64BE.
>>
>> But it does not read properly (it gives a core dump). I compile with
>> gfortran on an x86_64 machine.
>>
>> GROUP "All_Data" {
>>
>>       GROUP "AMS-SDR_All" {
>>>>          DATASET "BeamTime" {
>>>>             DATATYPE  H5T_STD_I64BE
>>>>
>>>> Gompie
>>
>>
>> On Tue, Aug 4, 2015 at 4:24 PM, Miller, Mark C. <[email protected]>
>> wrote:
>>
>>> Hmm. At this point, I think it may be more productive for everyone if
>>> you would review the manual,
>>> https://www.hdfgroup.org/HDF5/doc/RM/RM_H5Front.html, a bit more to be
>>> sure you are attempting to use the HDF5 interface correctly.
>>>
>>> In particular, your 2nd call to h5gopen_f is still specifying 'file_id'
>>> for the first arg when it should be specifying 'grp_id1'. AMS-SDR_All is a group
>>> that 'lives in' the "All_Data" group (grp_id1), not the root group
>>> (file_id).
>>>
>>> Next, I am pretty certain you can just directly h5dopen_f
>>> "All_Data/AMS-SDR_All/BeamTime" and avoid all the business with the groups.
>>>
>>> Mark
>>>
>>> From: Hdf-forum <[email protected]> on behalf of
>>> Richa Mathur <[email protected]>
>>> Reply-To: HDF Users Discussion List <[email protected]>
>>> Date: Tuesday, August 4, 2015 1:06 PM
>>>
>>> To: HDF Users Discussion List <[email protected]>
>>> Subject: Re: [Hdf-forum] h5dopen returns error
>>>
>>> Thanks Mark, Christian. The problem was case sensitivity; the first-level
>>> group All_Data now opens without any error.
>>> Since the path to BeamTime is All_Data/AMS-SDR_All/BeamTime, I gave
>>>
>>>  CALL h5gopen_f(file_id,"All_Data" , grp_id1, grp_hdferr)
>>> print *,"Open Group ID ID ERROR",grp_id1, grp_hdferr
>>>
>>>   CALL h5gopen_f(file_id,"AMS-SDR_All" , grp_id2, grp_hdferr) [should I
>>> give the grp_id1 instead of file_id ?]
>>> print *,"Open Group AMS-SDR ID ERROR",grp_id2, grp_hdferr ( here the
>>> hdferr is -1)
>>>
>>>   CALL h5dopen_f(grp_id1,"BeamTime", dset_id, error)
>>>
>>> Please help
>>>
>>> On Tue, Aug 4, 2015 at 3:53 PM, Christian Oyarzun <
>>> [email protected]> wrote:
>>>
>>>> It is case sensitive. It should be:
>>>>
>>>> CALL h5gopen_f(file_id,"All_Data" , grp_id1, grp_hdferr)
>>>>
>>>> not:
>>>>
>>>> CALL h5gopen_f(file_id,"All_data" , grp_id1, grp_hdferr)
>>>>
>>>> —Christian
>>>>
>>>> On Aug 4, 2015, at 3:45 PM, Richa Mathur <[email protected]>
>>>> wrote:
>>>>
>>>> Thanks Mark and Corey.
>>>>
>>>>    After CALL h5fopen_f(filename, H5F_ACC_RDONLY_F, file_id, error),
>>>> I gave CALL h5gopen_f(file_id, "All_data", grp_id1, grp_hdferr) as the
>>>> first call to open the group and got grp_hdferr as -1.
>>>> I guess there is an error opening the group All_data.
>>>>
>>>>
>>>> The h5dump -n output is:
>>>>
>>>> HDF5
>>>> "SAMS_npp_d20150728_t2359556_e0000273_b19431_c20150729063134831043_nasa_ops.h5"
>>>> {
>>>> FILE_CONTENTS {
>>>>  group      /
>>>>  group      /All_Data
>>>>  group      /All_Data/AMS-SDR_All
>>>>  dataset    /All_Data/AMS-SDR_All/BeamTime
>>>>  dataset    /All_Data/AMS-SDR_All/BrightnessTemperature
>>>>  dataset    /All_Data/AMS-SDR_All/BrightnessTemperatureFactors
>>>>
>>>> Any help....
>>>>
>>>>
>>>> On Tue, Aug 4, 2015 at 3:11 PM, Miller, Mark C. <[email protected]>
>>>> wrote:
>>>>
>>>>> Yeah, agreed. "BeamTime" is in group "All_Data". To get to BeamTime,
>>>>> you have to H5Gopen the group and use the hid_t you get from that group in
>>>>> place of 'file_id' arg in h5dopen. Don't forget to close the group (and
>>>>> dataset) after you're done.
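>>>>>
>>>>> A hedged sketch of that sequence, using the group names from your h5dump
>>>>> output (note that BeamTime actually sits two levels down, inside
>>>>> AMS-SDR_All):
>>>>>
>>>>>           CALL h5gopen_f(file_id, "All_Data", grp_id1, error)
>>>>>           CALL h5gopen_f(grp_id1, "AMS-SDR_All", grp_id2, error)  ! relative to grp_id1, not file_id
>>>>>           CALL h5dopen_f(grp_id2, "BeamTime", dset_id, error)
>>>>>           ! ... read ...
>>>>>           CALL h5dclose_f(dset_id, error)
>>>>>           CALL h5gclose_f(grp_id2, error)
>>>>>           CALL h5gclose_f(grp_id1, error)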
>>>>>
>>>>> Mark
>>>>>
>>>>>
>>>>> From: Hdf-forum <[email protected]> on behalf of
>>>>> Richa Mathur <[email protected]>
>>>>> Reply-To: HDF Users Discussion List <[email protected]>
>>>>> Date: Tuesday, August 4, 2015 12:03 PM
>>>>> To: HDF Users Discussion List <[email protected]>
>>>>> Subject: Re: [Hdf-forum] h5dopen returns error
>>>>>
>>>>> Thanks Mark !!!
>>>>> The filename is complete. The file_id returned from h5fopen_f
>>>>> (filename, H5F_ACC_RDONLY_F, file_id, error) is 16777217 and the
>>>>> error code is 0.
>>>>>  How do I find out whether 'BeamTime' is in the root group or a sub-group?
>>>>>
>>>>> The h5dump output is below; I guess BeamTime is inside GROUP
>>>>> "All_Data":
>>>>>
>>>>> GROUP "All_Data" {
>>>>>       GROUP "AMS-SDR_All" {
>>>>>          DATASET "BeamTime" {
>>>>>             DATATYPE  H5T_STD_I64BE
>>>>>             DATASPACE  SIMPLE { ( 12, 96 ) / ( H5S_UNLIMITED,
>>>>> H5S_UNLIMITED ) }
>>>>>             DATA {
>>>>>             (0,0): 1816819231684734, 1816819231702752,
>>>>> 1816819231720770,
>>>>>             (0,3): 1816819231738788, 1816819231756806, 1816819231774824
>>>>>
>>>>> Gompie
>>>>>
>>>>>
>>>>> On Tue, Aug 4, 2015 at 2:52 PM, Miller, Mark C. <[email protected]>
>>>>> wrote:
>>>>>
>>>>>> Are you sure you are getting a valid file_id back from the h5fopen_f
>>>>>> call? Are you by chance missing an extension on the filename, such as
>>>>>> 'satfilename.h5'? And is the dataset 'BeamTime' in the 'root group' of
>>>>>> the file, or perhaps in some sub-group?
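>>>>>>
>>>>>> One simple check (a sketch only, just using the status argument that
>>>>>> h5fopen_f already returns):
>>>>>>
>>>>>>           CALL h5fopen_f(filename, H5F_ACC_RDONLY_F, file_id, error)
>>>>>>           IF (error /= 0) print *, "h5fopen_f failed for ", trim(filename)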
>>>>>>
>>>>>> Mark
>>>>>>
>>>>>>
>>>>>> From: Hdf-forum <[email protected]> on behalf of
>>>>>> Richa Mathur <[email protected]>
>>>>>> Reply-To: HDF Users Discussion List <[email protected]>
>>>>>> Date: Tuesday, August 4, 2015 10:42 AM
>>>>>> To: "[email protected]" <[email protected]>
>>>>>> Subject: [Hdf-forum] h5dopen returns error
>>>>>>
>>>>>>
>>>>>>
>>>>>> Hi
>>>>>> I am trying to read an HDF5 file in Fortran 95 (gfortran). A piece of
>>>>>> the code is:
>>>>>>
>>>>>>  INTEGER(HID_T) :: file_id       ! File identifier
>>>>>>   INTEGER(HID_T) :: dset_id       ! Dataset identifier
>>>>>>
>>>>>>   INTEGER     ::   error ! Error flag
>>>>>>   INTEGER     ::  i, j
>>>>>>
>>>>>>   INTEGER, DIMENSION(96,12) :: dset_data, data_out ! Data buffers
>>>>>>   INTEGER(HSIZE_T), DIMENSION(2) :: data_dims
>>>>>>
>>>>>> filename='satfilename'
>>>>>>
>>>>>>  CALL h5fopen_f(filename, H5F_ACC_RDONLY_F, file_id, error)
>>>>>>
>>>>>>   CALL h5dopen_f(file_id,"BeamTime", dset_id, error)
>>>>>>
>>>>>>
>>>>>> print *,"Data Set(ATMS) ID ERROR",dset_id, error .
>>>>>> The h5dopen_f returns error as  -1.
>>>>>>
>>>>>> Later on when I use
>>>>>>
>>>>>> CALL h5dread_f(dset_id, H5T_STD_I64BE, data_out, data_dims, error)
>>>>>>
>>>>>> error -1 is returned from this function as well, and I get incorrect
>>>>>> values in data_out.
>>>>>>
>>>>>> Can you help?
>>>>>> Gompie
>>>>>>
>>>>>>
>>>>>>
>>>>>> The h5dump of my hdf5 file has
>>>>>>
>>>>>>
>>>>>>  GROUP "All_Data" {
>>>>>>       GROUP "AMS-SDR_All" {
>>>>>>          DATASET "BeamTime" {
>>>>>>             DATATYPE  H5T_STD_I64BE
>>>>>>             DATASPACE  SIMPLE { ( 12, 96 ) / ( H5S_UNLIMITED,
>>>>>> H5S_UNLIMITED ) }
>>>>>>             DATA {
>>>>>>             (0,0): 1816819231684734, 1816819231702752,
>>>>>> 1816819231720770,
>>>>>>             (0,3): 1816819231738788, 1816819231756806,
>>>>>> 1816819231774824
>>>>>>
>>>>>>