Hello,

To Mark,

I haven't checked for resource leakage; actually, I don't know how to do
that. Could you give me a reference or explain how? From your message,
what I understood is that leaving handles dangling while reusing the same
file in a loop can cause that problem — is that right?
I'm sending you my code as attached files:
- lc_helper.f90: contains the subroutines.
- light_cones.f90: the main program that calls the subroutines.
- Makefile
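On the handle question: from the HDF5 reference manual, h5fget_obj_count_f looks like it is meant for exactly this kind of check. Is something like the sketch below what you mean? (Untested; it uses the plain HDF5 Fortran API rather than the sphray wrapper, and the file name is just a placeholder.)

```fortran
! Untested sketch: plain HDF5 Fortran API (module hdf5), not the wrapper,
! so file_id is a real hid_t. File name is a placeholder.
program check_handles
    use hdf5
    implicit none
    integer(hid_t)  :: file_id
    integer(size_t) :: n_open
    integer         :: hdferr

    call h5open_f(hdferr)
    call h5fopen_f('galaxies.hdf5', H5F_ACC_RDWR_F, file_id, hdferr)

    ! ... reads/writes on the file go here ...

    ! Count identifiers still open against this file (file, datasets,
    ! groups, datatypes, attributes). More than 1 just before the close
    ! (the file identifier itself) means something was left dangling.
    call h5fget_obj_count_f(file_id, H5F_OBJ_ALL_F, n_open, hdferr)
    print *, 'open identifiers on this file:', n_open

    call h5fclose_f(file_id, hdferr)
    call h5close_f(hdferr)
end program check_handles
```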

Thank you,


To Scot,

I commented out everything except the open and close statements and did not
get the crash. Before doing that, the code crashed after the 1021st loop
iteration. Do you think the only way to solve this issue is to avoid opening
and closing files in a loop? I am doing it that way because of how the data
is given to me.
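If I understand your suggestion correctly, the rep_*/ivol* directory tree would become a group hierarchy inside one file that is opened once. An untested sketch using the same wrapper calls as in lc_helper.f90 (the single-file name and the group layout are made up):

```fortran
! Untested sketch: one file opened once; the former directory tree
! becomes groups. Output file name and group names are hypothetical.
character(32) :: rep, grp
call hdf5_create_file(ifile, 'all_replications.hdf5')
do i = -n_a, n_a
    do j = -n_a, n_a
        do k = -n_a, n_a
            write(rep,'(A,I0,A,I0,A,I0)') 'rep_', i, '_', j, '_', k
            call hdf5_create_group(ifile, trim(rep))
            do l = 30, nsub-1
                write(grp,'(A,A,I0)') trim(rep), '/ivol', l
                call hdf5_create_group(ifile, trim(grp))
                ! ... read, translate, and write datasets under
                ! trim(grp)//'/xgal_t' etc.; no per-iteration open/close
            enddo
        enddo
    enddo
enddo
call hdf5_close_file(ifile)
```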

Guido

2017-02-28 0:17 GMT+08:00 <[email protected]>:

> Send Hdf-forum mailing list submissions to
>         [email protected]
>
> To subscribe or unsubscribe via the World Wide Web, visit
>         http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
>
> or, via email, send a message with subject or body 'help' to
>         [email protected]
>
> You can reach the person managing the list at
>         [email protected]
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Hdf-forum digest..."
>
>
> Today's Topics:
>
>    1. Re: hdf5 crashing when opening and closing files inside a
>       loop (Koennecke Mark (PSI))
>    2. Re: hdf5 crashing when opening and closing files inside a
>       loop (Scot Breitenfeld)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Mon, 27 Feb 2017 15:39:14 +0000
> From: "Koennecke Mark (PSI)" <[email protected]>
> To: HDF Users Discussion List <[email protected]>
> Subject: Re: [Hdf-forum] hdf5 crashing when opening and closing files
>         inside a loop
> Message-ID: <[email protected]>
> Content-Type: text/plain; charset="utf-8"
>
> Guido,
>
> On 26.02.2017, at 17:11, Guido granda muñoz <[email protected]> wrote:
>
> Hello,
> I am new to hdf5 and recently encountered a problem. I am using Fortran 90
> and a Fortran 90 hdf5 wrapper (https://github.com/galtay/sphray/tree/master/hdf5_wrapper)
> to read and write many hdf5 files. The reading, modifying, and writing of
> data is done inside a do loop. I found that the code runs fine until a
> certain point, when it shows the error message:
>
>  ***Abort HDF5 : Unable to open HDF5 file in open_file()!
>
>  file name  : /mnt/su3ctm/ggranda/cp_test2/rep_-1_-1_-2/ivol31/galaxies.hdf5
>
>
> I have checked, and the file exists, so that is not the problem. I think
> it is something connected with hdf5.
>
>
> The code is the following:
>
>
>     subroutine replications_translation(n_a,nsub,lbox,directory)
>     ! subroutine to do the translations along all the replications
>     ! it makes use of the following:
>     ! n_a: number of replications per axis
>     ! nsub: number of subvolumes
>     ! lbox: box size
>     ! x: x coordinate
>     ! y: y coordinate
>     ! z: z coordinate
>     ! directory: folder that contains the replications
>     ! redshift: character that specifies the redshift (e.g. iz200)
>         integer, intent(in) :: n_a,nsub
>         real, dimension(:),allocatable :: x,y,z,dc,decl,ra
>         real, intent(in) :: lbox
>         character(*),  intent(in) :: directory
>         !character(5),  intent(in) ::redshift
>         character(2) :: temp_i,temp_j,temp_k,temp_subv
>         integer :: i,j,k,l,ifile,dims(1),rank,count_l
>         count_l=0
>         do i=-n_a,n_a
>             write(temp_i,"(I2)") i
>             do j=-n_a,n_a
>                 write(temp_j,"(I2)") j
>                 do k=-n_a,n_a
>                     write(temp_k,"(I2)") k
>                     do l=30,nsub-1
>                         write(temp_subv,"(I2)") l
>                         call hdf5_open_file(ifile,directory//'rep_'//trim(adjustl(temp_i))//'_'//trim(adjustl(temp_j))//'_'//trim(adjustl(temp_k))//'/ivol'//trim(adjustl(temp_subv))//'/galaxies.hdf5',readonly=.false.)
>
>                         call hdf5_get_dimensions(ifile,'Output001/mhalo',rank,dims)
>                         allocate(x(dims(1)),y(dims(1)),z(dims(1)),dc(dims(1)),ra(dims(1)),decl(dims(1)))
>                         call hdf5_read_data(ifile,'Output001/xgal',x)
>                         call hdf5_read_data(ifile,'Output001/ygal',y)
>                         call hdf5_read_data(ifile,'Output001/zgal',z)
>                         x   =x+i*lbox
>                         y   =y+j*lbox
>                         z   =z+k*lbox
>                         dc  =sqrt(x**2.0+y**2.0+z**2.0)
>                         decl=asin(z/dc)
>                         ra  =atan2(y,x)
>
>                         call hdf5_write_data(ifile,'Output001/xgal_t',x,overwrite=.true.)
>                         call hdf5_write_attribute(ifile,'Output001/xgal_t/Comment','X(lightcone) coordinate of this galaxy [Mpc/h]')
>
>                         call hdf5_write_data(ifile,'Output001/ygal_t',y,overwrite=.true.)
>                         call hdf5_write_attribute(ifile,'Output001/ygal_t/Comment','Y(lightcone) coordinate of this galaxy [Mpc/h]')
>
>                         call hdf5_write_data(ifile,'Output001/zgal_t',z,overwrite=.true.)
>                         call hdf5_write_attribute(ifile,'Output001/zgal_t/Comment','Z(lightcone) coordinate of this galaxy [Mpc/h]')
>
>                         call hdf5_write_data(ifile,'Output001/dc',dc,overwrite=.true.)
>                         call hdf5_write_attribute(ifile,'Output001/dc/Comment','Comoving distance [Mpc/h]')
>                         !print *, "check hdf5"
>                         call hdf5_write_data(ifile,'Output001/ra',ra,overwrite=.true.)
>                         call hdf5_write_attribute(ifile,'Output001/ra/Comment',"Right ascension")
>
>                         call hdf5_write_data(ifile,'Output001/decl',decl,overwrite=.true.)
>                         call hdf5_write_attribute(ifile,'Output001/decl/Comment',"Declination")
>
>                         call hdf5_close_file(ifile)
>                         print *, "Done with "//directory//'rep_'//trim(adjustl(temp_i))//'_'//trim(adjustl(temp_j))//'_'//trim(adjustl(temp_k))//'/ivol'//trim(adjustl(temp_subv))//'/galaxies.hdf5'
>                         deallocate(x,y,z,dc,ra,decl)
>                         count_l=count_l+1
>                         print *, "number =",count_l
>                     enddo
>                enddo
>             enddo
>         enddo
>
>
> Could you please help me with that? The number of files that I need to
> open, read, write is tremendous. So, I dont know if that is a limitation
> for hdf5 or if my code is written with not good practices and that is
> causing the crash.
>
> Thanks in advance,
>
>
>
> Have you checked for resource leakage? HDF5 is special in that closing a
> file does not necessarily release all handles associated with it. So, if
> you leave a handle to a dataset, attribute, datatype, etc. dangling for
> each file, the identifier space is eventually exhausted and HDF5 will
> crash on you in a weird way. Been there, done that.
>
> As you have not provided all your subroutines, I cannot tell whether this
> is the case in your code.
>
> Regards,
>
>      Mark Koennecke
>
>
> --
> Guido
> _______________________________________________
> Hdf-forum is for HDF software users discussion.
> [email protected]<mailto:[email protected]>
> http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
> Twitter: https://twitter.com/hdf5
>
>
> ------------------------------
>
> Message: 2
> Date: Mon, 27 Feb 2017 16:16:48 +0000
> From: Scot Breitenfeld <[email protected]>
> To: HDF Users Discussion List <[email protected]>
> Subject: Re: [Hdf-forum] hdf5 crashing when opening and closing files
>         inside a loop
> Message-ID: <[email protected]>
> Content-Type: text/plain; charset="utf-8"
>
> Out of curiosity, do you have the same issue if you comment out everything
> except the open and close routines?
>
> Also, I'm not 100% sure why you chose this file layout/looping scheme. It
> seems you are creating a directory structure for different files/datasets.
> Can you move this directory/file structure inside the HDF5 file using
> groups? That way you can eliminate the file open/close in the middle of
> the nested loop. This is not just an HDF5 issue, either; I don't think you
> would want to do it this way with POSIX writes if you want good I/O
> performance.
>
> Scot
>
>
>
> ------------------------------
>
>
>
> ------------------------------
>
> End of Hdf-forum Digest, Vol 92, Issue 25
> *****************************************
>



-- 
Guido
!@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
!This module supports the light_cones.f90 code. It contains
!subroutines and functions used by that code.
!@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
module lc_helper
use hdf5_wrapper
implicit none
   contains
   subroutine replications(na,direct1,direct2)
   !subroutine to construct the replications used to build the lightcone
   !na: number of replications per axis
   !direct1: directory where all the subvolumes are located
   !direct2: directory where all the replications will be stored

       integer, intent(in) :: na
       character(*), intent(in) :: direct1,direct2
       character(2):: temp_i,temp_j,temp_k
       integer :: i,j,k
       
       do i=-na,na
           write(temp_i,"(I2)") i 
           do j=-na,na
               write(temp_j,"(I2)") j
               do k=-na,na
                   write(temp_k,"(I2)") k
                   print *, "replicating: i= ",i,"j= ",j,"k= ",k
                   call system('mkdir -p '//direct2) 
!                   print *, "direct1 =",direct1
!                   print *, "direct2 =",direct2//'_'//trim(adjustl(temp_i))//'_'//trim(adjustl(temp_j))//'_'//trim(adjustl(temp_k)) 
                   call system('cp -r --no-target-directory '//direct1//' '//direct2//'rep'//'_'//trim(adjustl(temp_i))//'_'//trim(adjustl(temp_j))//'_'//trim(adjustl(temp_k)))
             
               enddo
           enddo    
       enddo
       
    end subroutine replications
    subroutine replications_translation(n_a,nsub,lbox,directory)
    ! subroutine to do the translations along all the replications
    ! it makes use of the following:
    ! n_a: number of replications per axis
    ! nsub: number of subvolumes
    ! lbox: box size
    ! x: x coordinate
    ! y: y coordinate
    ! z: z coordinate
    ! directory: folder that contains the replications
    ! redshift: character that specifies the redshift (e.g. iz200)
        integer, intent(in) :: n_a,nsub
        real, dimension(:),allocatable :: x,y,z,dc,decl,ra
        real, intent(in) :: lbox
        character(*),  intent(in) :: directory
        !character(5),  intent(in) ::redshift
        character(2) :: temp_i,temp_j,temp_k,temp_subv
        integer :: i,j,k,l,ifile,dims(1),rank,count_l
        count_l=0 
        do i=-n_a,n_a
            write(temp_i,"(I2)") i
            do j=-n_a,n_a
                write(temp_j,"(I2)") j
                do k=-n_a,n_a
                    write(temp_k,"(I2)") k
                    do l=30,nsub-1
                   
                        write(temp_subv,"(I2)") l
                        call hdf5_open_file(ifile,directory//'rep_'//trim(adjustl(temp_i))//'_'//trim(adjustl(temp_j))//'_'//trim(adjustl(temp_k))//'/ivol'//trim(adjustl(temp_subv))//'/galaxies.hdf5',readonly=.false.)

                        call hdf5_get_dimensions(ifile,'Output001/mhalo',rank,dims)
                        allocate(x(dims(1)),y(dims(1)),z(dims(1)),dc(dims(1)),ra(dims(1)),decl(dims(1)))
                        call hdf5_read_data(ifile,'Output001/xgal',x)
                        call hdf5_read_data(ifile,'Output001/ygal',y)
                        call hdf5_read_data(ifile,'Output001/zgal',z)
                        x   =x+i*lbox
                        y   =y+j*lbox
                        z   =z+k*lbox
                        dc  =sqrt(x**2.0+y**2.0+z**2.0)
                        decl=asin(z/dc)
                        ra  =atan2(y,x)

                        call hdf5_write_data(ifile,'Output001/xgal_t',x,overwrite=.true.)
                        call hdf5_write_attribute(ifile,'Output001/xgal_t/Comment','X(lightcone) coordinate of this galaxy [Mpc/h]')
                        
                        call hdf5_write_data(ifile,'Output001/ygal_t',y,overwrite=.true.)
                        call hdf5_write_attribute(ifile,'Output001/ygal_t/Comment','Y(lightcone) coordinate of this galaxy [Mpc/h]')
 
                        call hdf5_write_data(ifile,'Output001/zgal_t',z,overwrite=.true.)
                        call hdf5_write_attribute(ifile,'Output001/zgal_t/Comment','Z(lightcone) coordinate of this galaxy [Mpc/h]')
 
                        call hdf5_write_data(ifile,'Output001/dc',dc,overwrite=.true.)
                        call hdf5_write_attribute(ifile,'Output001/dc/Comment','Comoving distance  [Mpc/h]')
                        !print *, "check hdf5"
                        call hdf5_write_data(ifile,'Output001/ra',ra,overwrite=.true.)
                        call hdf5_write_attribute(ifile,'Output001/ra/Comment',"Right ascension")
 
                        call hdf5_write_data(ifile,'Output001/decl',decl,overwrite=.true.)
                        call hdf5_write_attribute(ifile,'Output001/decl/Comment',"Declination")                        

                        call hdf5_close_file(ifile)
                        print *, "Done with "//directory//'rep_'//trim(adjustl(temp_i))//'_'//trim(adjustl(temp_j))//'_'//trim(adjustl(temp_k))//'/ivol'//trim(adjustl(temp_subv))//'/galaxies.hdf5'
                        deallocate(x,y,z,dc,ra,decl)
                        count_l=count_l+1
                        print *, "number =",count_l
                    enddo
               enddo
            enddo       
        enddo
    end subroutine replications_translation   
    subroutine light_cone_select1(n_a,nsub,ra1,ra2,decl1,decl2,dc1,dc2,direct_repl,direct_out)
    ! subroutine to select all the galaxies inside the light cone
    ! n_a: number of replications per axis
    ! nsub: number of subvolumes
    ! ra1: right ascension lower limit
    ! ra2: right ascension upper limit
    ! decl1: declination lower limit
    ! decl2: declination upper limit
    ! dc1: comoving distance lower limit
    ! dc2: comoving distance upper limit
    ! direct_repl: directory of the replications
    ! direct_out: path of the output lightcone file

        integer, intent(in) :: n_a,nsub
        real, intent(in) :: ra1,ra2,decl1,decl2,dc1,dc2
        integer :: i,j,k,l,light_handle,fcount,dims(1),rank,ifile,ilight,fcount_c,count_l
        real, dimension(:),allocatable :: dc,ra,decl,mhalo,mcold,mcold_atom,mcold_mol,mstars_disk,mstars_bulge,halo_r_virial,rdisk,rbulge,xgal,ygal,zgal,mcold_burst,strc
        real, dimension(:),allocatable :: dc_f,ra_f,decl_f,mhalo_f,mcold_f,mcold_atom_f,mcold_mol_f,mstars_disk_f,mstars_bulge_f,halo_r_virial_f,rdisk_f,rbulge_f,xgal_f,ygal_f,zgal_f,mcold_burst_f,strc_f
        logical, dimension(:), allocatable :: mask_ra,mask_decl,mask_dc,mask_f
        integer, dimension(:), allocatable :: gal_type,gal_type_f
        character(*),  intent(in) :: direct_repl,direct_out
        character(2) :: temp_i,temp_j,temp_k,temp_subv

        call hdf5_create_file(light_handle,direct_out)
        call hdf5_create_group(light_handle,"light_cone")
        call hdf5_create_group(light_handle,"light_cone/Output")
        count_l=0
        do i=-n_a,n_a
            write(temp_i,"(I2)") i
            do j=-n_a,n_a
                write(temp_j,"(I2)") j
                do k=-n_a,n_a
                    write(temp_k,"(I2)") k
                    fcount_c=1
                    do l=40,nsub-1
                        write(temp_subv,"(I2)") l
                        call hdf5_open_file(ifile,direct_repl//'rep_'//trim(adjustl(temp_i))//'_'//trim(adjustl(temp_j))//'_'//trim(adjustl(temp_k))//'/ivol'//trim(adjustl(temp_subv))//'/galaxies.hdf5',.false.)

                        call hdf5_get_dimensions(ifile,'Output001/mhalo',rank,dims)
                        allocate(mhalo(dims(1)),mcold(dims(1)),mcold_atom(dims(1)),mcold_mol(dims(1)),mstars_disk(dims(1)),mstars_bulge(dims(1)),halo_r_virial(dims(1)),rdisk(dims(1)),rbulge(dims(1)),xgal(dims(1)),ygal(dims(1)),zgal(dims(1)),dc(dims(1)),ra(dims(1)),decl(dims(1)),gal_type(dims(1)),mcold_burst(dims(1)),strc(dims(1)))

                        allocate(mask_ra(dims(1)),mask_decl(dims(1)),mask_dc(dims(1)),mask_f(dims(1)))
                        call hdf5_read_data(ifile,'Output001/dc',dc)
                        print *, "C.L ctm!"
                        call hdf5_read_data(ifile,'Output001/ra',ra)
                        call hdf5_read_data(ifile,'Output001/decl',decl)
                        !print *, "ra =",ra
                        mask_ra  =(ra1 <=ra ) .and. (ra<= ra2)
                        mask_decl=(decl1 <= decl) .and. (decl <= decl2)
                        mask_dc  =(dc1 <= dc ) .and. (dc <= dc2)
                        mask_f   =(mask_ra .and. mask_decl .and. mask_dc )
                        fcount   =int(count(mask_f))
                        call hdf5_read_data(ifile,'Output001/mhalo',mhalo)
                        call hdf5_read_data(ifile,'Output001/mcold',mcold)
                        call hdf5_read_data(ifile,'Output001/mcold_atom',mcold_atom)
                        call hdf5_read_data(ifile,'Output001/mcold_mol',mcold_mol)
                        call hdf5_read_data(ifile,'Output001/mstars_disk',mstars_disk)
                        call hdf5_read_data(ifile,'Output001/mstars_bulge',mstars_bulge)
                        call hdf5_read_data(ifile,'Output001/halo_r_virial',halo_r_virial)
                        call hdf5_read_data(ifile,'Output001/rdisk',rdisk)
                        call hdf5_read_data(ifile,'Output001/rbulge',rbulge)
                        call hdf5_read_data(ifile,'Output001/xgal_t',xgal)
                        call hdf5_read_data(ifile,'Output001/ygal_t',ygal)
                        call hdf5_read_data(ifile,'Output001/zgal_t',zgal)
                        call hdf5_read_data(ifile,'Output001/type',gal_type)
                        call hdf5_read_data(ifile,'Output001/mcold_burst',mcold_burst)
                        call hdf5_read_data(ifile,'Output001/strc',strc)
                        !call hdf5_read_data(ifile,'Output001/dc',dc)                      
                        !call hdf5_read_data(ifile,'Output001/ra',ra)
                        !call hdf5_read_data(ifile,'Output001/decl',decl)                      
 
                        allocate(mhalo_f(fcount),mcold_f(fcount),mcold_atom_f(fcount),mcold_mol_f(fcount),mstars_disk_f(fcount),mstars_bulge_f(fcount),halo_r_virial_f(fcount),rdisk_f(fcount),rbulge_f(fcount),xgal_f(fcount),ygal_f(fcount),zgal_f(fcount),dc_f(fcount),ra_f(fcount),decl_f(fcount),gal_type_f(fcount),mcold_burst_f(fcount),strc_f(fcount))
                        call hdf5_close_file(ifile)
                        mhalo_f        =pack(mhalo,mask_f)        
                        mcold_f        =pack(mcold,mask_f)       
                        mcold_atom_f   =pack(mcold_atom,mask_f)   
                        mcold_mol_f    =pack(mcold_mol,mask_f)    
                        mstars_disk_f  =pack(mstars_disk,mask_f)  
                        mstars_bulge_f =pack(mstars_bulge,mask_f) 
                        halo_r_virial_f=pack(halo_r_virial,mask_f)
                        rdisk_f        =pack(rdisk,mask_f)        
                        rbulge_f       =pack(rbulge,mask_f)       
                        xgal_f         =pack(xgal,mask_f)         
                        ygal_f         =pack(ygal,mask_f)         
                        zgal_f         =pack(zgal,mask_f)         
                        gal_type_f     =pack(gal_type,mask_f)     
                        mcold_burst_f  =pack(mcold_burst,mask_f)  
                        strc_f         =pack(strc,mask_f)          
                        dc_f           =pack(dc,mask_f)
                        ra_f           =pack(ra,mask_f)
                        decl_f         =pack(decl,mask_f)

                        deallocate(mhalo,mcold,mcold_atom,mcold_mol,mstars_disk,mstars_bulge,halo_r_virial,rdisk,rbulge,xgal,ygal,zgal,dc,ra,decl,gal_type,mcold_burst,strc)
                           
                        print *, "number =",count_l,", count=",fcount

                        !fcount=fcount-1
                        print *, "star =",fcount_c
                        print *, "count=",fcount
                        if (fcount .gt. 0) then
                            call hdf5_write_data(light_handle,'light_cone/Output/mhalo',mhalo_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))
                            call hdf5_write_data(light_handle,'light_cone/Output/mcold',mcold_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))
                            call hdf5_write_data(light_handle,'light_cone/Output/mcold_atom',mcold_atom_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))
                            call hdf5_write_data(light_handle,'light_cone/Output/mcold_mol',mcold_mol_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))
                            call hdf5_write_data(light_handle,'light_cone/Output/mstars_disk',mstars_disk_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))
                            call hdf5_write_data(light_handle,'light_cone/Output/mstars_bulge',mstars_bulge_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))
                            call hdf5_write_data(light_handle,'light_cone/Output/halo_r_virial',halo_r_virial_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))
                            call hdf5_write_data(light_handle,'light_cone/Output/rdisk',rdisk_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))
                            call hdf5_write_data(light_handle,'light_cone/Output/rbulge',rbulge_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))
                            call hdf5_write_data(light_handle,'light_cone/Output/xgal',xgal_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))
                            call hdf5_write_data(light_handle,'light_cone/Output/ygal',ygal_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))
                            call hdf5_write_data(light_handle,'light_cone/Output/zgal',zgal_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))
                            call hdf5_write_data(light_handle,'light_cone/Output/type',gal_type_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))
                            call hdf5_write_data(light_handle,'light_cone/Output/mcold_burst',mcold_burst_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))
                            call hdf5_write_data(light_handle,'light_cone/Output/strc',strc_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))
                            call hdf5_write_data(light_handle,'light_cone/Output/dc',dc_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))                      
                            call hdf5_write_data(light_handle,'light_cone/Output/ra',ra_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))
                            call hdf5_write_data(light_handle,'light_cone/Output/decl',decl_f,overwrite=.true.,extensible=.true.,start=(/fcount_c/),count=(/fcount/))                      

                        endif
                        fcount_c     = fcount_c+fcount  
                        
                        deallocate(mhalo_f,mcold_f,mcold_atom_f,mcold_mol_f,mstars_disk_f,mstars_bulge_f,halo_r_virial_f,rdisk_f,rbulge_f,xgal_f,ygal_f,zgal_f,dc_f,ra_f,decl_f,gal_type_f,mcold_burst_f,strc_f)
                        deallocate(mask_ra,mask_decl,mask_dc,mask_f)
                        print *, "Lightcone working in: ",direct_repl//'rep_'//trim(adjustl(temp_i))//'_'//trim(adjustl(temp_j))//'_'//trim(adjustl(temp_k))//'/ivol'//trim(adjustl(temp_subv))//'/galaxies.hdf5'
                        count_l=count_l+1

                    enddo
               enddo
            enddo       
        enddo
        call hdf5_close_file(light_handle)
    end subroutine light_cone_select1 
    pure subroutine translation(x,y,z,n,ngal,lbox,x_new,y_new,z_new)
    ! subroutine to perform change of coordinates due to replications
    ! x,y,z: coordinates of galaxies prior to the replications
    ! n: number of replication
    ! ngal: number of galaxies
    ! lbox: box size
    ! x_new,y_new,z_new: new coordinates
        integer, intent(in) :: n,ngal
        real,dimension(ngal), intent(in)    :: x,y,z
        real, intent(in) :: lbox
        real,dimension(ngal), intent(out)   :: x_new,y_new,z_new
        x_new=x+lbox*real(n)
        y_new=y+lbox*real(n)
        z_new=z+lbox*real(n)
    end subroutine translation
    pure subroutine spherical(x,y,z,ra,decl)
    ! subroutine to convert from cartesian coordinates to spherical coordinates
    ! cartesian coordinates : x,y,z
    ! spherical coordinates : ra:right ascension, decl: declination
        real, intent(in) :: x,y,z
        real, intent(out) :: ra,decl
        real :: d

        d=sqrt(x**2.0+y**2.0+z**2.0)
        decl=asin(z/d)
        ra=atan2(y,x) 

    end subroutine spherical 
    
end module lc_helper
program main_light_cones
    use lc_helper
    use hdf5_wrapper
    !use HDF5_Utils_Module
    !The main program for generating lightcones
    !ux_obs,uy_obs,uz_obs:these are the unit vectors that define the direction of the line of sight of the lightcone
    ! theta_open: the opening angle of the light cone
    implicit none
    !simulation box inputs
    real :: lbox !box size of the simulation
    integer :: nsub_vols ! number of subvolumes of the box
    ! Survey inputs
    real :: l_survey ! comoving distance of the survey 
    ! observational inputs
    integer :: n_obs ! number of observers
    real :: x_obs,y_obs,z_obs,ux_obs,uy_obs,uz_obs,theta_open
    
    ! replications variables
    integer :: n_rep_a,n_rep_t !number of replications to be done
    logical :: log_rep !logical to confirm replications
    integer :: rep_count !variable to count the replications
    
    real, allocatable, dimension(:)    :: x_i,y_i,z_i,x_new,y_new,z_new
    ! addresses of folders
    character(*), parameter :: direct1='/home/ggranda/galform_out/v2.6.0/aquarius_trees/micro-surfs-velociraptor/surfs-midi/iz200/' ! directory of the folder that contains all the subvolumes
    character(*), parameter :: direct2='/mnt/su3ctm/ggranda/cp_test2/' ! directory containing the replications
    character(5), parameter :: redshift='iz200'
    ! translation variable
    logical :: log_translation
    ! computation variables
    integer :: i,j,k,l,rank,dims(1)
    character(3) :: temp_rep,temp_subv
    logical       :: readonly
    ! lightcone select 
    real :: ra1,ra2
    real :: decl1,decl2
    real :: dc1,dc2
    character(*), parameter :: direct_out='/mnt/su3ctm/ggranda/lightcones/lightcones.hdf5'
    logical :: log_lightcone 
    !###########################Hardcode###################################
    n_obs=1
    l_survey= 100.0 ! the comoving depth of the survey in Mpc
    lbox= 100.0      ! the box size of the simulation in Mpc
    nsub_vols=64
    log_rep=.false.  ! log to decide copying or not
    readonly=.false. ! log to decide readonly 
    log_translation=.true.
    log_lightcone = .false.
    ra1=-2.836
    ra2=2.860
    decl1=-2.836
    decl2=2.836
    dc1=10.0
    dc2=2000.0
    !#################### CODE ####################################################
    n_rep_a=(nint(l_survey/lbox+1)) !n_rep_a: number of replications per axis
    n_rep_t=int((2*n_rep_a+1)**3.0) !n_rep_t: total number of replications
    !call system('ls '// direct1) 
    n_rep_t=2 ! just to test code
    if (log_rep) then
        call replications(n_rep_t,direct1,direct2)
    endif
    rep_count=1

    if (log_translation) then
        call replications_translation(n_rep_a,nsub_vols,lbox,direct2)
    endif
    
    if (log_lightcone) then
        call light_cone_select1(n_rep_a,nsub_vols,ra1,ra2,decl1,decl2,dc1,dc2,direct2,direct_out)  
    endif
     
end program main_light_cones

Attachment: Makefile
Description: Binary data
