Hi all! I have had some trouble implementing the GRID phonon calculation, which splits the calculation over q points and irreducible representations.
I have spent the last few days trying to do the splitting, following the run_example script in the GRID_example folder. I am running QE v4.3 on an HPC machine (henry2, an IBM BladeCenter Linux cluster with 981 dual-Xeon compute nodes, a mix of single-, dual-, quad-, and six-core Intel Xeon processors, 2-4 GB of distributed memory per core, and dual gigabit Ethernet interconnects); 8-16 CPUs are assigned to each split job.

After some calculations, I found that the splitting is actually not feasible on the HPC machine. Before you can split the job over q points and representations, you have to copy PREFIX.save (generated by the scf calculation) into each subdirectory where a ph calculation will run separately. If the system is even slightly large (more than 20 atoms in the unit cell), PREFIX.save is several GB, so if you want to split over 3 q points and 48 representations, you have to copy the PREFIX.save files 3 x 48 = 144 times (144 x 3 GB + 144 x 5 GB > 1 TB). I had a script doing this job, but it quickly used up the disk space (it consumed more than 1 TB), which makes this approach unusable on the HPC machine.

Is there any way to do the split phonon calculation without copying the PREFIX.save generated by scf into each subdirectory? Can we let all the split phonon calculations look for PREFIX.save in the same input directory, so that we do not need to copy files around? Any comment on how to do the splitting without using up the disk space will be greatly appreciated!

Thank you,
Rui Mao

Part of the script that copies the PREFIX.save files generated by scf into each subdirectory:

# one working directory per (q point, representation) pair
mkdir $TMP_DIR/$q.$irr
# duplicate the multi-GB scf data for this job
cp -r $TMP_DIR/$PREFIX.save $TMP_DIR/$q.$irr
# duplicate the phonon bookkeeping files as well
mkdir -p $TMP_DIR/$q.$irr/_ph0/$PREFIX.phsave
cp -r $TMP_DIR/_ph0/$PREFIX.phsave/* $TMP_DIR/$q.$irr/_ph0/$PREFIX.phsave

SCF input file:

&control
   calculation = 'scf',
   restart_mode = 'from_scratch',
   wf_collect = .true.,
   prefix = 'graphite',
   tstress = .true.,
   tprnfor = .true.,
   forc_conv_thr = 1.0D-3,
   etot_conv_thr = 1.0D-4,
   pseudo_dir = '/home/rmao/Pseudo/',
   outdir = '/share2/Ray/grid_graphite/scf',
/
&system
   ibrav = 4,
   celldm(1) = 9.2961,
   celldm(3) = 1.3579,
   nat = 16,
   ntyp = 1,
   ecutwfc = 70,
   ecutrho = 700,
   occupations = 'smearing',
   smearing = 'mp',
   degauss = 0.003,
/
&electrons
   diagonalization = 'cg',
   mixing_mode = 'plain',
   mixing_beta = 0.5,
   conv_thr = 1.0d-8,
/
ATOMIC_SPECIES
 C 12.01070 C.pz-rrkjus.UPF
ATOMIC_POSITIONS {angstrom}
 C  1.183416184  0.709967199 -3.339999736
 C  1.183397032  2.130056591 -3.339999831
 C  2.413378375  2.839955863 -3.339999742
 C -0.046321206  2.840033775 -3.339999742
 C  2.413196667  4.260044397 -3.339999829
 C -0.046512842  4.260108061 -3.339999831
 C  1.183461168  4.970022576 -3.339999741
 C  3.643012948  2.129829924 -3.339999823
 C  0.000008073 -0.000036198 -0.000000172
 C -0.000009822  1.420051269 -0.000000260
 C  1.229970973  2.129951526 -0.000000171
 C -1.229729485  2.130029208 -0.000000172
 C  1.229791025  3.550039979 -0.000000260
 C -1.229918926  3.550102386 -0.000000260
 C  0.000053606  4.260017465 -0.000000171
 C  2.459606230  1.419825978 -0.000000260
K_POINTS {automatic}
 8 8 16 0 0 0

PH input file:

phonons of Graphite_2x2
&inputph
   tr2_ph = 1.0d-16,
   alpha_mix(1) = 0.5,
   prefix = 'graphite',
   ldisp = .true.,
   epsil = .false.,
   recover = .false.,
   reduce_io = .false.,
   amass(1) = 12.01070,
   start_irr = 0,
   last_irr = 0,
   nq1 = 1, nq2 = 1, nq3 = 4,
   outdir = '/share2/Ray/grid_graphite/scf',
   fildyn = 'graphite.dyn'
/
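For reference, the PH input above (with start_irr = 0, last_irr = 0) is the initialization run that only generates the displacement patterns. If I understand run_example correctly, each split job then restricts itself to one q point and one representation by setting, in its own &inputph (with $q and $irr being the loop variables from my script and outdir pointing at that job's subdirectory):

   start_q = $q, last_q = $q,
   start_irr = $irr, last_irr = $irr,
   outdir = '$TMP_DIR/$q.$irr',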
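To make the question concrete, the kind of thing I am hoping for would look like the sketch below: replace the cp -r of PREFIX.save in my script with a symbolic link, so that every split job reads the same scf data while keeping its own private _ph0 directory. This is an untested idea of mine, not something from GRID_example, and it assumes ph.x treats PREFIX.save as read-only and confines its writes to _ph0 ($dir is just a shorthand I introduce here):

# untested sketch: share one copy of the scf data via symbolic links
for q in 1 2 3; do                    # the 3 q points
   for irr in $(seq 1 48); do         # the 48 representations
      dir=$TMP_DIR/$q.$irr
      mkdir -p $dir/_ph0/$PREFIX.phsave
      # link instead of copy: no extra GBs per job
      ln -s $TMP_DIR/$PREFIX.save $dir/$PREFIX.save
      # the pattern files are small, so copying them is harmless
      cp -r $TMP_DIR/_ph0/$PREFIX.phsave/* $dir/_ph0/$PREFIX.phsave
   done
done

Would something like this work, or does ph.x need write access to PREFIX.save?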
--
Rui Mao
============================================================
Department of Electrical and Computer Engineering (ECE)
North Carolina State University (NCSU)
Raleigh, NC, 27606
Email: rmao at ncsu.edu
Email: ruimao20 at gmail.com
============================================================