Re: [Pw_forum] Using -nimage with phonon at q=0

2015-08-06 Thread Andrea Dal Corso
On Thu, 2015-08-06 at 14:00 +0200, Merlin Meheut wrote:
> On 05/08/2015 18:50, Andrea Dal Corso wrote:
> > On Wed, 2015-08-05 at 17:18 +0200, Merlin Meheut wrote:
> >> Dear PWSCF users,
> >>
> >> I recently discovered with great interest the possibilities to
> >> parallelize phonon calculations using the -nimage option of ph.x.
> >> (example given in espresso-4.3.2/examples/GRID_examples).
> >>
> > GRID examples refer to grid splitting, meaning that you split the
> > calculation using start_irr, last_irr etc. and in principle the
> > calculation can run on different machines. Then you have to collect the
> > files yourself to finally obtain the results.
> 
> Sorry I mixed up both. Actually, I used -nimage (as explained in 
> Doc/INPUT_PH.txt of version 4.3.2), but I would like to restart with 
> grid splitting.
> Thank you for the reference to the example.
> 
> >> However, I had a problem when performing calculations at gamma-point:
> >> for other q-points (therefore with epsil=.false. and zue=.false.)
> >> everything went as planned, but with q=0 (and epsil=.true. and
> >> zue=.true.), this just did not work. I took 80 processors divided into 4
> >> images, and instead of dividing the different representations into 4
> >> pools, the four groups of processors realized the same calculation,
> >> computing the same representations. I killed the calculation at some
> >> point (I have computed the electric fields, effective charges and 218
> >> representations out of 564). I would like now to finish the computation
> >> without redoing it, and I have several questions to achieve this goal:
> >>
> > Are you sure that all the images made the same phonon calculations, or
> > all images made the electric field calculation but the phonon modes were
> > different?
> 
> The names of the partial dynamical matrices (dynmat.1.$irr.xml) are the 
> same, the files (e.g. _ph0/LiClMag2-1.phsave/dynmat.1.100.xml and
> _ph1/LiClMag2-1.phsave/dynmat.1.100.xml) are almost identical, and the 
> calculation lasted longer than it should have (each of the 4 separate 
> images has computed more than 141 representations, which is one quarter 
> of the total). So as far as I understand it, all the images made 
> the electric-field calculation and started making the same phonon-mode 
> calculations in parallel.
> 

Thank you for reporting this. What is not supported is the -nimage
option together with the gamma-gamma tricks that you are using in the
Gamma calculation.
One easy solution is to set nogg=.TRUE. in the phonon input if you do
not need the gamma-gamma tricks to speed up the calculation.
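
For concreteness, here is a sketch of what that change would look like in
the poster's &inputph namelist (an untested illustration; only the nogg
line is new, the other values are copied from the input quoted further
down this thread):

```
 &inputph
    prefix = 'LiClMag2-1',
    fildyn = 'mat.$PREFIX',
    tr2_ph = 1.0D-18,
    epsil  = .true.,
    zue    = .true.,
    nogg   = .true.,   ! disable gamma-gamma tricks so -nimage can split irreps
    outdir = '$WORKDIR',
 /
  0.000   0.000   0.000
```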


Andrea




> > In this case, since you stopped the calculation, you can only
> > collect the .xml files in a single _ph0 directory and restart without
> > images. The restart with images is still poorly supported.
> 
>   I would like to restart without -nimage (since it seems to fail in
> this particular case), but with grid splitting. Do you think this is
> presumptuous?
> By the way, I had a positive experience restarting with -nimage on
> another run (at q ≠ 0).
> 
> > In the last version of QE, the dielectric constant and effective charges
> > are saved in the tensors.xml file in the _ph0 directory.
> 
> Thank you very much for your help!
> 
> Best regards,
> 
> 
> -- 
> Merlin Méheut, Géosciences et Environnement Toulouse,
> OMP, 14 avenue Edouard Belin, 31400 Toulouse, France
> 
> phone +33 (0)5 61 33 26 17, fax +33 (0)5 61 33 25 60
> 
> ___
> Pw_forum mailing list
> Pw_forum@pwscf.org
> http://pwscf.org/mailman/listinfo/pw_forum

-- 
Andrea Dal Corso        Tel. 0039-040-3787428
SISSA, Via Bonomea 265  Fax. 0039-040-3787249
I-34136 Trieste (Italy) e-mail: dalco...@sissa.it



Re: [Pw_forum] Using -nimage with phonon at q=0

2015-08-06 Thread Mahya Zare
Is the input file correct?

On Thu, Aug 6, 2015 at 4:30 PM, Merlin Meheut wrote:

> On 05/08/2015 18:50, Andrea Dal Corso wrote:
> > On Wed, 2015-08-05 at 17:18 +0200, Merlin Meheut wrote:
> >> Dear PWSCF users,
> >>
> >> I recently discovered with great interest the possibilities to
> >> parallelize phonon calculations using the -nimage option of ph.x.
> >> (example given in espresso-4.3.2/examples/GRID_examples).
> >>
> > GRID examples refer to grid splitting, meaning that you split the
> > calculation using start_irr, last_irr etc. and in principle the
> > calculation can run on different machines. Then you have to collect the
> > files yourself to finally obtain the results.
>
> Sorry I mixed up both. Actually, I used -nimage (as explained in
> Doc/INPUT_PH.txt of version 4.3.2), but I would like to restart with
> grid splitting.
> Thank you for the reference to the example.
>
> >> However, I had a problem when performing calculations at gamma-point:
> >> for other q-points (therefore with epsil=.false. and zue=.false.)
> >> everything went as planned, but with q=0 (and epsil=.true. and
> >> zue=.true.), this just did not work. I took 80 processors divided into 4
> >> images, and instead of dividing the different representations into 4
> >> pools, the four groups of processors realized the same calculation,
> >> computing the same representations. I killed the calculation at some
> >> point (I have computed the electric fields, effective charges and 218
> >> representations out of 564). I would like now to finish the computation
> >> without redoing it, and I have several questions to achieve this goal:
> >>
> > Are you sure that all the images made the same phonon calculations, or
> > all images made the electric field calculation but the phonon modes were
> > different?
>
> The names of the partial dynamical matrices (dynmat.1.$irr.xml) are the
> same, the files (e.g. _ph0/LiClMag2-1.phsave/dynmat.1.100.xml and
> _ph1/LiClMag2-1.phsave/dynmat.1.100.xml) are almost identical, and the
> calculation lasted longer than it should have (each of the 4 separate
> images has computed more than 141 representations, which is one quarter
> of the total). So as far as I understand it, all the images made
> the electric-field calculation and started making the same phonon-mode
> calculations in parallel.
>
> > In this case, since you stopped the calculation, you can only
> > collect the .xml files in a single _ph0 directory and restart without
> > images. The restart with images is still poorly supported.
>
>   I would like to restart without -nimage (since it seems to fail in
> this particular case), but with grid splitting. Do you think this is
> presumptuous?
> By the way, I had a positive experience restarting with -nimage on
> another run (at q ≠ 0).
>
> > In the last version of QE, the dielectric constant and effective charges
> > are saved in the tensors.xml file in the _ph0 directory.
>
> Thank you very much for your help!
>
> Best regards,
>
>
> --
> Merlin Méheut, Géosciences et Environnement Toulouse,
> OMP, 14 avenue Edouard Belin, 31400 Toulouse, France
>
> phone +33 (0)5 61 33 26 17, fax +33 (0)5 61 33 25 60
>


[Attachment: relax.in (binary data)]

Re: [Pw_forum] Using -nimage with phonon at q=0

2015-08-06 Thread Merlin Meheut
On 05/08/2015 18:50, Andrea Dal Corso wrote:
> On Wed, 2015-08-05 at 17:18 +0200, Merlin Meheut wrote:
>> Dear PWSCF users,
>>
>> I recently discovered with great interest the possibilities to
>> parallelize phonon calculations using the -nimage option of ph.x.
>> (example given in espresso-4.3.2/examples/GRID_examples).
>>
> GRID examples refer to grid splitting, meaning that you split the
> calculation using start_irr, last_irr etc. and in principle the
> calculation can run on different machines. Then you have to collect the
> files yourself to finally obtain the results.

Sorry, I mixed the two up. Actually, I used -nimage (as explained in
Doc/INPUT_PH.txt of version 4.3.2), but I would now like to restart
with grid splitting.
Thank you for the reference to the example.

>> However, I had a problem when performing calculations at gamma-point:
>> for other q-points (therefore with epsil=.false. and zue=.false.)
>> everything went as planned, but with q=0 (and epsil=.true. and
>> zue=.true.), this just did not work. I took 80 processors divided into 4
>> images, and instead of dividing the different representations into 4
>> pools, the four groups of processors realized the same calculation,
>> computing the same representations. I killed the calculation at some
>> point (I have computed the electric fields, effective charges and 218
>> representations out of 564). I would like now to finish the computation
>> without redoing it, and I have several questions to achieve this goal:
>>
> Are you sure that all the images made the same phonon calculations, or
> all images made the electric field calculation but the phonon modes were
> different?

The names of the partial dynamical matrices (dynmat.1.$irr.xml) are the
same, the files (e.g. _ph0/LiClMag2-1.phsave/dynmat.1.100.xml and
_ph1/LiClMag2-1.phsave/dynmat.1.100.xml) are almost identical, and the
calculation lasted longer than it should have (each of the 4 separate
images computed more than 141 representations, which is one quarter of
the total). So as far as I understand it, all the images made the
electric-field calculation and then started the same phonon-mode
calculations in parallel.

> In this case, since you stopped the calculation, you can only
> collect the .xml files in a single _ph0 directory and restart without
> images. The restart with images is still poorly supported.

  I would like to restart without -nimage (since it seems to fail in
this particular case), but with grid splitting. Do you think this is
presumptuous?
By the way, I had a positive experience restarting with -nimage on
another run (at q ≠ 0).

> In the last version of QE, the dielectric constant and effective charges
> are saved in the tensors.xml file in the _ph0 directory.

Thank you very much for your help!

Best regards,


-- 
Merlin Méheut, Géosciences et Environnement Toulouse,
OMP, 14 avenue Edouard Belin, 31400 Toulouse, France

phone +33 (0)5 61 33 26 17, fax +33 (0)5 61 33 25 60


Re: [Pw_forum] Using -nimage with phonon at q=0

2015-08-05 Thread Andrea Dal Corso
On Wed, 2015-08-05 at 17:18 +0200, Merlin Meheut wrote:
> Dear PWSCF users,
> 
> I recently discovered with great interest the possibilities to 
> parallelize phonon calculations using the -nimage option of ph.x. 
> (example given in espresso-4.3.2/examples/GRID_examples).
> 
GRID examples refer to grid splitting, meaning that you split the
calculation using start_irr, last_irr, etc., and in principle the
pieces can run on different machines. Then you have to collect the
files yourself to finally obtain the results.

Image splitting with the -nimage option is shown in Image_example. In
this case the calculation is split automatically, but at the end you
need to run another calculation without images to collect the results.
Please use QE 5.2 if you want to use these features.
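
The grid splitting described above can be sketched as a small script that
generates one ph.x input per chunk of irreps. The file names (ph_part*.in),
the even four-way split, and the reuse of the poster's 564-irrep count are
illustrative assumptions, not a prescribed workflow:

```shell
# Split a 564-irrep Gamma calculation into 4 independent ph.x jobs
# via start_irr/last_irr; each generated input covers one quarter.
nirr=564
njobs=4
chunk=$(( nirr / njobs ))          # 141 irreps per job
for i in 1 2 3 4; do
  start=$(( (i - 1) * chunk + 1 ))
  last=$(( i * chunk ))
  cat > "ph_part${i}.in" <<EOF
 &inputph
    prefix = 'LiClMag2-1',
    fildyn = 'mat.LiClMag2-1',
    tr2_ph = 1.0d-18,
    epsil = .true.,
    zue = .true.,
    start_irr = ${start},
    last_irr = ${last},
 /
  0.0 0.0 0.0
EOF
done
# Each ph_partN.in can run on a different machine; a final ph.x run without
# start_irr/last_irr then collects the pieces into the full dynamical matrix.
```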


> However, I had a problem when performing calculations at gamma-point: 
> for other q-points (therefore with epsil=.false. and zue=.false.) 
> everything went as planned, but with q=0 (and epsil=.true. and 
> zue=.true.), this just did not work. I took 80 processors divided into 4 
> images, and instead of dividing the different representations into 4 
> pools, the four groups of processors realized the same calculation, 
> computing the same representations. I killed the calculation at some 
> point (I have computed the electric fields, effective charges and 218 
> representations out of 564). I would like now to finish the computation 
> without redoing it, and I have several questions to achieve this goal:
> 

Are you sure that all the images made the same phonon calculations, or
did all the images make the electric-field calculation while the phonon
modes were different? In this case, since you stopped the calculation,
you can only collect the .xml files into a single _ph0 directory and
restart without images. Restarting with images is still poorly
supported.
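
Collecting the per-image .xml files into the image-0 tree could be
sketched as below. The directory layout mirrors the one quoted in this
thread (_ph0/LiClMag2-1.phsave etc.); the fixture section only simulates
that layout so the copy loop can be demonstrated end to end, and with real
data you would skip it:

```shell
# Gather per-image partial dynamical matrices into _ph0 before
# restarting without -nimage.
prefix=LiClMag2-1
# --- fixture: fake what four images might have left behind ---
for img in 0 1 2 3; do
  mkdir -p "_ph${img}/${prefix}.phsave"
done
touch "_ph0/${prefix}.phsave/dynmat.1.1.xml" \
      "_ph1/${prefix}.phsave/dynmat.1.2.xml" \
      "_ph2/${prefix}.phsave/dynmat.1.3.xml"
# --- the actual collection: copy each image's .xml files into _ph0,
#     never overwriting files image 0 already has (cp -n) ---
for img in 1 2 3; do
  cp -n "_ph${img}/${prefix}.phsave/"*.xml "_ph0/${prefix}.phsave/" 2>/dev/null || true
done
ls "_ph0/${prefix}.phsave"
```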

> - Is there a particular procedure for using -nimage with epsil=.true. 
> and zue=.true., or is that just not foreseen? In other words, did I miss 
> something?
> - Following the same idea, if I want to build my dynamical matrix, with 
> effective charges and dielectric tensor, via a ph.x run with 
> "recover=.true.", can I do it, and if so, which files do I need in 
> _ph0? In particular, which files contain the information on the 
> dielectric tensor and effective charges? In other words, are there 
> special guidelines, beyond the ones given in 
> espresso-4.3.2/Doc/INPUT_PH.txt, for separating the phonon calculation 
> into several jobs when epsil=.true. and zue=.true.?

In the latest version of QE, the dielectric constant and effective
charges are saved in the tensors.xml file in the _ph0 directory.
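
A restart along these lines would only need recover=.true. added to the
namelist, with the collected files already in _ph0. This is an untested
sketch reusing the poster's settings, not a verified input:

```
 &inputph
    prefix  = 'LiClMag2-1',
    fildyn  = 'mat.$PREFIX',
    tr2_ph  = 1.0D-18,
    epsil   = .true.,
    zue     = .true.,
    recover = .true.,   ! resume from what is already in _ph0
    outdir  = '$WORKDIR',
 /
  0.000   0.000   0.000
```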

HTH,

Andrea


> (  ADDITIONAL INFORMATION lines 562-end)
> 
> Thank you in advance for any help,
> 
> the version of qe is 5.1
> 
> I did first a scf calculation on 20 processors:
> 
> -scf input file --
>   &control
> calculation = 'scf',
>restart_mode = 'from_scratch' ,
>  prefix = 'LiClMag2-1',
> disk_io = 'default' ,
>  pseudo_dir = '$WORKDIR',
>  outdir = '${WORKDIR}',
>  tprnfor= .true.,
>  tstress= .true.,
> /&end
>   &system
>  ibrav = 0, celldm(1)=23.3535,
>  nat = 188, ntyp = 4, ecutwfc = ${a}.0, ecutrho=${b}.0
> /&end
>   &electrons
> electron_maxstep = 100,
>conv_thr = 1.d-8,
> mixing_mode = 'plain',
> startingwfc = 'atomic',
> mixing_beta = 0.5,
> /&end
> ATOMIC_SPECIES
>    Li   7.0160   Li.blyp-sl-rrkjus_psl.1.0.0.UPF
>    O   15.9949   O.blyp.UPF
>    H    1.0079   H.blyp2.UPF
>    Cl  34.9689   Cl.blyp-nl-rrkjus_psl.1.0.0.UPF
> 
> ATOMIC_POSITIONS (angstrom)
> (...)
> K_POINTS {crystal}
>   1
>   0.0 0.0 0.0 1
> 
> CELL_PARAMETERS { cubic }
>    1.0   0.0   0.0
>    0.0   1.0   0.0
>    0.0   0.0   1.0
> 
> 
> the scf was run on 20 processors
> 
> srun  pw.x -npool 1 < scf.${PREFIX}.inp > scf.${PREFIX}.out
> 
> the ph input is :
> --
>   &inputph
> amass(1)= 7.0160,
> amass(2)=15.9949,
> amass(3)= 1.0079,
> amass(4)=34.9689,
> ! ldisp=.true., nq1=2, nq2=2, nq3=2,
> alpha_mix(1) = 0.7,
> tr2_ph =  1.0D-18,
> prefix='LiClMag2-1',
> fildyn='mat.$PREFIX',
> epsil =.true.,
> trans =.true.,
> zue = .true.,
> lraman=.false.,
> outdir = '$WORKDIR',
> ! max_seconds= 18,
> /&end
>   0.000   0.000   0.000
> ---
> 
> It was run on 20 processors using:
> 
> srun ph.x -npool 1 -nimage 4 <  ph.${PREFIX}.inp > ph.${PREFIX}.out
> 
> 
> -- 
> Merlin Méheut, Géo

[Pw_forum] Using -nimage with phonon at q=0

2015-08-05 Thread Merlin Meheut

Dear PWSCF users,

I recently discovered with great interest the possibility of
parallelizing phonon calculations using the -nimage option of ph.x
(an example is given in espresso-4.3.2/examples/GRID_examples).

However, I had a problem when performing calculations at the gamma
point: for other q-points (therefore with epsil=.false. and
zue=.false.) everything went as planned, but at q=0 (with epsil=.true.
and zue=.true.) it just did not work. I took 80 processors divided into
4 images, and instead of dividing the different representations among
the 4 images, the four groups of processors performed the same
calculation, computing the same representations. I killed the
calculation at some point (having computed the electric fields,
effective charges, and 218 representations out of 564). I would now
like to finish the computation without redoing it, and I have several
questions toward this goal:

- Is there a particular procedure for using -nimage with epsil=.true. 
and zue=.true., or is that just not foreseen? In other words, did I miss 
something?
- Following the same idea, if I want to build my dynamical matrix, with 
effective charges and dielectric tensor, via a ph.x run with 
"recover=.true.", can I do it, and if so, which files do I need in 
_ph0? In particular, which files contain the information on the 
dielectric tensor and effective charges? In other words, are there 
special guidelines, beyond the ones given in 
espresso-4.3.2/Doc/INPUT_PH.txt, for separating the phonon calculation 
into several jobs when epsil=.true. and zue=.true.?
(  ADDITIONAL INFORMATION lines 562-end)

Thank you in advance for any help,

the version of qe is 5.1

I did first a scf calculation on 20 processors:

-scf input file --
  &control
calculation = 'scf',
   restart_mode = 'from_scratch' ,
 prefix = 'LiClMag2-1',
disk_io = 'default' ,
 pseudo_dir = '$WORKDIR',
 outdir = '${WORKDIR}',
 tprnfor= .true.,
 tstress= .true.,
/&end
  &system
 ibrav = 0, celldm(1)=23.3535,
 nat = 188, ntyp = 4, ecutwfc = ${a}.0, ecutrho=${b}.0
/&end
  &electrons
electron_maxstep = 100,
   conv_thr = 1.d-8,
mixing_mode = 'plain',
startingwfc = 'atomic',
mixing_beta = 0.5,
/&end
ATOMIC_SPECIES
   Li   7.0160   Li.blyp-sl-rrkjus_psl.1.0.0.UPF
   O   15.9949   O.blyp.UPF
   H    1.0079   H.blyp2.UPF
   Cl  34.9689   Cl.blyp-nl-rrkjus_psl.1.0.0.UPF

ATOMIC_POSITIONS (angstrom)
(...)
K_POINTS {crystal}
  1
  0.0 0.0 0.0 1

CELL_PARAMETERS { cubic }
   1.0   0.0   0.0
   0.0   1.0   0.0
   0.0   0.0   1.0


the scf was run on 20 processors

srun  pw.x -npool 1 < scf.${PREFIX}.inp > scf.${PREFIX}.out

the ph input is :
--
  &inputph
amass(1)= 7.0160,
amass(2)=15.9949,
amass(3)= 1.0079,
amass(4)=34.9689,
! ldisp=.true., nq1=2, nq2=2, nq3=2,
alpha_mix(1) = 0.7,
tr2_ph =  1.0D-18,
prefix='LiClMag2-1',
fildyn='mat.$PREFIX',
epsil =.true.,
trans =.true.,
zue = .true.,
lraman=.false.,
outdir = '$WORKDIR',
! max_seconds= 18,
/&end
  0.000   0.000   0.000
---

It was run on 20 processors using:

srun ph.x -npool 1 -nimage 4 <  ph.${PREFIX}.inp > ph.${PREFIX}.out


-- 
Merlin Méheut, Géosciences et Environnement Toulouse,
OMP, 14 avenue Edouard Belin, 31400 Toulouse, France
Université Paul Sabatier - Toulouse 3

  phone +33 (0)5 61 33 26 17, fax +33 (0)5 61 33 25 60
