Hello,

1) Ah, I did not notice that.  I generally suppress I/O for test jobs, but on 
one of them I ran a follow-up phonon calculation to test timings, so I/O was 
turned on for that run.  The timing difference persists regardless of the I/O 
setting and also carried over into the phonon calculation (roughly 1hr//1.5hr).

2) I have run it multiple times and the timing is reproducible.  As mentioned 
above, the timing issue exists in ph.x as well.  The machine has 16 physical 
cores and 32 hyperthreads.

3) I have run with mpirun -np 1, which I believe corresponds to 1 MPI rank.  
The CPU and wall timings are much more consistent in that case, but I must 
confess I am not experienced enough to understand what this result says about 
the cause of my issue.
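
For reference, the comparison boils down to something like the following 
(pw.in and the output file names here are just placeholders for my actual 
files):

    # single-rank run, to take MPI out of the picture
    mpirun -np 1 pw.x -in pw.in > pw_np1.out
    # full 16-rank run, as in the attached outputs
    mpirun -np 16 pw.x -in pw.in > pw_np16.out
    # total CPU/wall times are on the final "PWSCF ... WALL" line of each output
    grep 'PWSCF.*WALL' pw_np1.out pw_np16.out

with the single-rank timings compared between the two builds.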

Thanks,
Brad

--------------------------------------------------------
Bradly Baer
Graduate Research Assistant, Walker Lab
Interdisciplinary Materials Science
Vanderbilt University


________________________________
From: users <users-boun...@lists.quantum-espresso.org> on behalf of Ye Luo 
<xw111lu...@gmail.com>
Sent: Thursday, June 3, 2021 4:22 PM
To: Quantum ESPRESSO users Forum <users@lists.quantum-espresso.org>
Subject: Re: [QE-users] Compiling with Intel's OneAPI - Parallel Performance 
Issues

Hi Brad,
1. Your output files differ: one says 'Writing output data file ./pwscf.save' 
and the other doesn't. Does one run have I/O enabled and the other not?
2. Your simulation is very small and you are running 16 MPI ranks, so you are 
largely exercising MPI overhead. Run it a couple of times and see if the timing 
is reproducible. Does your machine have 16 physical cores, or 8 physical cores 
with 16 hyperthreads? (A quick way to check is sketched after point 3.)
3. To validate it is actually a compiler regression, run with 1 MPI rank and 
compare the timing.
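
For the core-count question in point 2, something along these lines should tell 
you (pw.in stands for your input file; the binding flag shown is the Open MPI 
spelling, while Intel MPI uses the I_MPI_PIN environment variables instead):

    # physical cores vs. hardware threads
    lscpu | grep -E 'Socket\(s\)|Core\(s\) per socket|Thread\(s\) per core'
    # make sure the 16 ranks land on 16 physical cores (Open MPI example)
    mpirun -np 16 --bind-to core pw.x -in pw.in > pw_16cores.out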

Ye
===================
Ye Luo, Ph.D.
Computational Science Division & Leadership Computing Facility
Argonne National Laboratory


On Thu, Jun 3, 2021 at 3:36 PM Baer, Bradly 
<bradly.b.b...@vanderbilt.edu> wrote:
Hello Users,

I have a working QE 6.7 install built with Intel Parallel Studio 2020.  I want 
to compile the d3q code, but my Parallel Studio license has expired, so I must 
switch to Intel's new OneAPI distribution to continue using ifort, icc, etc.  I 
have configured everything the same way as for the Parallel Studio build, 
including using the same make.inc file, but parallel performance is much worse 
with the OneAPI version.

Attached are the make.inc file I used for both compiles and example output 
files from pw.x compiled with Parallel Studio and with OneAPI. The Parallel 
Studio calculation had CPU//wall times of 5.89s//5.97s, while the OneAPI 
version shows 5.92s//8.71s, almost a 50% increase in wall time.  Both runs used 
the same input.

Has anyone had experience compiling with the new OneAPI toolchain?  Have I 
missed some small but important change in how the libraries are linked now?
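
For context, the MKL-related part of a typical ifort make.inc looks roughly 
like the sketch below; this is based on the standard MKL link line rather than 
being a verbatim copy of my attached file, and whether any of it needs to 
change under OneAPI is exactly what I am unsure about:

    # MKL linking as it typically appears in make.inc for an ifort/MPI build (sketch)
    DFLAGS         = -D__DFTI -D__MPI -D__SCALAPACK
    BLAS_LIBS      = -L$(MKLROOT)/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core
    LAPACK_LIBS    =
    SCALAPACK_LIBS = -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64
    FFT_LIBS       =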

Thanks,
Brad

--------------------------------------------------------
Bradly Baer
Graduate Research Assistant, Walker Lab
Interdisciplinary Materials Science
Vanderbilt University



Attachment: MPI1ParallelStudio.out
Attachment: MPI1OneAPI.out

_______________________________________________
Quantum ESPRESSO is supported by MaX (www.max-centre.eu)
users mailing list users@lists.quantum-espresso.org
https://lists.quantum-espresso.org/mailman/listinfo/users
