[gmx-users] Should I use separate PME nodes

2010-06-25  Gaurav Goel
I ran my simulation in parallel on 4 nodes (with zero separate PME
nodes). Below is the information printed in md.log.

I see that the PME mesh calculations took 60% of the CPU time. Do you
have any recommendations on whether one or more separate PME nodes
would speed things up?


 Computing:                         M-Number         M-Flops  % Flops
-----------------------------------------------------------------------
 Coul(T) + VdW(T)             1761401.496982   119775301.795     20.2
 Outer nonbonded loop          106414.135764     1064141.358      0.2
 Calc Weights                   32400.006480     1166400.233      0.2
 Spread Q Bspline             2332800.466560     4665600.933      0.8
 Gather F Bspline             2332800.466560    27993605.599      4.7
 3D-FFT                      47185929.437184   377487435.497     63.6
 Solve PME                     675840.135168    43253768.651      7.3
 NS-Pairs                      823453.927656    17292532.481      2.9
 Reset In Box                    2160.002160        6480.006      0.0
 CG-CoM                          2160.004320        6480.013      0.0
 Virial                         11700.002340      210600.042      0.0
 Ext.ens. Update                10800.002160      583200.117      0.1
 Stop-CM                        10800.002160      108000.022      0.0
 Calc-Ekin                      10800.004320      291600.117      0.0
-----------------------------------------------------------------------
 Total                                         593905146.863    100.0
-----------------------------------------------------------------------

  R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G

 Computing:         Nodes   Number     G-Cycles     Seconds      %
-------------------------------------------------------------------
 Domain decomp.         4      101     3859.416      1488.1    0.6
 Comm. coord.           4      501     1874.635       722.8    0.3
 Neighbor search        4      101    78640.722     30322.2   11.2
 Force                  4      501   180659.902     69658.5   25.8
 Wait + Comm. F         4      501     2578.994       994.4    0.4
 PME mesh               4      501   422268.834    162817.7   60.4
 Write traj.            4    10001       17.526         6.8    0.0
 Update                 4      501     2981.794      1149.7    0.4
 Comm. energies         4      501     2633.176      1015.3    0.4
 Rest                   4             3580.341       1380.5    0.5
-------------------------------------------------------------------
 Total                  4           699095.342    269556.0  100.0
-------------------------------------------------------------------

Thanks,
Gaurav


Re: [gmx-users] Should I use separate PME nodes

2010-06-25  Carsten Kutzner
Hi Gaurav,

separate PME nodes usually pay off on a larger number of nodes
(more than 16). In rare cases, you will see a performance benefit on a small number
of nodes as well. Just try it! Or use g_tune_pme ... ;)
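For example, something along these lines (just a sketch: topol.tpr, the
mdrun_mpi binary name, the mpirun launcher and the output prefix are
placeholders for your own setup, and the available options can differ a
bit between GROMACS versions, so check mdrun -h and g_tune_pme -h):

  # by hand: dedicate 1 of the 4 nodes to PME and compare the
  # performance summary at the end of md.log with your current run
  mpirun -np 4 mdrun_mpi -s topol.tpr -npme 1 -deffnm pme1

  # or let g_tune_pme benchmark several numbers of PME-only nodes
  # (it calls mdrun repeatedly and reports the fastest setting;
  #  -launch then starts the production run with that setting)
  g_tune_pme -np 4 -s topol.tpr -launch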

Carsten


On Jun 25, 2010, at 3:32 PM, Gaurav Goel wrote:

 I ran my simulation in parallel on 4 nodes (with zero separate PME
 nodes). Below is the information printed in md.log.
 
 I see that the PME mesh calculations took 60% of the CPU time. Do you
 have any recommendations on whether one or more separate PME nodes
 would speed things up?
 
 
--
Dr. Carsten Kutzner
Max Planck Institute for Biophysical Chemistry
Theoretical and Computational Biophysics
Am Fassberg 11, 37077 Goettingen, Germany
Tel. +49-551-2012313, Fax: +49-551-2012302
http://www.mpibpc.mpg.de/home/grubmueller/ihp/ckutzne




-- 
gmx-users mailing list    gmx-users@gromacs.org
http://lists.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at http://www.gromacs.org/search before posting!
Please don't post (un)subscribe requests to the list. Use the 
www interface or send it to gmx-users-requ...@gromacs.org.
Can't post? Read http://www.gromacs.org/mailing_lists/users.php