Dear all, I just got new hardware and ran a couple of tests comparing the performance of the new machine to results from another, five-year-old computer. I found the outcome (see below) somewhat disappointing, and am writing to see whether other people have had similar results, or whether I have perhaps overlooked something ...
The test: a system of polymers, ~18000 atoms, MD at ambient conditions; the mdp file is included below. Both computers run Linux, the old one Ubuntu 12.04 (CUDA 5.0), the new one Debian testing (CUDA 5.5); both use GROMACS 5.0.

Old machine:
CPU: Intel(R) Core(TM) i7 CPU 960 @ 3.20GHz, 8 cores
GPU: GeForce GTX 460
Performance: 31.717 ns/day

New machine:
CPU: Intel(R) Core(TM) i7-4930K CPU @ 3.40GHz, 12 cores
GPU: GeForce GTX 780
Performance: 64.072 ns/day

The load balance (with or without dynamic load balancing) is worse on the new machine, as its GPU is considerably faster than its CPU, but this still only accounts for something like 20-30% of the overall speed difference. So the new machine appears to be only about twice as fast as the old one ... is this to be expected? Or is there anything I can do to improve its performance?

Thanks for any hints/ideas!
Michael

mdp file:

integrator       = md
dt               = 0.002
nsteps           = 10000
comm-grps        = System
;
nstxout          = 1000
nstvout          = 0
nstfout          = 0
nstlog           = 1000
nstenergy        = 1000
;
nstlist          = 20
ns_type          = grid
pbc              = xyz
rlist            = 1.1
cutoff-scheme    = Verlet
;
coulombtype      = PME
rcoulomb         = 0.9
vdw_type         = cut-off
rvdw             = 0.9
DispCorr         = EnerPres
;
tcoupl           = Berendsen
tc-grps          = System
tau_t            = 0.2
ref_t            = 298.0
;
gen-vel          = yes
gen-temp         = 240.0
gen-seed         = -1
continuation     = no
;
Pcoupl           = berendsen
Pcoupltype       = isotropic
tau_p            = 0.5
compressibility  = 1.0e-5
ref_p            = 1.0
;
constraints      = hbonds

-- 
Gromacs Users mailing list

* Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a mail to gmx-users-requ...@gromacs.org.
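[Editor's note] As a quick sanity check on the quoted benchmark figures, the ratio of the two ns/day numbers does come out to almost exactly a factor of two, matching the "about twice as fast" impression:

```python
# Ratio of the two quoted throughput figures from the post.
old_ns_per_day = 31.717  # i7 960 + GTX 460
new_ns_per_day = 64.072  # i7-4930K + GTX 780

speedup = new_ns_per_day / old_ns_per_day
print(f"speedup: {speedup:.2f}x")  # prints "speedup: 2.02x"
```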
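[Editor's note] When the GPU outruns the CPU like this, a few mdrun-level settings are worth benchmarking. The sketch below uses GROMACS 5.0 command-line syntax; `topol.tpr` is a placeholder name, and the thread counts and `nstlist` values shown are examples to scan, not known-good settings for this system:

```shell
# Baseline: offload short-range nonbondeds to the GPU, let mdrun pick
# thread counts. -resethway and -nsteps give cleaner benchmark timings.
gmx mdrun -s topol.tpr -nb gpu -resethway -nsteps 20000

# A larger nstlist reduces pair-search frequency; with the Verlet
# scheme mdrun adjusts the buffered list radius accordingly.
gmx mdrun -s topol.tpr -nb gpu -nstlist 40 -resethway -nsteps 20000

# Pin threads and try an explicit single-rank, 12-thread layout.
gmx mdrun -s topol.tpr -nb gpu -ntmpi 1 -ntomp 12 -pin on -resethway -nsteps 20000
```

Comparing the "ns/day" and load-imbalance lines of the resulting log files should show which split helps, if any.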