Re: [gmx-users] Problem regarding Gromacs - GPU
Hi,

The benchmarks are infinite runs (nsteps = -1). Try adding the -maxh option to mdrun to limit the execution time.

Rossen

On 11/23/10 9:48 AM, kapil mathur wrote:
> Dear All,
>
> I have some queries regarding the benchmarking results of mdrun-gpu:
>
> 1. I ran mdrun-gpu with dhfr-impl-1nm.bench on a Tesla C1060. Can you give me details on how long it should take to execute? In my case it has been running for a very long time (4-5 hours).
> 2. The same thing happens with the MPI build of GROMACS.
>
> I am running it as follows:
>
> GPU:
> mdrun-gpu -device "OpenMM:platform=Cuda,memtest=15,deviceid=0,force-device=no" -s topol.tpr (dhfr-impl-1nm.bench)
>
> CPU (cluster):
> mpirun -np 4 mdrun -s topol.tpr (dhfr-impl-1nm.bench)
>
> Thanking you in advance
>
> --
> Kapil Mathur
> HPC Solutions Group
> C-DAC, Pune
> Phone: +91-20-25704309

--
gmx-users mailing list    gmx-users@gromacs.org
http://lists.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
Please don't post (un)subscribe requests to the list. Use the www interface or send it to gmx-users-requ...@gromacs.org.
Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
Re: [gmx-users] Problem regarding Gromacs - GPU
Hi,

If you take a look at the mdp file, it becomes obvious that the simulation length is infinite:

nsteps = -1

This is useful for a benchmarking setup. If you want to run, e.g., a ~10 min case, you'd use the "-maxh 0.167" mdrun option.

Cheers,
--
Szilárd

On Tue, Nov 23, 2010 at 9:48 AM, kapil mathur wrote:
> Dear All,
>
> I have some queries regarding the benchmarking results of mdrun-gpu:
>
> 1. I ran mdrun-gpu with dhfr-impl-1nm.bench on a Tesla C1060. Can you give me details on how long it should take to execute? In my case it has been running for a very long time (4-5 hours).
> 2. The same thing happens with the MPI build of GROMACS.
>
> I am running it as follows:
>
> GPU:
> mdrun-gpu -device "OpenMM:platform=Cuda,memtest=15,deviceid=0,force-device=no" -s topol.tpr (dhfr-impl-1nm.bench)
>
> CPU (cluster):
> mpirun -np 4 mdrun -s topol.tpr (dhfr-impl-1nm.bench)
>
> Thanking you in advance
>
> --
> Kapil Mathur
> HPC Solutions Group
> C-DAC, Pune
> Phone: +91-20-25704309
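[Editor's note] To illustrate the advice above: -maxh takes a wall-clock budget in hours, so a 10-minute benchmark corresponds to 10/60 ≈ 0.167. The sketch below computes that value and shows where it would go on the command line from the thread; the MINUTES variable and the awk conversion are illustrative additions, not part of the original commands.

```shell
# Convert a desired wall-clock budget in minutes to the hours value -maxh expects.
MINUTES=10
MAXH=$(awk -v m="$MINUTES" 'BEGIN { printf "%.3f", m/60 }')
echo "$MAXH"   # 10 minutes -> 0.167 hours

# Capped benchmark run, using the device string from the original post
# (shown as a comment; requires a GROMACS build with mdrun-gpu):
# mdrun-gpu -device "OpenMM:platform=Cuda,memtest=15,deviceid=0,force-device=no" \
#           -s topol.tpr -maxh "$MAXH"
```

With nsteps = -1 in the mdp file, the run only terminates when the -maxh limit is reached, so the cap is what makes the benchmark finite.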
[gmx-users] Problem regarding Gromacs - GPU
Dear All,

I have some queries regarding the benchmarking results of mdrun-gpu:

1. I ran mdrun-gpu with dhfr-impl-1nm.bench on a Tesla C1060. Can you give me details on how long it should take to execute? In my case it has been running for a very long time (4-5 hours).
2. The same thing happens with the MPI build of GROMACS.

I am running it as follows:

GPU:
mdrun-gpu -device "OpenMM:platform=Cuda,memtest=15,deviceid=0,force-device=no" -s topol.tpr (dhfr-impl-1nm.bench)

CPU (cluster):
mpirun -np 4 mdrun -s topol.tpr (dhfr-impl-1nm.bench)

Thanking you in advance

--
Kapil Mathur
HPC Solutions Group
C-DAC, Pune
Phone: +91-20-25704309