Re: [gmx-users] GMX 2020 - COMM Removal Issue

2020-03-06 Thread Daniel Kozuch
might be that you were affected by this before. Please let us know if the issue shows up again. Cheers Paul On 06/03/2020 18:58, Daniel Kozuch wrote: > Additional (good) news, the problem appears to be resolved in the 2020.1 > update (at least for the membrane-only system). I'll c

Re: [gmx-users] GMX 2020 - COMM Removal Issue

2020-03-06 Thread Daniel Kozuch
On Behalf Of Justin Lemkul Sent: Friday, March 6, 2020 11:02 AM To: gmx-us...@gromacs.org Subject: Re: [gmx-users] GMX 2020 - COMM Removal Issue On 3/6/20 10:00 AM, Daniel Kozuch wrote: > [Somehow my response got put in a different thread - hopefully this > works] > > Justin, > >

Re: [gmx-users] GMX 2020 - COMM Removal Issue

2020-03-06 Thread Daniel Kozuch
] GMX 2020 - COMM Removal Issue On 3/6/20 10:00 AM, Daniel Kozuch wrote: > [Somehow my response got put in a different thread - hopefully this > works] > > Justin, > > Thanks for your reply. I agree that some COM motion is normal. > However, this was a very short simulation (

Re: [gmx-users] GMX 2020 - COMM Removal Issue

2020-03-06 Thread Daniel Kozuch
Sent: Tuesday, March 3, 2020 3:02 PM To: gmx-us...@gromacs.org Subject: Re: [gmx-users] GMX 2020 - COMM Removal Issue On 3/2/20 9:53 PM, Daniel Kozuch wrote: > Hello, > > I am experimenting with GROMACS 2020. I have compiled the mpi threaded > version and am using the new settings

Re: [gmx-users] GMX 2020 - COMM Removal Issue

2020-03-05 Thread Daniel Kozuch
don't see the same drift. Best, Dan On Tue, Mar 3, 2020 at 3:03 PM Justin Lemkul wrote: > > > On 3/2/20 9:53 PM, Daniel Kozuch wrote: > > Hello, > > > > I am experimenting with GROMACS 2020. I have compiled the mpi threaded > > version and am using

[gmx-users] GMX 2020 - COMM Removal Issue

2020-03-02 Thread Daniel Kozuch
Hello, I am experimenting with GROMACS 2020. I have compiled the mpi threaded version and am using the new settings (GMX_GPU_DD_COMMS, GMX_GPU_PME_PP_COMMS, GMX_FORCE_UPDATE_DEFAULT_GPU) as suggested at the following link: https://devblogs.nvidia.com/creating-faster-molecular-dynamics-simulatio
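For context, these 2020 features are switched on through environment variables rather than mdp options; a minimal launch sketch, assuming a thread-MPI build (the file name and thread counts below are illustrative, not from the post):

    # experimental GPU communication/update paths in GROMACS 2020
    export GMX_GPU_DD_COMMS=true
    export GMX_GPU_PME_PP_COMMS=true
    export GMX_FORCE_UPDATE_DEFAULT_GPU=true
    # offload nonbonded, PME, and bonded work to GPUs; 4 thread-MPI ranks, 8 OpenMP threads each
    gmx mdrun -deffnm sim -ntmpi 4 -ntomp 8 -nb gpu -pme gpu -npme 1 -bonded gpu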

Re: [gmx-users] Lambda Weights from Expanded Ensemble Code

2020-01-16 Thread Daniel Kozuch
n, but clearly they aren't. If you file a redmine > issue, I may be able to take a look, but it might take a while to address. > > On Wed, Jan 15, 2020 at 8:52 PM Daniel Kozuch > wrote: > > > Hello, > > > > I am interested in using simulated tempering in GROMAC

[gmx-users] Lambda Weights from Expanded Ensemble Code

2020-01-15 Thread Daniel Kozuch
Hello, I am interested in using simulated tempering in GROMACS (2019.5) under the expanded ensemble options. Is there a way to monitor the ensemble weights as the simulation progresses? I think in theory they are supposed to be printed out in the log file, but it is only printing 0, -nan, and inf:
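For context, simulated tempering runs through the expanded-ensemble machinery; a hedged mdp sketch of the relevant options (temperature range, spacing, and intervals are illustrative, not taken from the post):

    free-energy          = expanded
    simulated-tempering  = yes
    sim-temp-low         = 300        ; K, illustrative
    sim-temp-high        = 400        ; K, illustrative
    temperature-lambdas  = 0.0 0.25 0.5 0.75 1.0
    init-lambda-state    = 0
    nstexpanded          = 100        ; attempt a state move every 100 steps
    lmc-stats            = wang-landau
    lmc-move             = metropolis

With a Wang-Landau weight-update scheme the running weights are expected to appear in the log file, which is the output this thread is asking about.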

[gmx-users] Incompatible subsystems for REMD when running on multiple nodes

2019-11-17 Thread Daniel Kozuch
Hello, I am running GROMACS 2019.4 (with GPUs) using the following command on two nodes (each with 28 processors and 4 GPUs): srun -n 56 gmx mdrun -s sim -cpi sim -append no -deffnm sim -plumed plumed.dat -multidir $mydirs -replex 500 -ntomp 1 It starts fine, but when I restart I get the incomp

[gmx-users] OPLS AA/M pdb2gmx error with inter

2019-09-23 Thread Daniel Kozuch
I am having a problem similar to that mentioned in a previous thread ( https://www.mail-archive.com/gromacs.org_gmx-users@maillist.sys.kth.se/msg35369.html), but I could not find a solution from that discussion. I am using pdb2gmx with the flag -inter and the OPLS AA/M force field: > gmx pdb2gmx -
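For reference, the general shape of such a command is sketched below; the force-field directory name (oplsaam) and water model are assumptions, since OPLS-AA/M is distributed separately from GROMACS and the actual names depend on the local installation:

    # -inter prompts interactively for protonation states and termini
    gmx pdb2gmx -f protein.pdb -o processed.gro -p topol.top -ff oplsaam -water tip4p -inter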

Re: [gmx-users] tune_pme error with GROMACS 2018

2018-02-12 Thread Daniel Kozuch
at 8:32 AM, Kutzner, Carsten wrote: > Hi Dan, > > > On 11. Feb 2018, at 20:13, Daniel Kozuch wrote: > > > > Hello, > > > > I was recently trying to use the tune_pme tool with GROMACS 2018 with the > > following command: > > > > gmx tune_pme -n

[gmx-users] tune_pme error with GROMACS 2018

2018-02-11 Thread Daniel Kozuch
Hello, I was recently trying to use the tune_pme tool with GROMACS 2018 with the following command: gmx tune_pme -np 84 -s my_tpr.tpr -mdrun 'gmx mdrun' but I'm getting the following error: "Fatal error: Cannot execute mdrun. Please check benchtest.log for problems!" Unfortunately benchtest.lo
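For anyone hitting the same error: gmx tune_pme launches mdrun itself, so it needs a working parallel launcher; if I recall the tool's conventions correctly, these are picked up from the MPIRUN and MDRUN environment variables, and benchtest.log records the command that failed. A hedged sketch (binary names are assumptions):

    export MPIRUN=mpirun          # launcher used for the benchmark runs
    export MDRUN=mdrun_mpi        # MPI-enabled mdrun binary
    gmx tune_pme -np 84 -s my_tpr.tpr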

Re: [gmx-users] Gromacs 2018 and GPU PME

2018-02-09 Thread Daniel Kozuch
Szilárd, If I may jump in on this conversation, I am having the reverse problem (which I assume others may encounter also) where I am attempting a large REMD run (84 replicas) and I have access to, say, 12 GPUs and 84 CPUs. Basically I have fewer GPUs than simulations. Is there a logical approach to
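For the record, the usual pattern in this situation is to let several replicas share each GPU and let mdrun (or an explicit -gpu_id string) map ranks to devices; a hedged launch sketch with illustrative directory names and counts:

    # 84 replicas, one rank each; the ranks on a node share that node's GPUs
    mpirun -np 84 gmx_mpi mdrun -multidir rep_{00..83} -replex 1000 -ntomp 1 -pin on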

[gmx-users] Volume-Temperature Replica Exchange

2017-12-21 Thread Daniel Kozuch
Hello, I am performing constant pressure replica exchange across a phase transition, and as one might expect the associated change in volume is causing exchange issues and many of my replicas are not efficiently crossing the phase transition. I noticed some papers that claim volume-temperature re

[gmx-users] Dynamic Load Balancing Crash

2017-11-05 Thread Daniel Kozuch
Hello, I recently started experiencing an error with GROMACS 2016.3 during a replica exchange simulation with 80 replicas, 480 CPUs, and 40 GPUs: "Assertion failed: Condition: comm->cycl_n[ddCyclStep] > 0 When we turned on DLB, we should have measured cycles". The simulation then crashes. I turned o
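For completeness, dynamic load balancing can be forced off from the command line while the underlying issue is investigated; a minimal sketch (the file name is a placeholder):

    gmx mdrun -deffnm sim -dlb no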

[gmx-users] Free Energy Calculations with Position Restraints

2017-09-11 Thread Daniel Kozuch
Hello all, Is there a way to use the free energy code with position restraints (similar to the way that the free energy code interacts with the pull code)? From the manual all I can see that might be relevant is "restraint-lambdas" but that is apparently only for "dihedral restraints, and the pull
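For context, restraint-lambdas is its own lambda vector alongside free-energy = yes, and, as the quoted manual text says, it is tied to dihedral restraints and the pull code rather than plain position restraints; a hedged mdp sketch with illustrative values:

    free-energy        = yes
    init-lambda-state  = 0
    restraint-lambdas  = 0.0 0.25 0.5 0.75 1.0
    nstdhdl            = 100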

[gmx-users] Using the Pull Code to Restrain an Ice Layer

2017-09-10 Thread Daniel Kozuch
Hello, I am attempting to restrain an ice layer in a system with liquid water. I initially considered using position restraints, but it seems like GROMACS has a few quirks that make that difficult: you have to create a new .itp and define the crystal water as different from the liquid water, then

[gmx-users] Using the Pull Code to Restrain Ice

2017-09-10 Thread Daniel Kozuch
Hello, I am attempting to restrain an ice sheet in a system with liquid water. I initially considered using position restraints, but it seems like GROMACS has a few quirks that make that difficult: you have to create a new .itp and define the crystal water as different from the liquid water and th
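For anyone attempting the same thing, a zero-rate COM restraint through the pull code is a common alternative to position restraints; a hedged mdp sketch in which the group names, reference distance, and force constant are placeholders:

    pull                 = yes
    pull-ngroups         = 2
    pull-ncoords         = 1
    pull-group1-name     = Ice
    pull-group2-name     = Liquid
    pull-coord1-type     = umbrella
    pull-coord1-geometry = distance
    pull-coord1-groups   = 1 2
    pull-coord1-dim      = N N Y      ; restrain only along z, illustrative
    pull-coord1-init     = 2.5        ; nm, reference COM separation (placeholder)
    pull-coord1-rate     = 0          ; no pulling, restraint only
    pull-coord1-k        = 1000       ; kJ mol-1 nm-2 (placeholder)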

Re: [gmx-users] Pull Code: direction-periodic with 3 dimensions

2017-08-11 Thread Daniel Kozuch
Thanks for the quick reply, I was worried that was the case. Best, Dan On Fri, Aug 11, 2017 at 5:05 PM, Justin Lemkul wrote: > > > On 8/11/17 5:02 PM, Daniel Kozuch wrote: > > Hello, > > > > I am using a pull code to increase the end-to-end distance of a protein

[gmx-users] Pull Code: direction-periodic with 3 dimensions

2017-08-11 Thread Daniel Kozuch
Hello, I am using a pull code to increase the end-to-end distance of a protein (included below). I am using direction-periodic and would like the distance between the COM groups to be calculated in three dimensions. However, setting pull_coord1_dim = Y Y Y appears to have no effect and the distanc

[gmx-users] Umbrella Sampling with Direction-Periodic

2017-08-01 Thread Daniel Kozuch
Hello, I am using a pull code with geometry=direction-periodic and attempting to use gmx wham to construct the free energy. The pulling code is doing what I would like it to, but as might be expected from direction-periodic, when the pull distance is more than half the box length the distance is w

[gmx-users] Umbrella Sampling with Direction-Periodic Pulling

2017-07-22 Thread Daniel Kozuch
Hello, I am using a pull code with geometry=direction-periodic and attempting to use gmx wham to construct the free energy. I believe the pulling code is doing what I would like it to, but as might be expected from direction-periodic, when the pull distance is more than half the box length the dis
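For readers following up, the WHAM step referred to here takes lists of the pull tpr files and the pull force (or position) files; a minimal command sketch with placeholder file names:

    gmx wham -it tpr-files.dat -if pullf-files.dat -o profile.xvg -hist histo.xvg -unit kJ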

Re: [gmx-users] Non-periodic COM Pulling

2017-07-13 Thread Daniel Kozuch
7/12/17 3:05 PM, Daniel Kozuch wrote: > > Hello, > > > > Is it possible to do non-periodic COM pulling using the distance function > > in GMX 5.1.4 (i.e. where the distance between the two groups is calculated > > ignoring pbc)? > > > > No, but this is what

[gmx-users] Non-periodic COM Pulling

2017-07-12 Thread Daniel Kozuch
Hello, Is it possible to do non-periodic COM pulling using the distance function in GMX 5.1.4 (i.e. where the distance between the two groups is calculated ignoring pbc)? In the tutorials/online the solution seems to be to simply use a box twice the size of the largest pulling distance, but that w

[gmx-users] Legacy Test Failure with GMX 5.1.4

2017-06-23 Thread Daniel Kozuch
Hello, I am recompiling GROMACS on a new compute node and I am getting a unit test failure (shown below). I am compiling with GNU 4.8.5 and the following cmake commands: cmake .. -DCMAKE_INSTALL_PREFIX=[redacted] -DGMX_MPI=on -DGMX_GPU=off -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON -DGMX
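For context, after a cmake configuration like the one quoted, the unit and regression tests are normally run through the build system; a minimal sketch (the job count is illustrative):

    make -j 8
    make check      # builds and runs the unit tests plus the downloaded regression tests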

[gmx-users] GMX GPU Rest Time

2017-06-07 Thread Daniel Kozuch
Hello, I recently changed the number of CPUs I was pairing with each GPU and noticed a significant slowdown, more than I would have expected simply from the reduction in the number of CPUs. From the log file it appears that the GPU is resting for a large amount of time. Is there something I ca

Re: [gmx-users] em and nvt problem

2017-05-25 Thread Daniel Kozuch
It would be helpful if you included the output of the em run and the log file for the nvt run. Best, Dan On Thu, May 25, 2017 at 7:12 AM, Kashif wrote: > Hi > Whenever I tried to simulate one of my docked complexes, the energy > minimization step converged very fast and completed at 112 steps. And

Re: [gmx-users] Poor GPU Performance with GROMACS 5.1.4

2017-05-25 Thread Daniel Kozuch
Hi Marcelo, That sounds reasonable depending on your time-step and other factors, but I have not attempted to run with more than one job per GPU. Maybe Mark can comment more. Best, Dan On Thu, May 25, 2017 at 8:09 AM, Marcelo Depólo wrote: > Hi, > > > I had the same struggle benchmarking a sim

Re: [gmx-users] Poor GPU Performance with GROMACS 5.1.4

2017-05-24 Thread Daniel Kozuch
, May 24, 2017 at 9:48 PM, Daniel Kozuch wrote: > Szilárd, > > I think I must be misunderstanding your advice. If I remove the domain > decomposition and set pin on as suggested by Mark, using: > > gmx_gpu mdrun -deffnm my_tpr -dd 1 -pin on > > Then I get very poor perfo

Re: [gmx-users] Poor GPU Performance with GROMACS 5.1.4

2017-05-24 Thread Daniel Kozuch
Szilárd, I think I must be misunderstanding your advice. If I remove the domain decomposition and set pin on as suggested by Mark, using: gmx_gpu mdrun -deffnm my_tpr -dd 1 -pin on Then I get very poor performance and the following error: NOTE: Affinity setting for 6/6 threads failed. This can
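For anyone seeing the same affinity note when several jobs share a node, pinning usually needs an explicit offset so the jobs do not claim the same cores; a hedged sketch for two 6-thread jobs (core layout and file names are illustrative):

    gmx_gpu mdrun -deffnm job1 -dd 1 -ntomp 6 -pin on -pinoffset 0 -pinstride 1
    gmx_gpu mdrun -deffnm job2 -dd 1 -ntomp 6 -pin on -pinoffset 6 -pinstride 1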

Re: [gmx-users] Poor GPU Performance with GROMACS 5.1.4

2017-05-24 Thread Daniel Kozuch
toral Research Associate > University of Tennessee/Oak Ridge National Laboratory > Center for Molecular Biophysics > > > From: gromacs.org_gmx-users-boun...@maillist.sys.kth.se < > gromacs.org_gmx-users-boun...@maillist.sys.kth.se> on be

[gmx-users] Poor GPU Performance with GROMACS 5.1.4

2017-05-24 Thread Daniel Kozuch
Hello, I'm using GROMACS 5.1.4 on 8 CPUs and 1 GPU for a system of ~8000 atoms in a dodecahedron box, and I'm having trouble getting good performance out of the GPU. Specifically, it appears that there is significant performance loss to wait times ("Wait + Comm. F" and "Wait GPU nonlocal"). I have