I filed a Redmine issue on this topic:
http://redmine.gromacs.org/issues/1486
Thank you for noticing this, Andrea. Your suggested fix will work well for the
runs that I already have, and this Redmine post will hopefully lead to better
log-file output in the future.
Chris.
--
On 4/19/14, 11:06 AM, sujithkakkat . wrote:
Hello Justin,
I found that in the virtual sites section of your tutorial, the energy
values of the system are large and positive. You had warned in the topology file
that the choice of partial charges used is not guaranteed to
always work with OPLS
No
Dear Gromacs Developers,
Can any newer version of Gromacs do Grand Canonical Monte Carlo (GCMC)
simulations?
Thanks!
Thomas
--
Gromacs Users mailing list
* Please search the archive at
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!
* Can't post? Read http://www.gromac
Dear Gromacs Users,
I am simulating a polymer membrane with a box of water on each side.
The position of the polymer membrane shifts a lot even though I used
comm_mode = linear.
Is there a way to fix the center of mass of the polymer membrane?
Thanks a lot for the suggestions!
Thom
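One common way to address this is to remove center-of-mass motion for the membrane and the solvent separately via comm_grps. A minimal .mdp sketch, assuming index groups named Membrane and Water exist in your index file (the group names are assumptions):

```
; remove COM motion separately for the membrane and the water
comm_mode = Linear
comm_grps = Membrane Water
```

If the membrane still drifts, position restraints on a subset of the polymer atoms are another option.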
Hi,
From the web page
http://www.gromacs.org/About_Gromacs/Release_Notes/Versions_4.6.x ,
it looks like parallelization for g_hbond (via OpenMP) was added in version
4.6.x.
What is a typical command to run g_hbond in parallel? Since I mostly have
experience with Gromacs 4.5.5 (which for mdru
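Since OpenMP thread counts are normally controlled through the environment, a typical invocation might look like the sketch below. The file names and analysis flags are assumptions; check `g_hbond -h` on your installation:

```shell
# use 8 OpenMP threads for the hydrogen-bond analysis
export OMP_NUM_THREADS=8
# run only if g_hbond is on PATH (it will not be on a machine without GROMACS)
if command -v g_hbond >/dev/null 2>&1; then
    g_hbond -f traj.xtc -s topol.tpr -num hbnum.xvg
fi
```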
Dear gromacs users,
I have been using the gromacs package to perform MD simulations.
Recently I started using a cluster with 2 nodes of 64 cores each.
I am eager to know whether a job run on 32 cores for a few ns, and later
continued on 64 cores (or vice versa), will show any
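Continuing a run on a different core count is normally done through mdrun's checkpoint mechanism. A sketch, where the MPI binary name (mdrun_mpi), file names, and rank counts are assumptions for illustration:

```shell
NP_FIRST=32
NP_SECOND=64
if command -v mpirun >/dev/null 2>&1; then
    # first leg on 32 ranks; mdrun writes checkpoints (run.cpt) periodically
    mpirun -np "$NP_FIRST" mdrun_mpi -s topol.tpr -deffnm run
    # continue the same run on 64 ranks from the checkpoint,
    # appending to the existing output files
    mpirun -np "$NP_SECOND" mdrun_mpi -s topol.tpr -deffnm run -cpi run.cpt -append
fi
```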
Hello Justin,
I found that in the virtual sites section of your tutorial, the energy
values of the system are large and positive. You had warned in the topology file
that the choice of partial charges used is not guaranteed to
always work with OPLS-AA. However, the moment of inertia values poi
The mean of the three calculations is 4.35 and the standard error is 0.19. So this
seems statistically consistent. If you want to get more consistent
results, you'll need to run longer until the estimated statistical
uncertainty is lower. Also, the estimated uncertainty is often an
underestimate (BAR often
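For reference, the quoted mean and standard error follow from the usual formulas. The three values below are hypothetical, chosen only to reproduce the numbers in the thread (the actual per-run results are not given):

```python
import math
import statistics

# hypothetical three free-energy estimates, chosen to match mean 4.35, SE 0.19
results = [4.02, 4.35, 4.68]

mean = statistics.mean(results)
# standard error of the mean = sample standard deviation / sqrt(N)
sem = statistics.stdev(results) / math.sqrt(len(results))

print(round(mean, 2), round(sem, 2))  # 4.35 0.19
```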
On 4/19/14, 1:16 AM, sujithkakkat . wrote:
Hello Justin,
I tried the free energy calculations in your tutorial under different sets
of conditions. I also repeated a simulation three times under the same
conditions, with the same parameters, on the same computer and the same
number of processors.
The results in