Sorry for the previous mistake. The force field I used was 53a6, not 53a7.
On Fri, Nov 8, 2013 at 12:10 AM, Justin Lemkul [via GROMACS] wrote:
>
>
> On 11/7/13 12:14 PM, pratibha wrote:
> > My protein contains metal ions which are parameteri
Hello users
For a rerun of an MD simulation done with PME, but this time using
reaction-field-zero, should I eliminate the ions that were previously added
to neutralize the system for the PME run? What do you think about using a
twin-range cut-off instead of reaction-field-zero? I need to obtain
pre
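For context, the kind of rerun I have in mind is a new .tpr with only the
electrostatics changed, applied to the existing trajectory; a minimal
sketch, with placeholder file names and illustrative cut-offs (the manual's
requirements for Reaction-Field-zero, e.g. rlist > rcoulomb, should be
checked):
# write an .mdp that swaps PME for reaction-field-zero electrostatics
cat > rerun.mdp <<EOF
coulombtype = Reaction-Field-zero
epsilon_rf  = 0
rcoulomb    = 1.4
rlist       = 1.6
EOF
# build a new .tpr and reprocess the existing PME trajectory
grompp -f rerun.mdp -c conf.gro -p topol.top -o rerun.tpr
mdrun -s rerun.tpr -rerun traj.xtc -e rerun.edr -g rerun.log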
Greetings again
If I use a salt concentration to neutralize the protein-ligand complex and
run MD using PME, and the ligand is neutral, should I perform the ligand MD
simulation without adding any salt? It could be relevant for the LIE free
energy calculation if I don't include salt in the ligand
Greetings
The discussion list has already helped me understand what to do when
calculating the binding free energy using LIE after an MD simulation with
PME. Now I need your help with choosing the simulation box size for the
ligand and the complex. I used -d 1.0 in editconf for the complex simula
Dear Kieu Thu
Thanks for your comment about free energy. Unfortunately, I could not send
an email to Paissoni Cristina through the GROMACS forum.
Could you give me the email address of Paissoni Cristina? Finding a tool for
calculating MM/PBSA with GROMACS is vital for me.
Best Regards
Kiana
Dear Riccardo Concu,
The reason is that I was running the make_ndx file from the working
directory. I have now solved the problem by reinstalling GROMACS as follows:
cmake .. -DGMX_BUILD_OWN_FFTW=Off
Anyway, I appreciate your comment.
Best regards,
Changwoon Jang
On Friday, November 8, 2013
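(A minimal sketch of the usual fix, assuming the default install prefix:
sourcing GMXRC puts the installed tools on the PATH, so make_ndx runs from
any directory.)
# make the installed GROMACS tools visible in this shell
source /usr/local/gromacs/bin/GMXRC
make_ndx -f conf.gro -o index.ndx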
The only open-source Monte Carlo package I have ever used or know of is
MCCS Towhee:
http://towhee.sourceforge.net/
Andrew
Hi,
I have been using Gromacs for my MD liquid simulations for about 2 years
now, and, of course, it has been working great!
Now, in the current part of my project, I am looking at liquid structure,
not dynamics. Of course, one can analyze liquid structure by simply
analyzing an MD trajectory.
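For example, one standard structural analysis is a radial distribution
function computed straight from the trajectory; a minimal sketch with
placeholder file names:
# g(r) from an existing trajectory; add -n index.ndx to pick custom groups
g_rdf -f traj.xtc -s topol.tpr -o rdf.xvg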
OK, thanks.
Please open a new issue at redmine.gromacs.org, describe your observations
as above, and upload a tarball of your input files.
Mark
On Fri, Nov 8, 2013 at 2:14 PM, Qin Qiao wrote:
> On Fri, Nov 8, 2013 at 7:18 PM, Mark Abraham wrote:
>
> > Hi,
> >
> > That shouldn't happen if yo
On 11/8/13 7:44 AM, vidhya sankar wrote:
Dear Justin, thank you for your previous reply.
I am trying to install GROMACS 4.6.2 on a cluster running CentOS.
With the following command I got the following error:
cmake .. -DGMX_BUILD_OWN_FFTW=ON -DGMX_MPI=ON -DGMX_DOUBLE=ON
CMake Warning at
On 11/7/13 11:32 PM, Rajat Desikan wrote:
Dear All,
The settings that I mentioned above are from Klauda et al., for a POPE
membrane system. They can be found in charmm_npt.mdp in lipidbook (link
below)
http://lipidbook.bioch.ox.ac.uk/package/show/id/48.html
Is there any reason not to use their
On Fri, Nov 8, 2013 at 7:18 PM, Mark Abraham wrote:
> Hi,
>
> That shouldn't happen if your MPI library is working (have you tested it
> with other programs?) and configured properly. It's possible this is a
> known bug, so please let us know if you can reproduce it in the latest
> releases.
>
> M
Dear Justin, thank you for your previous reply.
I am trying to install GROMACS 4.6.2 on a cluster running CentOS.
With the following command I got the following error:
cmake .. -DGMX_BUILD_OWN_FFTW=ON -DGMX_MPI=ON -DGMX_DOUBLE=ON
CMake Warning at CMakeLists.txt:
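(For comparison, a complete out-of-source configure and build for that
setup would look roughly like this; the install prefix is only an
assumption:)
# configure, build, and install a double-precision MPI build
mkdir build && cd build
cmake .. -DGMX_BUILD_OWN_FFTW=ON -DGMX_MPI=ON -DGMX_DOUBLE=ON \
         -DCMAKE_INSTALL_PREFIX=/opt/gromacs-4.6.2
make -j4
make install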
Ok, you convinced me. I'll really give it a try. I'm going into this with
the supposition that the OpenMP implementation of the Intel compilers is
not as good as the GNU compilers'. I already tested Intel OpenMP on our
cluster, and it just sucked in comparison to the pure MPI compilation.
Let's hope I can make
On Fri, Nov 8, 2013 at 12:02 AM, Jones de Andrade wrote:
> Really?
Of course. With OpenMP, mdrun gets to use all your cores for
PME+bondeds+stuff while the GPU does PP. Any version without OpenMP gets to
use one core per domain, which is bad.
> And what about gcc+MPI? Should I expect any improvement?
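To make the comparison concrete, the two launch styles look roughly like
this in 4.6 (thread and rank counts are illustrative):
# hybrid: one thread-MPI rank, 8 OpenMP threads; the CPU cores share
# PME+bondeds while the GPU handles the short-range nonbondeds
mdrun -ntmpi 1 -ntomp 8 -deffnm md
# MPI-only: one core per domain
mpirun -np 8 mdrun_mpi -deffnm md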
Hi,
That shouldn't happen if your MPI library is working (have you tested it
with other programs?) and configured properly. It's possible this is a
known bug, so please let us know if you can reproduce it in the latest
releases.
Mark
On Fri, Nov 8, 2013 at 6:55 AM, Qin Qiao wrote:
> Dear all,
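(A quick way to test the MPI library independently of GROMACS, for
example:)
# trivial sanity check: the launcher and library can start ranks at all
mpirun -np 2 hostname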
Hello again,
That depends on the peptide. There is no general answer. I am starting with
linear conformations, but that's because I'm working with intrinsically
disordered proteins. That's as far as I can go in telling you about what
I'm doing; I'm not at liberty to discuss these things,
Should I start with helical peptides and see if they maintain the helicity,
or can I start with a random coil?
Do random-coil peptides take a long simulation time to form helices?
Any help on this will be appreciated.
On Tue, Nov 5, 2013 at 12:25 AM, Archana Sonawani-Jagtap <
ask.arch...@gma
Hi, this feature is really, really experimental and should indeed be
avoided. If it's OPLS you want, then give a try to
http://www.aribeiro.net.br/mktop/
Alan
On 8 November 2013 04:42, aditya sarma wrote:
> Hi,
> I was trying to generate a topology for a p-phenylene vinylene polymer for the OPLS
> forcefi
Dear Changwoon,
why are you using ./make_ndx instead of make_ndx? Did you try to use the
command without ./?
Regards
On Thu, 07-11-2013 at 14:12 -0800, Chang Woon Jang wrote:
> Dear Users,
>
> I am using openSUSE 12.3 and trying to use make_ndx and g_angle. When I try
> the following com