Szilard
You are correct; I misspoke. It appears that the GPU is doing *something*,
but it would seem it is severely underperforming (to the point where it
is basically doing nothing).
the entire log, not just parts; in particular we want to see the
> header, perf table
>
xyz@Turing:~/Desktop/
Hi Suniba,
assuming that #5 is GROMACS, then #2-4 are the required dependencies to
have GROMACS running, #8 is parallel and #9 is Y.
All the other questions refer to the allocation size, and the best answer is
usually that bigger is better. In practice, the largest possible size of the
allocation d
Hi,
If mdrun claims it's using the GPU and you see no errors (and results
look reasonable), the GPU is likely being used. That nvidia-smi is not
showing it is admittedly strange. However, this truncated log output
does not confirm much; e.g., it does not show the performance table and
the final per
Hi,
I decided to test this for a system of sodium and chloride ions (based on
http://www.gromacs.org/@api/deki/files/94/=gromacs_nb.pdf ):
8
1NA Na1 0.131 0.116 0.051
2NA Na2 0.137 0.111 0.047
3NA
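As an aside, for anyone post-processing such coordinates: .gro atom lines are fixed-width, so splitting on whitespace is fragile. A minimal parser sketch (column layout assumed from the standard .gro format, not from this email's whitespace-collapsed paste):

```python
def parse_gro_atom_line(line):
    # Standard .gro fixed columns: residue number (5), residue name (5),
    # atom name (5), atom number (5), then x, y, z as %8.3f fields (nm).
    return {
        "resid": int(line[0:5]),
        "resname": line[5:10].strip(),
        "atomname": line[10:15].strip(),
        "x": float(line[20:28]),
        "y": float(line[28:36]),
        "z": float(line[36:44]),
    }
```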
Hi,
I am creating a topology for a molecule that is not defined by OPLS. I have
some questions that I want to ask:
1. What is bondtype and ptype in ffnonbonded.itp?
2. If I use the Buckingham potential to define nonbonded interactions
between atom A and atom B, does this potential apply to all A
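On question 2: the Buckingham form used by GROMACS is V(r) = A exp(-B r) - C/r^6, defined per pair of atom types. A minimal sketch of the functional form (the parameter values below are placeholders, not real force-field numbers):

```python
import math

def buckingham(r, A, B, C):
    # Buckingham potential: V(r) = A * exp(-B * r) - C / r**6
    # r in nm; A, B, C are pair-specific nonbonded parameters.
    return A * math.exp(-B * r) - C / r ** 6

# Placeholder parameters for an illustrative A-B atom-type pair
v = buckingham(0.3, A=1.0e5, B=30.0, C=1.0e-3)
```

As far as I understand, because the parameters are defined per atom-type pair, the same (A, B, C) set applies to every interaction between those two types unless excluded.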
I'm currently running a simple 1 ns calculation (500,000 steps with dt =
2 fs), with structures stored every 10 ps.
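Just to sanity-check those numbers:

```python
dt_fs = 2                                # integration timestep in fs
nsteps = 500_000
total_ns = nsteps * dt_fs / 1e6          # 1e6 fs per ns
frame_interval_ps = 10
steps_per_frame = frame_interval_ps * 1000 // dt_fs  # 1000 fs per ps
nframes = nsteps // steps_per_frame
# total_ns = 1.0, steps_per_frame = 5000, nframes = 100
```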
On Thu, Jul 14, 2016 at 11:31 AM, Smith, Micholas D.
wrote:
> Quick question, what is your integration stepsize?
>
> ===
> Micholas Dean Smith, PhD.
> Post-doct
Quick question, what is your integration stepsize?
===
Micholas Dean Smith, PhD.
Post-doctoral Research Associate
University of Tennessee/Oak Ridge National Laboratory
Center for Molecular Biophysics
From: gromacs.org_gmx-users-boun...@mai
Hello
After building GROMACS with GPU support and running some simple MD, I am
only getting ~20 ns/day (which is what I would expect from my i7-6700K
CPU alone). My machine currently has a GTX 1080 in it, which should be
getting ~100 ns/day even on moderately large systems.
I've tried laun
Hello Users and experts
We are going to create an account for a supercomputing facility. They have
asked us to fill in a form to initiate the process. I am confused about certain
terms, as I come from a biology background. Please help me with these specific
terms and what I should fill in to get good pe
On 7/14/16 12:40 PM, Mohsen Ramezanpour wrote:
Thanks Joao and Justin for your comments,
As Justin mentioned, I followed the original articles on this force field.
However, I was not sure if the same parameters can be transferred to
Gromacs or not.
The cutoffs are straightforward to get rig
A few cents on this.
On our cluster with well-configured OpenMPI 1.8.8, 1.10.2 and 1.10.3 (and
2.0.0 in testing),
GROMACS 5.1.2 runs smoothly with the above-mentioned OpenMPI versions using
the same PRACE benchmark case (CaseB).
Performance with these OpenMPI versions (for a 6-minute maxh, whatever
Any other comment on this? :-)
On Wed, Jul 13, 2016 at 9:50 AM, Mohsen Ramezanpour <
ramezanpour.moh...@gmail.com> wrote:
> Hi Mark,
>
> Thanks for your reply.
>
> The 6 around 1 setup is also periodic.
>
> I understand this. However, we can argue the same for 6-1 system as well.
> Right?
>
> If
Thanks Joao and Justin for your comments,
As Justin mentioned, I followed the original articles on this force field.
However, I was not sure if the same parameters can be transferred to
Gromacs or not.
The only thing different was the initial temperature of 50K and increasing
it to 310 K through
Hello again,
I've talked to my adviser and we agreed to install version 5.1. I'll take a
look in the documentation and try to work on it.
Murilo.
2016-07-13 10:19 GMT-06:00 MURILO GABARDO KRAMAR <
murilokra...@alunos.utfpr.edu.br>:
> Hello Mark,
>
> Thanks for your answer,
>
> My concern about u
Hi,
Any MPI-1.1 conformant implementation is great. The only time GROMACS has
problems is when the MPI library does... OpenMPI 1.8.6 leaks memory; we use
1.8.10 locally.
Mark
On Thu, Jul 14, 2016 at 12:45 PM Adam Huffman
wrote:
> Hi Mark,
>
> Hmm. Other applications have been working with Open
Hi Mark,
Hmm. Other applications have been working with OpenMPI, but that
doesn't invalidate your point.
Is there a particular implementation you recommend for GROMACS?
Cheers,
Adam
On Thu, Jul 14, 2016 at 11:00 AM, Mark Abraham wrote:
> Hi,
>
> Looks like your MPI setup is broken. So far, I
Thanks a lot, Florent. Your suggestion is most helpful.
On 14 July 2016 at 14:19, Florent Hédin wrote:
> Hi,
>
> check
>
> http://manual.gromacs.org/online/xtc.html
>
> and
>
> http://www.gromacs.org/Developer_Zone/Programming_Guide/XTC_Library
>
> They provide standalone library to compile, and e
On 7/14/16 4:37 AM, João Henriques wrote:
I was wondering if I should modify any terms because of using the Gromacs
package?
It's good practice to set your key parameters as close as possible to
theirs. I am not familiar with Desmond, but I assume many of its parameters
have similar counterparts
On 7/14/16 1:48 AM, Ali Mohyeddin wrote:
Dear all,
I am going to calculate the temperature of a single water molecule around a
protein by writing a simple script to compute the temperature from what the
"trjconv" command prints out:
mass vx vy vz
2SOL OW 16 0.2789 -0.4959 0.8134
2SOL HW1 1 1.09
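A sketch of such a script in GROMACS units (masses in amu, velocities in nm/ps, so that m*v^2 comes out directly in kJ/mol). The degrees-of-freedom count ndf is something you have to decide on yourself (3N without constraints, fewer with them), and the example particle below is hypothetical, not taken from the output above; note also that an instantaneous "temperature" of a single molecule fluctuates wildly:

```python
KB = 0.0083144621  # Boltzmann constant in kJ/(mol K), GROMACS units

def kinetic_temperature(masses, velocities, ndf):
    # T = 2 * Ekin / (ndf * kB), with Ekin = 0.5 * sum(m * |v|^2)
    ekin = 0.5 * sum(m * (vx * vx + vy * vy + vz * vz)
                     for m, (vx, vy, vz) in zip(masses, velocities))
    return 2.0 * ekin / (ndf * KB)

# Hypothetical single particle: mass 16 amu, velocity (1, 0, 0) nm/ps
t = kinetic_temperature([16.0], [(1.0, 0.0, 0.0)], ndf=3)
```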
On 7/14/16 6:27 AM, amitbe...@chemeng.iisc.ernet.in wrote:
Hello Users,
I was wondering how we can calculate the electron density with the help of
gmx density, and how to write the "-ei [<.dat>] (electrons.dat)" file.
There is an example in the help information.
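For what it's worth, my understanding of the electrons.dat format (from the gmx density help text; double-check against your version) is a count on the first line followed by "name = electrons" pairs, e.g. for SPC water:

```
2
OW = 8
HW = 1
```

The names must match the atom names in your system, and the counts are electrons per atom.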
-Justin
--
Hello Users,
I was wondering how we can calculate the electron density with the help of
gmx density, and how to write the "-ei [<.dat>] (electrons.dat)" file.
Regards,
Amit Behera
--
Hi,
Looks like your MPI setup is broken. So far, I have only heard of people
having problems when using OpenMPI 1.10.x, even the latest patch release.
Mark
On Thu, Jul 14, 2016 at 11:52 AM Adam Huffman
wrote:
> Hello
>
> I've been trying to run the PRACE benchmarks on a new cluster:
>
> http:/
Hello
I've been trying to run the PRACE benchmarks on a new cluster:
http://www.prace-ri.eu/ueabs/#GROMACS
and this is the command-line I've been running:
mpirun --mca plm_base_verbose 10 --debug-daemons gmx_mpi mdrun -s
ion_channel.tpr -maxh 0.50 -resethway -noconfout -nsteps 1 -g
logfile-
Hi,
check
http://manual.gromacs.org/online/xtc.html
and
http://www.gromacs.org/Developer_Zone/Programming_Guide/XTC_Library
They provide a standalone library to compile, and examples for C and
Fortran. But I have never tried it, and apparently it was last updated 2
years ago; I hope it can still re
Hi Mark,
Thanks for the elaborate answer.
If [pairs] are considered bonded interactions, it's perfectly clear.
Peter
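To make that concrete, a hypothetical topology fragment (atom indices invented for illustration): atoms 1 and 4 are excluded from regular nonbonded interactions, and a [ pairs ] entry restores a scaled 1-4 interaction, which is handled like a bonded-type term:

```
[ exclusions ]
;  ai   aj
    1    4

[ pairs ]
;  ai   aj  funct
    1    4    1
```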
On 13/07/16 17:24, Mark Abraham wrote:
> Hi,
>
> Maybe I missed the point earlier, but as e.g. 5.4.4 Exclusions section of
> PDF reference manual says:
>
> "Extra exclusions within
> I was wondering if I should modify any terms because of using the Gromacs
package?
It's good practice to set your key parameters as close as possible to
theirs. I am not familiar with Desmond, but I assume many of its parameters
have similar counterparts in Gromacs. If you plan on (significantly)
mo