[gmx-users] HPC performance in Gromacs?

2017-09-06 Thread Li, J.
Thanks, Páll,

I thought more threads would make the simulation faster. Is that not the case
for Gromacs?

I have tried changing the values. The log file is provided here:
https://drive.google.com/a/rug.nl/file/d/0B-VoMAn-UrjYeko4d1VwUURCb00/view?usp=sharing

The command line is "gmx_mpi_d mdrun -s PE_pro_50s_npt5.tpr -v -deffnm
PE_pro_50s_npt5 -ntomp 1 -npme 18 -ntomp_pme 1 -nsteps 500 -pin on -dlb yes
-notunepme".

The performance is slower than before, and the imbalance is still high. I
don't know what this means.
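
In case it is useful, this is the kind of -npme scan I could try next (a
rough sketch only; the total rank count of 72 is a placeholder for whatever
my allocation actually provides):

    # benchmark a few PP/PME rank splits on short runs;
    # -resethway discards the start-up half from the timings
    for npme in 12 18 24; do
        mpirun -np 72 gmx_mpi_d mdrun -s PE_pro_50s_npt5.tpr \
            -deffnm bench_npme${npme} -npme ${npme} \
            -nsteps 2000 -resethway -notunepme
    done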

Thanks in advance.
Best,
Jing




-- 
Jing Li
PhD student
FMNS, University of Groningen
Room 5115.0309, Nijenborg 4, Groningen
The Netherlands


[gmx-users] HPC performance in Gromacs?

2017-09-05 Thread Li, J.
Hello Nikhil,

Thanks for your suggestions. Here are the links to the two log files: the
first is for the normal settings, the second for the tuned settings.

https://drive.google.com/a/rug.nl/file/d/0B-VoMAn-UrjYbEY2dUFvS3pLUWs/view?usp=sharing
https://drive.google.com/a/rug.nl/file/d/0B-VoMAn-UrjYcjctSHBOckRKeEU/view?usp=sharing

Jing

-- 
Jing Li
PhD student
FMNS, University of Groningen
Room 5115.0309, Nijenborg 4, Groningen
The Netherlands


[gmx-users] HPC performance in Gromacs?

2017-09-05 Thread Li, J.
Hello all,

I have the report below, but I have no clear idea what I should do. If I
change some default values via -dd/-dds/..., what should I expect, and what
should I be careful about when tuning these parameters?

The report says the load imbalance comes from the system being inhomogeneous,
but I am not clear what inhomogeneous means here. My system has two kinds of
polymer chains and no vacuum gap anywhere. Why do the default values not suit
my system?

And I have chosen yes for the -dlb option. If I do not, the cut-off distance
and the PME grid are changed for load balancing; as a consequence, the
simulation itself is changed, right?
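
For clarity, these are the two switches as I understand them (a sketch;
corrections welcome):

    # switch 1: dynamic load balancing, which only resizes the domain
    # decomposition cells (no change to the physics)
    gmx_mpi_d mdrun ... -dlb yes
    # switch 2: PP/PME tuning, which scales rcoulomb and the PME grid
    # together at roughly constant accuracy; -notunepme keeps the
    # grompp settings fixed
    gmx_mpi_d mdrun ... -notunepme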



P P - P M E L O A D B A L A N C I N G

NOTE: The PP/PME load balancing was limited by the domain decomposition,
you might not have reached a good load balance.
Try different mdrun -dd settings or lower the -dds value.

PP/PME load balancing changed the cut-off and PME settings:
           particle-particle                    PME
            rcoulomb  rlist            grid      spacing   1/beta
   initial  1.300 nm  1.300 nm     144 144 144   0.139 nm  0.416 nm
   final    1.300 nm  1.300 nm     144 144 144   0.139 nm  0.416 nm
 cost-ratio           1.00                       1.00
(note that these numbers concern only part of the total PP and PME load)


D O M A I N D E C O M P O S I T I O N S T A T I S T I C S

av. #atoms communicated per step for force: 2 x 38901.0

Average load imbalance: 57.1 %
Part of the total run time spent waiting due to load imbalance: 3.2 %
Steps where the load balancing was limited by -rdd, -rcon and/or -dds: X 2 % Y 0 %
Average PME mesh/force load: 9.597
Part of the total run time spent waiting due to PP/PME imbalance: 58.9 %

NOTE: 58.9 % performance was lost because the PME ranks
had more work to do than the PP ranks.
You might want to increase the number of PME ranks
or increase the cut-off and the grid spacing.
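
If I read this note correctly, the two remedies would look roughly like the
following (a sketch only; the -npme value and the mdp settings are
placeholders, not tested numbers). Is that the right direction?

    # option 1: dedicate more ranks to PME
    mpirun -np 72 gmx_mpi_d mdrun -s PE_pro_50s_npt5.tpr -npme 24

    # option 2: coarsen the PME grid and enlarge the cut-off to match,
    # by editing the mdp file and re-running grompp, e.g.
    #   fourierspacing = 0.16   ; larger spacing -> fewer PME grid points
    #   rcoulomb       = 1.4    ; correspondingly larger cut-off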
-- 
Jing Li
PhD student
FMNS, University of Groningen
Room 5115.0309, Nijenborg 4, Groningen
The Netherlands


Re: [gmx-users] gromacs.org_gmx-users Digest, Vol 156, Issue 48

2017-04-13 Thread Li, J.
Hello Justin, you were correct about the simulation box. I calculated the box
again and found that the volume should be smaller, just as you suggested. Now
it works.

But a question came up when I tried to work out how the memory was allocated
before I changed the box size. According to the code, the memory should be
fourier-nx * fourier-ny * fourier-nz * 4 bytes (for the float type). However,
the number of allocated elements reported in the error (3,425,307,227)
differs from my own calculation (110,592,000,000). The difference is quite
big. Do you know why?
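
For reference, my arithmetic was the following (a sketch; the 4800 grid
points per dimension are my assumption, from the 768 nm box divided by a
0.16 nm fourierspacing):

    # expected PME grid elements: fourier-nx * fourier-ny * fourier-nz
    echo $((4800 * 4800 * 4800))       # 110592000000 elements
    # at 4 bytes per float that would be about 442 GB
    echo $((4800 * 4800 * 4800 * 4))   # 442368000000 bytes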

By the way, changing fourierspacing changes the number of grid points used in
PME. But how does pme-order work?

Thanks.

Jing

[gmx-users] Error during nvt: Not enough memory. Failed to allocate 3425307227 aligned elements of size 4 for grid->grid. What is happening

2017-04-10 Thread Li, J.
This error appeared when I tried to run an NVT simulation. The simulation box
is 768.0 x 768.0 x 768.0 nm**3, in which there are 9 atoms. The force field
is OPLS-AA. I have attached the md parameter file for this simulation. I hope
someone can help me find the problem. Thanks in advance.

Jing