[gmx-users] Re: topology does not match

2010-03-02 Thread Justin A. Lemkul


Please keep all Gromacs-related correspondence on the gmx-users list.  I am not 
a private help service.  Your question has been asked and answered dozens, if 
not hundreds, of times and you should be able to find a solution in the list 
archive or on the Errors page:


http://www.gromacs.org/Documentation/Errors#Number_of_coordinates_in_coordinate_file_does_not_match_topology
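
For instance, a quick way to compare the two counts yourself (file names taken from the error message below; the grep pattern assumes a standard PDB):

grep -c '^ATOM\|^HETATM' trp.pdb          # atoms in the coordinate file
grep -A20 '\[ molecules \]' topology.top  # molecule counts declared in the topology

If the totals disagree, the usual culprits are solvent or ions added to one file but not the other.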

-Justin

lena farnandis wrote:

Dear sir,

I suffer from the same problem when running a drug-enzyme MD. How can I solve it?

The GROMACS error is as follows:

Fatal error:
number of coordinates in coordinate file (trp.pdb, 169092)
does not match topology (topology.top, 169071)
 


--


Justin A. Lemkul
Ph.D. Candidate
ICTAS Doctoral Scholar
MILES-IGERT Trainee
Department of Biochemistry
Virginia Tech
Blacksburg, VA
jalemkul[at]vt.edu | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin




[gmx-users] parallel

2010-03-02 Thread Gavin Melaugh
Hi gmx-users


I have been using GROMACS for a few months now and have run several
simulations in serial, which has been quite good. I am now trying to run
the simulations in parallel but seem to be having one or two little
problems. The computation does not run on more than two CPUs; it runs
using one and two CPUs. Does anyone have any idea why this might be?

Cheers

Gavin


Re: [gmx-users] parallel

2010-03-02 Thread Mark Abraham

On 2/03/2010 11:27 PM, Gavin Melaugh wrote:



No. "Does not run" doesn't have any useful diagnostic value for us, I'm
afraid.
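
For reference, a typical GROMACS 4.0 parallel launch looks something like this (a minimal sketch, assuming an MPI-enabled build, often installed as mdrun_mpi; file names are placeholders):

grompp -f md.mdp -c conf.gro -p topol.top -o topol.tpr
mpirun -np 8 mdrun_mpi -s topol.tpr -deffnm run

The actual mdrun log and stderr output for the failing case is what would let anyone diagnose further.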


Mark


[gmx-users] g_nmeig_d error: can't allocate region

2010-03-02 Thread sarbani chattopadhyay
Hi,
I am trying to do normal mode analysis on a protein having 6398 atoms in
vacuum.

I tried to energy minimize the structure using steepest descent, followed by
l-bfgs minimization. The .mdp file I used is:

define   = -DFLEXIBLE
constraints  = none
integrator   = l-bfgs
tinit= 0
nsteps   = 15000
nbfgscorr= 50
emtol= .001
emstep   = 0.1
gen_vel  = yes
gen-temp = 300
nstcomm  =  1
; NEIGHBORSEARCHING PARAMETERS
; nblist update frequency
nstlist  = 0
; ns algorithm (simple or grid)
ns-type  = simple
; Periodic boundary conditions: xyz (default), no (vacuum)
; or full (infinite systems only)
pbc  = no
; nblist cut-off
rlist= 0
domain-decomposition = no
; OPTIONS FOR ELECTROSTATICS AND VDW
; Method for doing electrostatics
coulombtype  = Cut-Off
rcoulomb-switch  = 0
rcoulomb = 0
; Dielectric constant (DC) for cut-off or DC of reaction field
epsilon-r= 1
; Method for doing Van der Waals
vdw-type = Cut-off
; cut-off lengths
rvdw-switch  = 0
rvdw = 0


After running the 15000 steps, the output was:
Low-Memory BFGS Minimizer converged to Fmax < 0.001 in 10197 steps
Potential Energy  = -8.26391832320506e+04
Maximum force =  9.37558560558845e-04 on atom 4562
Norm of force =  2.24887722104890e-04


The l-bfgs minimization was then run again using the same .mdp file (with
emtol = 0.01).

The output was:
Low-Memory BFGS Minimizer converged to Fmax < 1e-06 in 4143 steps
Potential Energy  = -8.26391832324998e+04
Maximum force =  9.67927896882578e-07 on atom 3271
Norm of force =  1.70637151528245e-07



After this I prepared the nm.mdp file for NMA, using exactly the same
parameters as in the l-bfgs energy minimization (with integrator = nm).

The commands used were:
grompp_d -f new_nm.mdp -t new_lbfgs_2.trr -c new_lbfgs_2.gro -o new_nm.tpr -zero -p ../topol.top

nohup mdrun_d -v -s new_nm.tpr -deffnm new_nm -mtx new_nm.mtx 


nohup.out had the following messages:

Non-cutoff electrostatics used, forcing full Hessian format.
Allocating Hessian memory...
starting normal mode calculation 'Protein'
6398 steps.
Maximum force:  9.67928e-07


The run ended successfully.

Then I used the command:
g_nmeig_d -f new_nm.mtx -s new_nm.tpr -ol eigenvalue.xvg -v eigenvector.trr

and got the following error:
Reading file new_nm.tpr, VERSION 4.0.7 (double precision)
Reading file new_nm.tpr, VERSION 4.0.7 (double precision)
Reading double precision matrix generated by Gromacs VERSION 4.0.7
Full matrix storage format, nrow=19194, ncols=19194

Diagonalizing to find vectors 1 through 50...
g_nmeig_d(1892) malloc: *** mmap(size=18446744072353271808) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug

I cannot understand the problem; the computer has 16 GB of memory.



If I use different parameters in the nm.mdp file, such as:

rlist= 1.5
domain-decomposition = no
; OPTIONS FOR ELECTROSTATICS AND VDW
; Method for doing electrostatics
coulombtype  = switch
rcoulomb-switch  = 1
rcoulomb = 1.2
; Dielectric constant (DC) for cut-off or DC of reaction field
epsilon-r= 1
; Method for doing Van der Waals
vdw-type = switch
; cut-off lengths
rvdw-switch  = 1
rvdw = 1.2


then I get the message:

Maximum force: 3.14171e+03
Maximum force probably not small enough to ensure that you are in an
energy well. Be aware that negative eigenvalues may occur when the
resulting matrix is diagonalized.


I am sorry to post such a lengthy query, but I have no clue about the root
of the problem. Any suggestion will be of great help.
Thanks in advance,
Sarbani.

RE: [gmx-users] g_nmeig_d error: can't allocate region

2010-03-02 Thread Berk Hess

Hi,

The allocation that causes the error should allocate about 3 GB.
My guess is that you compiled Gromacs in 32bit mode, whereas 64bit mode
is required for allocation of more than 2 GB.
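
For reference, the numbers in the log are consistent with that: the Hessian is 3N x 3N for N = 6398 atoms, so nrow = ncols = 3 * 6398 = 19194, and a full double-precision matrix needs

19194 * 19194 * 8 bytes = 2,947,277,088 bytes (about 2.9 GB),

more than a 32-bit process can allocate in a single region.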

Berk


[gmx-users] Re: Minimizing a structure having a heteroatom

2010-03-02 Thread bharat gupta
Hi all

I have modeled a structure with the active-site residue as a heteroatom
(formylglycine), and when I minimize the structure GROMACS gives an error
that it cannot minimize it. Can anybody tell me how I can minimize it?

Thanks

-- 
Bharat
M.Sc. Bioinformatics (Final year)
Centre for Bioinformatics
Pondicherry University
Puducherry
India
Mob. +919962670525


Re: [gmx-users] Re: Minimizing a structure having a heteroatom

2010-03-02 Thread Mark Abraham

On 3/03/2010 10:13 AM, bharat gupta wrote:



No, because you haven't told us any useful diagnostics.

Mark


Re: [gmx-users] gromacs memory usage

2010-03-02 Thread Amit Choubey

 Hi Mark,

I quoted the memory usage requirements from a presentation by Berk Hess;
following is the link to it:

http://www.csc.fi/english/research/sciences/chemistry/courses/cg-2009/berk_csc.pdf

In that presentation, on pages 27-28, Berk does talk about memory usage,
but I am not sure whether he was referring to something more specific.

My system only contains SPC water. I want Berendsen T-coupling and Coulomb
interaction with reaction field.

I just want a rough estimate of how big a water system can be simulated on
our supercomputers.

 Thank you,

 Amit


On Fri, Feb 26, 2010 at 3:56 PM, Mark Abraham mark.abra...@anu.edu.au wrote:

 - Original Message -
 From: Amit Choubey kgp.a...@gmail.com
 Date: Saturday, February 27, 2010 10:17
 Subject: Re: [gmx-users] gromacs memory usage
 To: Discussion list for GROMACS users gmx-users@gromacs.org

  Hi Mark,
  We have few nodes with 64 GB memory and many other with 16 GB of memory.
 I am attempting a simulation of around 100 M atoms.

 Well, try some smaller systems and work upwards to see if you have a limit
 in practice. 50K atoms can be run in less than 32GB over 64 processors. You
 didn't say whether your simulation system can run on 1 processor... if it
 does, then you can be sure the problem really is related to parallelism.

  I did find some document which says one need (50bytes)*NATOMS on master
 node, also one needs
   (100+4*(no. of atoms in cutoff)*(NATOMS/nprocs) for compute nodes. Is
 this true?

 In general, no. It will vary with the simulation algorithm you're using.
 Quoting such without attributing the source or describing the context is
 next to useless. You also dropped a parenthesis.

 Mark

Re: [gmx-users] problem with the size of freeze groups

2010-03-02 Thread jampani srinivas
Dear Berk,

Thanks for your previous responses. Can you please let me know if you have
a solution for the freeze-group size problem? I am still not able to run if
I have larger freeze groups.

Thanks again for your kind help.
srinivas.


On Fri, Feb 26, 2010 at 5:37 PM, jampani srinivas jampa...@gmail.com wrote:

 Dear Berk,

I have checked my inputs and the tcl scripts that I used for the selection,
and I could see that my selection doesn't have any problem. I submitted it
again and am still getting the same log file, with nrdf 0 for the
non-freezing group. Please let me know if you want to see any of my input
files, and help me if you have a solution for this problem.

 Thanks and Regards
 Srinivas.


On Fri, Feb 26, 2010 at 12:39 PM, jampani srinivas jampa...@gmail.com wrote:

 Dear Berk,

I am using VERSION 4.0.5. As you said, if there is no problem I should get
it correctly; I don't know where it is going wrong. I have written a small
tcl script to use in VMD to get my selections. I will check the script and
the selection again and let you know my results.

 Thanks for your valuable time and kind help.
 Srinivas.

 On Fri, Feb 26, 2010 at 12:29 PM, Berk Hess g...@hotmail.com wrote:

  Hi,

 Which version of Gromacs are you using?
I can't see any issues in the 4.0 code, but some older versions might have
problems.

 Berk

 --
 Date: Fri, 26 Feb 2010 12:05:56 -0500

 Subject: Re: [gmx-users] problem with the size of freeze groups
 From: jampa...@gmail.com
 To: gmx-users@gromacs.org

 Dear Berk,

They are the same; freeze and Tmp2 are exactly the same groups. I just
named them like that for my convenience, and to avoid confusion I made the
names uniform in my second email.

  Thanks
 Srinivas.

 On Fri, Feb 26, 2010 at 11:59 AM, Berk Hess g...@hotmail.com wrote:

That is what I suspected, but I don't know why this is.

Are you really sure you made a temperature-coupling group that is exactly
the freeze group? The first mdp file you mailed had different group names
for the freeze group and the tcoupl groups.

 Berk
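
For reference, a minimal sketch of matching freeze and temperature-coupling groups in the mdp file (group names here are placeholders and must exist in your index file):

freezegrps = Frozen
freezedim  = Y Y Y
tc-grps    = Frozen  Rest
tau_t      = 0.1     0.1
ref_t      = 0       300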

 --
 Date: Fri, 26 Feb 2010 11:53:49 -0500

 Subject: Re: [gmx-users] problem with the size of freeze groups
 From: jampa...@gmail.com
 To: gmx-users@gromacs.org

 Dear Berk,

It looks to me like something goes wrong when I change the radius from 35
to 25. Herewith I am giving the grpopts for both systems:


+
grpopts: (system with 35 A)
  nrdf:  33141.4  0
  ref_t: 300      0
  tau_t: 0.1      0.1
+

grpopts: (system with 25 A)
  nrdf:  0    0
  ref_t: 300  0
  tau_t: 0.1  0.1

I think something goes wrong when the size of the freeze group is
increased. I don't know whether my understanding is correct.



 Thanks
 Srinivas.


 On Fri, Feb 26, 2010 at 11:01 AM, Berk Hess g...@hotmail.com wrote:

Ah, but that does not correspond to the mdp options you mailed.
Here there is only one group, with 0 degrees of freedom and reference
temperature 0.

 Berk

 --
 Date: Fri, 26 Feb 2010 10:50:13 -0500

 Subject: Re: [gmx-users] problem with the size of freeze groups
 From: jampa...@gmail.com
 To: gmx-users@gromacs.org

Hi,

Thanks. My log file shows nrdf: 0:

 ###

grpopts:
  nrdf:  0
  ref_t: 0
  tau_t: 0

 ###

 Thanks
 Srinivas.

 On Fri, Feb 26, 2010 at 10:25 AM, Berk Hess g...@hotmail.com wrote:

  Hi,

 Then I have no clue what might be wrong.
Have you checked nrdf in the log file?

 Berk

 --
 Date: Fri, 26 Feb 2010 09:54:22 -0500
 Subject: Re: [gmx-users] problem with the size of freeze groups

 From: jampa...@gmail.com
 To: gmx-users@gromacs.org

 Dear Berk,

Thanks for your response. As you mentioned, I have separate t-coupling
groups for the frozen and non-frozen parts, but the result is still the
same. Herewith I am giving my md.mdp file; can you suggest whether I am
missing any options in it?

 Thanks again
 Srinivas.

 md.mdp file

 +++
 title   = AB2130
 cpp = /usr/bin/cpp
 constraints = all-bonds
 integrator  = md
 dt  = 0.002 ; ps !
nsteps  = 1500000 ; total 3.0 ns.
 nstcomm = 1
 nstxout = 1000 ; collect data every 2.0 ps
 nstvout = 1000 ; collect velocity every 2.0 ps
 nstfout = 0
 nstlog  = 0
 nstenergy   = 1000 ; collect energy   every 2.0 ps
 nstlist = 10
 ns_type = grid
 rlist   = 1.0
 coulombtype = PME
 rcoulomb= 1.0
 rvdw= 1.0
 rvdw_switch = 0.9
 fourierspacing  = 0.12
 fourier_nx  = 0
 fourier_ny  = 0
 fourier_nz  = 0
 pme_order   = 4
 ewald_rtol  = 1e-5
 optimize_fft= yes
 ; Berendsen 

Re: [gmx-users] gromacs memory usage

2010-03-02 Thread Mark Abraham

On 3/03/2010 12:53 PM, Amit Choubey wrote:



Try increasingly large systems until it runs out of memory. There's your 
answer.


Mark


On Fri, Feb 26, 2010 at 3:56 PM, Mark Abraham mark.abra...@anu.edu.au
mailto:mark.abra...@anu.edu.au wrote:

- Original Message -
From: Amit Choubey kgp.a...@gmail.com mailto:kgp.a...@gmail.com
Date: Saturday, February 27, 2010 10:17
Subject: Re: [gmx-users] gromacs memory usage
To: Discussion list for GROMACS users gmx-users@gromacs.org
mailto:gmx-users@gromacs.org

  Hi Mark,
  We have few nodes with 64 GB memory and many other with 16 GB of
memory. I am attempting a simulation of around 100 M atoms.

Well, try some smaller systems and work upwards to see if you have a
limit in practice. 50K atoms can be run in less than 32GB over 64
processors. You didn't say whether your simulation system can run on
1 processor... if it does, then you can be sure the problem really
is related to parallelism.

  I did find some document which says one need (50bytes)*NATOMS on
master node, also one needs
   (100+4*(no. of atoms in cutoff)*(NATOMS/nprocs) for compute
nodes. Is this true?

In general, no. It will vary with the simulation algorithm you're
using. Quoting such without attributing the source or describing the
context is next to useless. You also dropped a parenthesis.

Mark
--
gmx-users mailing list gmx-users@gromacs.org
mailto:gmx-users@gromacs.org
http://lists.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at http://www.gromacs.org/search before
posting!
Please don't post (un)subscribe requests to the list. Use the
www interface or send it to gmx-users-requ...@gromacs.org
mailto:gmx-users-requ...@gromacs.org.
Can't post? Read http://www.gromacs.org/mailing_lists/users.php



--
gmx-users mailing listgmx-users@gromacs.org
http://lists.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at http://www.gromacs.org/search before posting!
Please don't post (un)subscribe requests to the list. Use the 
www interface or send it to gmx-users-requ...@gromacs.org.

Can't post? Read http://www.gromacs.org/mailing_lists/users.php


Re: [gmx-users] gromacs memory usage

2010-03-02 Thread Amit Choubey
Hi Mark,

Yes, that's one way to go about it, but it would have been great if I could
get a rough estimate.

Thank you.

amit



Re: [gmx-users] gromacs memory usage

2010-03-02 Thread Tsjerk Wassenaar
Hi Amit,

I think the presentation gives just what you want: a rough estimate.
Now, as Berk pointed out, to allocate more than 2 GB of memory you need
to compile in 64-bit mode. Then, if you want a real feel for the memory
usage, there is no other way than trying. But fortunately, the memory
requirements of a (very) long simulation are equal to those of a very
short one, so it doesn't need to cost much time.
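
A minimal probe (hypothetical file names; set nsteps to a small value in the mdp) is to build the full-size system, run a few steps, and watch the process size:

grompp -f probe.mdp -c big.gro -p topol.top -o probe.tpr
mdrun -s probe.tpr &
top -p $!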

Cheers,

Tsjerk





-- 
Tsjerk A. Wassenaar, Ph.D.

Computational Chemist
Medicinal Chemist
Neuropharmacologist


Re: [gmx-users] gromacs memory usage

2010-03-02 Thread Roland Schulz
Hi,

Last time I checked (last summer) I got
40 bytes per atom
and 294 bytes per atom per core (RF with a 12 Å cut-off).

100 M atoms works with that cut-off on 128 nodes with 16 GB and 8 cores
each. I haven't tried on fewer than 128 nodes.
(See http://cmb.ornl.gov/research/petascale-md)

We could relatively easily fix the 40 bytes per atom (no one has had time
to work on it so far), but I don't think there is much that can be done
about the 294 bytes per atom per core.

On how many nodes do you want to simulate? That is, are you limited by the
40 bytes per atom or by the 294 bytes per atom per core?
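
As a back-of-envelope check with those figures (assuming they carry over unchanged to this system): 100 M atoms * 40 bytes is about 4 GB for the per-atom term, and on 1024 cores (128 nodes x 8) the per-core term is (1e8 / 1024) * 294 bytes, roughly 29 MB per core.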

Roland







-- 
ORNL/UT Center for Molecular Biophysics cmb.ornl.gov
865-241-1537, ORNL PO BOX 2008 MS6309

Re: [gmx-users] gromacs memory usage

2010-03-02 Thread Amit Choubey
Hi Tsjerk,

I tried to do a test run based on the presentation, but there was a
memory-related error (I had allowed headroom of more than 2 GB).

I did not understand the 64-bit issue; could you let me know where the
documentation is? I need to look into that.

Thank you,
amit


Re: [gmx-users] gromacs memory usage

2010-03-02 Thread Roland Schulz
Hi,

do:

file `which mdrun`

and it should give something like:

/usr/bin/mdrun: ELF 64-bit LSB executable, x86-64, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.15, stripped

If it does not say 64-bit, you need to compile in 64-bit mode and have a
64-bit kernel. Since you asked before about files larger than 2 GB, this
might indeed be your problem.

Roland


Re: [gmx-users] gromacs memory usage

2010-03-02 Thread Amit Choubey
Hi Roland

I tried 'which mdrun' but it only gives the installation path. Is there
another way to know whether the installation is 64-bit or not?

Thank you,
Amit


Re: [gmx-users] gromacs memory usage

2010-03-02 Thread Roland Schulz
Amit,

try the full line, with file in front:

file `which mdrun`

Roland
