Re: [gmx-users] make_edi, g_covar -nofit

2010-05-05 Thread Carsten Kutzner
git checkout --track -b release-4-0-patches origin/release-4-0-patches Carsten
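For context, the steps leading up to that checkout would look roughly like this (the clone URL is an assumption about where the GROMACS repository lived at the time):
    # sketch only; verify the current repository location
    git clone git://git.gromacs.org/gromacs.git
    cd gromacs
    # then run the checkout command quoted above to track the release branch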

Re: [gmx-users] make_edi, g_covar -nofit

2010-05-04 Thread Carsten Kutzner

Re: [gmx-users] make_edi

2010-04-23 Thread Carsten Kutzner
Hi Vijaya, what version of Gromacs is this and how big do the trr files have to be so that the segv shows up? Carsten On Apr 22, 2010, at 6:56 PM, vijaya subramanian wrote: Hi When I run make_edi with a small eigenvec.trr file it works, but gives me a segmentation fault when I input

Re: [gmx-users] The problems of controling one atom through modifygromacs sourse code

2010-04-22 Thread Carsten Kutzner

Re: [gmx-users] Re: slow speed

2010-04-18 Thread Carsten Kutzner
On Apr 16, 2010, at 1:40 AM, Shuangxing Dai wrote: I am not running in parallel. Right now I just changed the links order from 12 to 4. It is still slow. When I changed to shift instead of Ewald, it finished 1 steps in 10 mins. In the paper: J Comput Chem. 2005 Dec;26(16):1701-18. GROMACS:

Re: [gmx-users] Poor load balancing

2010-02-16 Thread Carsten Kutzner
1:18 (Mnbf/s) (GFlops) (ns/day) (hour/ns) Performance: 398.725 22.539 22.158 1.083 Finished mdrun on node 0 Mon Feb 15 22:54:31 2010 On Mon, Feb 15, 2010 at 5:36 PM, Carsten Kutzner ckut...@gwdg.de wrote: Hi, 18 seconds real time is a bit

Re: [gmx-users] Poor load balancing

2010-02-15 Thread Carsten Kutzner
Hi, 18 seconds real time is a bit short for such a test. You should run for at least several minutes. The performance you can expect depends a lot on the interconnect you are using. You will definitely need a really low-latency interconnect if you have fewer than 1000 atoms per core. Carsten On

Re: [gmx-users] linking gromcs to efence

2010-01-27 Thread Carsten Kutzner
Hi Jochen, it should work by putting it in the LDFLAGS. Either you get an executable that prints something like 'Electric Fence 2.2.0' at the very start of execution, or linking fails when the library is not found. Carsten On Jan 27, 2010, at 5:32 PM, Jochen Hub wrote:
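For illustration, a minimal sketch of such a build, assuming libefence is installed where the linker can find it (otherwise add its path with -L):
    # pass the Electric Fence library to the linker via LDFLAGS
    LDFLAGS="-lefence" ./configure --enable-mpi
    make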

Re: [gmx-users] xtc file

2010-01-26 Thread Carsten Kutzner
On Jan 26, 2010, at 11:48 AM, Carla Jamous wrote: Hi everyone, please, I'm having a problem with mdrun: If I type: mdrun -v -s test.tpr -o test.trr -c test.pdb -x test.xtc -e test.edr -g test.log I never get the .xtc file. Can anyone tell me why, and what can I do to get an .xtc file?

Re: [gmx-users] Steered Molecular Dynamics (SMD) in Gromcas-4.0.5

2010-01-22 Thread Carsten Kutzner
Hi, you should set one pull group, not 700. The number of atoms in your pull group is 700. Freezing the pull group in x and y direction probably does what you want. Please also consider upgrading to 4.0.7, which is the most recent stable version. Best, Carsten On Jan 22, 2010, at 7:41 AM,
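A minimal .mdp sketch of that advice; the group name and numeric values are placeholders, and the keyword spellings should be checked against the manual of the GROMACS version in use:
    pull            = umbrella
    pull_geometry   = distance
    pull_ngroups    = 1           ; one pull group (it may well contain 700 atoms)
    pull_group0     = Reference   ; placeholder reference group
    pull_group1     = Pulled      ; the group to be pulled
    pull_rate1      = 0.01        ; nm/ps, example value
    pull_k1         = 1000        ; kJ mol^-1 nm^-2, example value
    ; keep the pulled group from drifting in x and y
    freezegrps      = Pulled
    freezedim       = Y Y N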

Re: [gmx-users] Exceeding of Maximum allowed number of DD cells

2010-01-11 Thread Carsten Kutzner
On Jan 10, 2010, at 4:24 PM, Chao Zhang wrote: Dear GMX-Users, I'm testing my fully hydrated 256-lipid system on Blue Gene. The purpose is to find out the right number for -npme, as mdrun cannot estimate it successfully by itself. You might find g_tune_pme useful, which is available in the git
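As a sketch of such a scan with g_tune_pme (the option names follow later releases and may differ in the git version mentioned here):
    # test several PME/PP splits for a 128-rank run; numbers are examples
    g_tune_pme -np 128 -s topol.tpr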

Re: [gmx-users] make_edi fails with multiple eigenvectors for -linfix and -linacc

2009-12-24 Thread Carsten Kutzner
Hi Chris, On Dec 23, 2009, at 9:06 PM, chris.ne...@utoronto.ca wrote: Hello, I am having trouble getting make_edi -linfix to work with multiple eigenvectors. This works for a single EV: $ echo 3 | make_edi -s ../../SETUP/makeTPR/edi.tpr -f ../../SETUP/makeEDI/eigenvec.trr -o

Re: [gmx-users] multiple eigenvectors with -linfix and the application of eigenvalues

2009-12-21 Thread Carsten Kutzner

Re: [gmx-users] essential dynamics mdrun with SD integrator yields segmentation fault

2009-12-21 Thread Carsten Kutzner

Re: [gmx-users] editconf.

2009-12-21 Thread Carsten Kutzner
On Dec 21, 2009, at 5:26 PM, david.lisgar...@canterbury.ac.uk wrote: Dear Users, Re: Introductory tutorial. Trying to run the following: editconf -f out.gro -o fws_ctr.gro -center x/2 y/2 z/2 You have to give real coordinates for the center, not
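In other words, -center expects three numbers in nm, or the molecule can simply be centered in the box with -c; a sketch with arbitrary example values:
    editconf -f out.gro -o fws_ctr.gro -center 2.5 2.5 2.5
    # or, more simply, center the molecule in the box
    editconf -f out.gro -o fws_ctr.gro -c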

Re: [gmx-users] reference for make_edi -linacc

2009-12-18 Thread Carsten Kutzner

Re: [gmx-users] How to tune number of CPUs for a run?

2009-11-04 Thread Carsten Kutzner

Re: [gmx-users] trying to get better performance in a Rocks cluster running GROMACS 4.0.4

2009-10-08 Thread Carsten Kutzner

Re: [gmx-users] trying to get better performance in a Rocks cluster running GROMACS 4.0.4

2009-09-25 Thread Carsten Kutzner

Re: [gmx-users] About gromacs installation

2009-09-23 Thread Carsten Kutzner
On Sep 23, 2009, at 2:12 PM, Enamul Haque wrote: Hi gromacs experts, I am trying to install gromacs version 4.0 on my laptop running Ubuntu, but I can't. It shows an error like: ./configure checking build system type... i686-pc-linux-gnulibc1 checking host system type...

Re: [gmx-users] how to save information even after stopping calculation

2009-09-22 Thread Carsten Kutzner

Re: [gmx-users] GMX 4.0.5 on Altix 3700 BX2: Assembly Optimizations fixed?

2009-09-17 Thread Carsten Kutzner

Re: [gmx-users] Re:Re: How to complie gromacs on sgi Altix450?

2009-09-15 Thread Carsten Kutzner

Re: [gmx-users] PME problems

2009-08-31 Thread Carsten Kutzner

Re: [gmx-users] Error: MPI Error

2009-08-11 Thread Carsten Kutzner
On Aug 11, 2009, at 11:52 AM, BSV Ramesh wrote: -- Dear All, I am getting the following error: Killed by signal 2 Killed by signal 2 Killed by signal 2 Killed by signal 2 Killed by signal 2 Killed by signal 2 Killed by signal 2 Killed by signal 2 . . . . Killed by signal 2 ,

Re: [gmx-users] Strange MPI problems. . .

2009-08-10 Thread Carsten Kutzner

Re: [gmx-users] Where are the source codes?

2009-07-07 Thread Carsten Kutzner
On Jul 7, 2009, at 9:28 AM, Chih-Ying Lin wrote: Hi, I have installed Gromacs and I want to see the source code. In which directory can I find it? gromacs-4.0.5/src gromacs-4.0.5/include Carsten

Re: [gmx-users] about template.c?

2009-07-07 Thread Carsten Kutzner
On Jul 7, 2009, at 9:35 AM, Chih-Ying Lin wrote: Hi, inside template.c = #include "statutil.h" #include "typedefs.h" #include "smalloc.h" #include "vec.h" #include "copyrite.h" #include "statutil.h" #include "tpxio.h" What are they, and where are they? Go to the gromacs-4.0.5 directory and find out with
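One generic way to locate those headers in the unpacked source tree (not necessarily the command the original reply went on to give):
    cd gromacs-4.0.5
    find . -name statutil.h      # the headers live under include/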

Re: [gmx-users] DSSP hangs for days

2009-06-29 Thread Carsten Kutzner
On Jun 28, 2009, at 10:46 AM, sharada wrote: Hi, I waited for it to finish for almost 5 days; nothing happened except the creation of those files. I had posted a similar mail some time back. Is there no solution for this? Is it something to do with the speed of the system? I ran the program

Re: [gmx-users] DSSP hangs for days

2009-06-29 Thread Carsten Kutzner
On Jun 29, 2009, at 11:58 AM, sharada wrote: Hello, I have downloaded the tar file from github and extracted the contents. However, I am unable to understand the README file as it is in a language other than English. Could you kindly provide instructions on how to go about using it?

Re: [gmx-users] parallel computing

2009-06-22 Thread Carsten Kutzner
On Jun 22, 2009, at 4:43 PM, akalabya bissoyi wrote: Hello everybody, I am running gromacs on a PC and it takes a lot of time to run my simulations. Can anybody help me regarding parallel computing using my PC so that my simulation will be faster? Any materials/protocol so that I can

Re: [gmx-users] PME nodes

2009-06-08 Thread Carsten Kutzner

Re: [gmx-users] PME nodes

2009-06-08 Thread Carsten Kutzner
Hi, it's written at the beginning of the .c file: * You can compile this tool using the Gromacs Makefile from the * share/gromacs/template directory, just replace 'template' by 'g_tune_pme' * where needed. To enable shell completions for g_tune_pme, just * copy the provided completion.*
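Following those quoted instructions, the build could look roughly like this; the paths and the exact Makefile name are placeholders that depend on the installation:
    cp -r /path/to/gromacs/share/gromacs/template g_tune_pme_build
    cd g_tune_pme_build
    cp /path/to/g_tune_pme.c .                   # placeholder location of the source file
    sed -i 's/template/g_tune_pme/g' Makefile*   # replace 'template' where needed
    make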

Re: [gmx-users] FEP source code

2009-04-24 Thread Carsten Kutzner

Re: [gmx-users] about parallel work

2009-04-21 Thread Carsten Kutzner
On Apr 21, 2009, at 5:04 PM, sheerychen wrote: Hello, everybody. I have a question about parallel running of mdrun_mpi. I suspect that sometimes the parallel run of mdrun_mpi cannot utilize the domain decomposition. This is the case when I use the batch system on the computer cluster

Re: [gmx-users] about parallel work

2009-04-21 Thread Carsten Kutzner
Hi, On Apr 21, 2009, at 5:53 PM, sheerychen wrote: Yes, both versions are compiled as MPI versions. However, the MPI start-up messages are different. For MPICH, it would show a 1D domain decomposition like 3*1*1, and only 1 file would be produced. However, for MPICH2, no such information

Re: [gmx-users] segmentation fault after mdrun

2009-02-23 Thread Carsten Kutzner
= 6 then. Carsten

Re: [gmx-users] low latency

2009-01-14 Thread Carsten Kutzner
On Jan 13, 2009, at 9:15 PM, ha salem wrote: Dear Karsten and gromacs specialists, I enabled flow control on the HP ProCurve; now I want to know how I can configure low latency on the network. Is it required? Thank you. There are quite a few parameters that affect Ethernet performance. They can be

Re: [gmx-users] Best performace with 0 core for PME calcuation

2009-01-12 Thread Carsten Kutzner
of a parallel simulation. Carsten

Re: [gmx-users] dear KUTZNER (flow control)

2009-01-06 Thread Carsten Kutzner
to Enable with the space bar. More information will be available in the manual, see e.g. http://www.hp.com/rnd/support/manuals/2800.htm Carsten --- On Fri, 1/2/09, Carsten Kutzner ckut...@gwdg.de wrote:

Re: [gmx-users] dear KUTZNER (flow control)

2009-01-02 Thread Carsten Kutzner
Dear Ha Salem, in all the switches we tested at that time, flow control was disabled by default. You can connect to the switch (e.g. via telnet) and activate flow control. For the ProCurve switches flow control can be enabled and disabled for each of the ports individually. Carsten Am

Re: [gmx-users] Dear Sir !

2008-12-18 Thread Carsten Kutzner
On Dec 18, 2008, at 10:25 AM, Venkat Reddy wrote: How do I save the coordinates of atoms at regular intervals during mdrun? Is it automatic, or do we need to modify the .mdp file? Hi, the 'nstxout' and 'nstxtcout' entries in the mdp file allow you to set how often the
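A minimal .mdp sketch of those two entries (the intervals are example values):
    nstxout   = 5000   ; write full-precision frames to the .trr every 5000 steps
    nstxtcout = 500    ; write compressed frames to the .xtc every 500 steps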

Re: [gmx-users] [Fwd: [FFTW-announce] FFTW 3.2 is released]

2008-11-17 Thread Carsten Kutzner

Re: [gmx-users] Gromacs 4 Scaling Benchmarks...

2008-11-10 Thread Carsten Kutzner
Hi, most likely the Ethernet is the problem here. I compiled some numbers for the DPPC benchmark in the paper 'Speeding up parallel GROMACS on high-latency networks', http://www3.interscience.wiley.com/journal/114205207/abstract?CRETRY=1SRETRY=0 which are for version 3.3, but PME will behave

Re: [gmx-users] problem with lamboot

2008-11-03 Thread Carsten Kutzner

Re: [gmx-users] Fwd: still problem with lamboot

2008-11-03 Thread Carsten Kutzner

Re: [gmx-users] Improving scaling - Gromacs 4.0 RC2

2008-10-02 Thread Carsten Kutzner
Hi Justin, I have written a small gmx tool that tries various PME/PP balances systematically for a given number of nodes and afterwards suggests which combination is the fastest. Although I plan to extend it with more functionality, it's already working and I can send it to you

Re: Fwd: [gmx-users] Performance problems with more than one node

2008-09-26 Thread Carsten Kutzner
gromacs version are you using? And have a look at the messages by Carsten Kutzner on this list; he wrote a lot about gromacs scaling. Jochen Best regards, Tiago Marques

Re: Fwd: [gmx-users] Performance problems with more than one node

2008-09-26 Thread Carsten Kutzner
connected via infiniband. With Thanks, Vivek 2008/9/26 Carsten Kutzner [EMAIL PROTECTED] Hi Tiago, if you switch off PME and suddenly your system scales, then the problems are likely to result from bad MPI_Alltoall performance. Maybe this is worth a check

Re: [gmx-users] Gromacs parellal run:: getting difference in two trajectories

2008-09-12 Thread Carsten Kutzner
is helpful in figuring out the problem. Please advise. With Thanks, Vivek 2008/9/11 Carsten Kutzner [EMAIL PROTECTED] vivek sharma wrote: Hi There, I am running the gromacs parallel version on a cluster with different -np options. Hi, which

Re: [gmx-users] Gromacs parellal run:: getting difference in two trajectories

2008-09-12 Thread Carsten Kutzner
paper. - 'Speeding up parallel GROMACS on high-latency networks', 2007, JCC, Vol. 28, 12 - 'GROMACS 4: Algorithms for Highly Efficient, Load-Balanced, and Scalable Molecular Simulation', 2008, JCTC 4 (3). Hope that helps, Carsten With Thanks, Vivek 2008/9/12 Carsten Kutzner [EMAIL PROTECTED

Re: [gmx-users] Gromacs parellal run:: getting difference in two trajectories

2008-09-11 Thread Carsten Kutzner

Re: [gmx-users] invalid number of nodes

2008-08-13 Thread Carsten Kutzner
Hi Rebeca, lines 69/70 of the 3.3 include/types/simple.h read: /* Max number of nodes */ #define MAXNODES 256 This obviously needs to be set to a higher value. There is also a MAXNODES parameter in CVS src/gmxlib/tpxio.c, which I guess also needs to be set to the same value if you
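For illustration, the edited lines in include/types/simple.h would then read something like this (512 is an arbitrary example value; GROMACS has to be recompiled afterwards):
    /* Max number of nodes */
    #define MAXNODES 512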

Re: [gmx-users] Problems with parallelisation of GROMACS on a Mac Pro (OS X)

2008-08-01 Thread Carsten Kutzner

Re: [gmx-users] Re: a tip to run gromacs from Fink in mac duo core

2008-07-23 Thread Carsten Kutzner

Re: [gmx-users] Does GROMACS have a multi-thread implementation?

2008-06-24 Thread Carsten Kutzner
Lee Soin wrote: Does GROMACS have a multi-thread implementation, instead of using MPI? No, at least not yet. Carsten

Re: [gmx-users] Does GROMACS have a multi-thread implementation?

2008-06-24 Thread Carsten Kutzner
Lee Soin wrote: But I see that mdrun has an option -nt (number of threads to start on each node). Does this mean multi-threading? Yes, nt means number of threads. It's already there for a future version. Carsten

[gmx-users] Re: The number of PME nodes

2008-06-23 Thread Carsten Kutzner
steps, might be longer). Carsten Best wishes! Ji Xu 2008-06-23

Re: [gmx-users] mpirun problem

2008-06-16 Thread Carsten Kutzner
seen, I can say that you cannot expect any speedup if your computers are only connected with 100 Mbps. You will need at least 1000 Mbps, or better Infiniband/Myrinet. Carsten thank you --- On Sun, 6/15/08, Carsten Kutzner [EMAIL PROTECTED] wrote:

Re: [gmx-users] mpirun problem

2008-06-15 Thread Carsten Kutzner
On 15.06.2008, at 20:19, ha salem wrote: Dear users, I have encountered a problem with mpirun. I have 2 PCs (every PC has 1 Intel quad-core CPU); when I run mdrun on 1 machine with the -np 4 option the calculation runs on 4 cores and goes faster; the system monitor shows all 4 cores
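For running across both machines, one common pattern is a host file plus mpirun; the exact flags depend on the MPI implementation (this sketch assumes an MPICH-style -machinefile option, and LAM users would first boot the hosts with lamboot):
    # 'hosts' lists both PCs, one name per line (names are placeholders)
    mpirun -np 8 -machinefile hosts mdrun_mpi -s topol.tpr
    # GROMACS 3.3.x additionally needs mdrun's own -np 8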

Re: [gmx-users] run parallel

2008-06-05 Thread Carsten Kutzner

Re: [gmx-users] run parallel

2008-06-05 Thread Carsten Kutzner

Re: [gmx-users] about fourier grid

2008-04-17 Thread Carsten Kutzner

Re: [gmx-users] old patch for slow networks

2008-04-01 Thread Carsten Kutzner

Re: [gmx-users] Question about different versions of gromacs

2008-03-28 Thread Carsten Kutzner
Hi Nicolas, it is no problem to read 'older' tpr files with a newer version of gromacs. The other way round it will probably not work, but gromacs will then give you an error message displaying the version differences. Carsten Nicolas Martinez wrote: Hello gromacs users, I am using

Re: [gmx-users] Gromacs slow for 23000 atom DPPC bilayer on a (1 x 4) node: 50 ps in 10 hours

2008-03-24 Thread Carsten Kutzner
On 24.03.2008 at 10:17, maria goranovic wrote: Hi Folks, my simulation is running too slow. It took 10 wall clock hours (40 cpu hours) for a short 50 ps simulation of a ~23000 atom DPPC bilayer. The hardware is a 4-core node. The installation is gromacs 3.3.1. I have run much larger

Re: [gmx-users] MPICH 1.2 vs. GMX 4.0

2008-02-29 Thread Carsten Kutzner

Re: [gmx-users] gmx_alltoall

2008-02-22 Thread Carsten Kutzner
Anna Marabotti wrote: Dear GMX-developers (in particular dear Carsten Kutzner), to overcome problems in making parallel runs with GROMACS on a Linux cluster with Gigabit Ethernet interconnection, I downloaded the package gmx_all-to-all to speed up the processes. In the instructions, however

Re: [gmx-users] Lam is not required for running parallel job

2008-02-13 Thread Carsten Kutzner

Re: [gmx-users] Address for the server of a REMD temperature calculator

2008-01-31 Thread Carsten Kutzner
http://folding.bmc.uu.se/remd/index.php It's in the news section of www.gromacs.org. Carsten OZGE ENGIN wrote: Hi all, somebody sent a mail about the address of a server on which the temperatures for a REMD simulation can be calculated. However, I cannot find this mail. Could

Re: [gmx-users] mdrun CVS version crashes instantly when run across nodes in parallel

2008-01-22 Thread Carsten Kutzner

Re: [gmx-users] Gromacs installation: cannot find LibXmu.la

2008-01-18 Thread Carsten Kutzner
Hi Andreas, try ./configure --enable-mpi --without-x Carsten On 18.01.2008 at 09:45, Andreas Kukol wrote: On SuseLinux 10.3 the command ./configure --enable-mpi works fine but make terminates at this point. Using the option --disable-shared did not change anything. I would be

Re: [gmx-users] MPI issue

2008-01-16 Thread Carsten Kutzner

[gmx-users] Re: gromacs installation (some broken links in wiki?)

2008-01-03 Thread Carsten Kutzner
/ include directory in the LDFLAGS / CPPFLAGS variables. This should do the trick. Carsten On 03.01.2008 at 17:56, mahdi fathi wrote: Dear Dr Carsten Kutzner, I want to install gromacs on 4 machines at my small lab, but I couldn't find clear instructions for parallel installation on gromacs.org
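A sketch of what setting those variables looks like in practice before configuring (the FFTW prefix is a placeholder):
    export CPPFLAGS=-I/opt/fftw/include
    export LDFLAGS=-L/opt/fftw/lib
    ./configure --enable-mpi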

Re: [gmx-users] different results when using different number cpus

2007-12-05 Thread Carsten Kutzner

Re: [gmx-users] Re: parallel simulation crash on 6 processors

2007-11-29 Thread Carsten Kutzner
Hi Servaas, I often had similar problems when running on mpich-1.2.x. In my case they all vanished when I was using any other MPI implementation, like LAM, OpenMPI, or mpich-2.x. Carsten servaas michielssens wrote:

Re: [gmx-users] Gromacs slow and crashes on Leopard.

2007-11-28 Thread Carsten Kutzner
Hadas Leonov wrote: After installing with openmpi I ran some benchmarks for 4 processors on a Mac Pro: d.villin: Leopard 13714 ps/day, old OS 41143 ps/day, gmx-benchmark 48000 ps/day. d.poly-ch2: Leopard 8640 ps/day, old OS

Re: [gmx-users] Barcelona vs Xeon

2007-11-28 Thread Carsten Kutzner

Re: [gmx-users] Gromacs slow and crashes on Leopard.

2007-11-28 Thread Carsten Kutzner
Hadas Leonov wrote: Great, thanks! It does solve the ia32 compilation problems, but not the lam-mpi compilation problems - I still get undefined symbols for a few lam variables. So, still using open-mpi, Gromacs works a little better now; for the d.villin benchmark the performance is: 1

Re: [gmx-users] GROMACS in parallel on a multicore PC?

2007-11-21 Thread Carsten Kutzner

Re: [gmx-users] compilation fails on MAC OSX - Leopard

2007-11-15 Thread Carsten Kutzner
Hi Hadas, I think the problem is that the make command for some reason thinks it should build the Itanium inner loops: ld: in ../gmxlib/.libs/libgmx_mpi.a(nb_kernel204_ia32_sse.o), though on your machine the x86_64 kernels should be built. Probably one can set the correct architecture with

Re: [gmx-users] No improvement in scaling on introducing flow control

2007-10-31 Thread Carsten Kutzner
himanshu khandelia wrote: Hi Carsten, the benchmarks were made with 1 NIC/node, and yet the scaling is bad. Does that mean that there is indeed network congestion? We will try using back-to-back connections soon. Hi Himanshu, in my opinion the most probable scenario is that the bandwidth of

Re: [gmx-users] No improvement in scaling on introducing flow control

2007-10-25 Thread Carsten Kutzner

Re: [gmx-users] No improvement in scaling on introducing flow control

2007-10-25 Thread Carsten Kutzner
the benchmark for 8 CPUs. See if you get a different value. Regards, Carsten Thank you, -Himanshu On 10/25/07, Carsten Kutzner [EMAIL PROTECTED] wrote: Hi Himanshu, maybe your problem is not even flow control, but the limited network bandwidth which is shared among 4 CPUs in your

Re: [gmx-users] (no subject)

2007-10-12 Thread Carsten Kutzner
mdrun -s x.tpr Hope that helps, Regards, Carsten

Re: [gmx-users] running gromacs in parallel, and with nice levels

2007-10-05 Thread Carsten Kutzner
level on dedicated nodes, it will not affect the performance. Hope that helps, regards, Carsten

Re: [gmx-users] how to get CVS version gmx?

2007-08-17 Thread Carsten Kutzner
Hi Linda, you can use these commands to download from CVS: cvs -z3 -d :pserver:[EMAIL PROTECTED]:/home/gmx/cvs login Then hit RETURN on password prompt cvs -z3 -d :pserver:[EMAIL PROTECTED]:/home/gmx/cvs co gmx Carsten Zhaoyang Fu wrote: Dear gmx users developers, Would you

Re: [gmx-users] problems with parallel mdrun

2007-08-09 Thread Carsten Kutzner
Mark Abraham wrote: Gurpreet Singh wrote: I get the following errors while using the parallel version of mdrun compiled with openmpi: mpirun -np 4 mdrun_d_mpi -np 4 -v -deffnm EQUI1 * Program mdrun_d_mpi, VERSION 3.3.99_development_20070720 Source code file: gmx_parallel_3dfft.c, line: 90

Re: [gmx-users] rotational fit in XY plane only

2007-06-28 Thread Carsten Kutzner
Hi Abu, with these commands you download the latest CVS version: cvs -z3 -d :pserver:[EMAIL PROTECTED]:/home/gmx/cvs login (hit return at the password prompt) cvs -z3 -d :pserver:[EMAIL PROTECTED]:/home/gmx/cvs co gmx Carsten Naser, Md Abu wrote: I implemented an xy only fitting option for trjconv in the

Re: [gmx-users] Gromacs website broken

2007-06-26 Thread Carsten Kutzner
pim schravendijk wrote: Hi People!, The Gromacs website is seriously broken. I wanted to copy-paste the line for cvs checkout as I usually do, but following the link to the CVS page via either the download or the developer menu gives me a page saying I don't have access and need to log in.

Re: [gmx-users] request for patch and test program from 'Speeding up Parallel GROMACS'

2007-04-04 Thread Carsten Kutzner
-to-all just turned out to be the No. 1 candidate. Carsten

Re: [gmx-users] ordered all-to-all patch

2007-04-03 Thread Carsten Kutzner
Hi Chris, the patch can be downloaded from the gromacs website: Download > User Contributions > Contributed Software > gmx_alltoall. Carsten [EMAIL PROTECTED] wrote: This is a fantastic development. I had wondered why my scaling was so much better for openmpi than for lam. Has the patch been

Re: [gmx-users] Running GROMACS in parallel

2006-11-09 Thread Carsten Kutzner

Re: [gmx-users] Problems Installing on IBM Netfinity 4500R with Redhat 8.0

2006-11-02 Thread Carsten Kutzner

Re: Fwd: [gmx-users] mdrun on several clusternodes using PME gromacs 3.3.1

2006-10-27 Thread Carsten Kutzner

Re: [gmx-users] GROMACS Parallel Runs

2006-10-02 Thread Carsten Kutzner

Re: [gmx-users] mdrun hangs on nodes of P655+ aix 5.2

2006-07-31 Thread Carsten Kutzner

Re: [gmx-users] MPICH or LAM/MPI

2006-06-27 Thread Carsten Kutzner

Re: [gmx-users] MPICH or LAM/MPI

2006-06-20 Thread Carsten Kutzner

Re: [gmx-users] MPICH or LAM/MPI

2006-06-19 Thread Carsten Kutzner
computer nodes. Carsten -Original Message- From: Carsten Kutzner [EMAIL PROTECTED] Sent: Mon, 19 Jun 2006 10:28:51 +0200 Hello Hector, since it does not take long to install lam
