Re: [gmx-users] AVX2 SIMD intrinsics speed boost

2013-07-12 Thread Erik Lindahl
Hi, We will have AVX2 acceleration ready for general usage before the end of July (together with some other goodies), and it will be markedly faster, but until it's ready and tested to give correct results we can't say anything about the final performance. However, in general AVX2 will have th

Re: [gmx-users] Free webinar on Gromacs-4.6 and GPUs together with Nvidia

2013-04-02 Thread Erik Lindahl
nt to ask! Cheers, Erik On Mar 20, 2013, at 5:14 PM, Erik Lindahl wrote: > Hi, > > On April 4 (9-10am, US pacific time), Nvidia is helping us organize a web > seminar about Gromacs-4.6 and all the new GPU capabilities. While I would > expect their representative to spend a cou

[gmx-users] Free webinar on Gromacs-4.6 and GPUs together with Nvidia

2013-03-20 Thread Erik Lindahl
Hi, On April 4 (9-10am, US pacific time), Nvidia is helping us organize a web seminar about Gromacs-4.6 and all the new GPU capabilities. While I would expect their representative to spend a couple of minutes talking about hardware, they will invite interested users to free remote testdrives of

[gmx-users] Server move tonight might cause temporary outage

2012-07-25 Thread Erik Lindahl
he MX record to a new domain and we cannot count on the old machine to forward things. Cheers, Erik -- Erik Lindahl Professor of Theoretical & Computational Biophysics, KTH Professor of Computational Structural Biology, Stockholm University Tel (KTH): +46 8 55378029 Tel (SU): +46 8 164675 Cel

[gmx-users] Results of the GROMACS user survey & NVIDIA board winners

2012-06-01 Thread Erik Lindahl
Computational Physics, University Stuttgart * Emmanuel Birru, Monash University, Parkville Campus Big congratulations to both winners from the GROMACS & NVIDIA teams, and let me convey a *very* big and special thank-you to Mark Berger for making this possible! All the best, Erik -- Erik Lin

[gmx-users] FINAL CHANCE: GROMACS survey closing May 1!

2012-04-30 Thread Erik Lindahl
elp! Erik On Mar 27, 2012, at 5:45 AM, Erik Lindahl wrote: > Hi! > > Big thanks to those of you who already participated in this survey - we got > about 100 *VERY* useful answers this far, but considering the size of the > community I think we should be able to improve that five

[gmx-users] IMPORTANT REMINDER: Gromacs survey (with a chance to win a $2000 GPU)

2012-03-26 Thread Erik Lindahl
, so it is very important for us (and you) to get as representative results as possible! If you haven't already, it would be great if you could take a couple of minutes and participate at https://www.surveymonkey.com/s/YD9ZMJK Cheers, Erik On Mar 3, 2012, at 1:39 AM, Erik Lindahl

[gmx-users] Help us steer future GROMACS development, with a chance to win a $2000+ Tesla C2075 GPU

2012-03-03 Thread Erik Lindahl
This could be a great way to kickstart your GPU simulation usage! Nvidia sweepstake rules: http://www.nvidia.com/object/sweepstakes-official-rules.html All the best, Erik Lindahl & The GROMACS Development team -- gmx-users mailing list gmx-users@gromacs.org http://lists.gromacs.org/mailman

[gmx-users] Postdoctoral and staff scientist positions in the Gromacs teams

2011-12-01 Thread Erik Lindahl
rcomputing Center, Oak Ridge National Labs, and RIKEN in Kobe where we have access to some of the world’s fastest supercomputers. Submit a PDF application with a CV, description of interests, and a list of publications for postdoctoral positions to Erik Lindahl or Berk Hess ; we are also happy t

[gmx-users] Warning about (short) potential interruption - moving DNS servers for gromacs.org

2011-01-05 Thread Erik Lindahl
Hi, We're in the process of transferring the domain name to a more convenient registrar, and this will also require us to change DNS servers. Thus, you might experience short interruptions in name resolutions for *.gromacs.org next week! Cheers, Erik -- gmx-users mailing list gmx-users@groma

[gmx-users] Major code reorganization in git coming up

2010-12-30 Thread Erik Lindahl
that). The upside of all this is that the file organization will gradually get easier to understand, and the code itself should also get much more modular and readable (which we hope will lead to a faster release schedule and more features :-) Happy new year! Erik -- Erik Lindahl Profess

[gmx-users] Temporary downtime for git.gromacs.org Wednesday

2010-11-30 Thread Erik Lindahl
Hi, We'll be moving git.gromacs.org to a new server room in a different building tomorrow, so it might be unavailable for a short while tomorrow afternoon European time. Sorry for the inconvenience! Cheers, Erik Sent from my iPhone -- gmx-users mailing list gmx-users@gromacs.org http://lis

[gmx-users] An opportunity to test-drive Gromacs on cutting-edge Nvidia GPUs

2010-08-18 Thread Erik Lindahl
//www.nvidia.com/MD_Test_Drive, and Roy has also said you can contact him directly at r...@nvidia.com Cheers, Erik ------ Erik Lindahl Professor, Computational Structural Biology Center for Biomembrane Research & Swedish e-Science Research

Re: [gmx-users] GPU CUDA version does not support improper dihedrals?

2010-08-13 Thread Erik Lindahl
ing! > Please don't post (un)subscribe requests to the list. Use the www interface or > send it to gmx-users-requ...@gromacs.org. > Can't post? Read http://www.gromacs.org/mailing_lists/users.php > -- Erik Lindahl Professor, Com

Re: [gmx-users] gromacs energies very different from other MD engine for the very same system and conditions

2010-07-28 Thread Erik Lindahl
r errors in functional forms. In fact, he did find one minor program error related to ordering of improper torsions in TPR during his work, but that turned out to be in Amber ;-) Suffice to say, I trust his work a _lot_. Cheers, Erik ------

[gmx-users] Openings for talented programmers/postdocs who'd like to work on simulations

2010-06-19 Thread Erik Lindahl
grammer/staff), and what your plans look like time-wise! I'll be happy to tell you more about the work. Cheers, Erik ------ Erik Lindahl Professor, Computational Structural Biology Center for Biomembrane Research & Swedis

Re: [gmx-users] How to increase the tolerance for conjugate gradient minimization

2010-06-19 Thread Erik Lindahl
nd it to gmx-users-requ...@gromacs.org. > Can't post? Read http://www.gromacs.org/mailing_lists/users.php > -- Erik Lindahl Professor, Computational Structural Biology Center for Biomembrane Research & Swedish e-Science Research Cente

Re: [gmx-users] Gromacs-4.0.6 released

2009-12-06 Thread Erik Lindahl
rik On Dec 6, 2009, at 8:10 PM, Erik Lindahl wrote: > Hi, > > We've just put a maintenance release on the site (find it on www.gromacs.org > or directly on ftp.gromacs.org). > > Basically, this includes a number of minor fixes we've incorporated from > Bugzilla th

[gmx-users] Gromacs-4.0.6 released

2009-12-06 Thread Erik Lindahl
Hi, We've just put a maintenance release on the site (find it on www.gromacs.org or directly on ftp.gromacs.org). Basically, this includes a number of minor fixes we've incorporated from Bugzilla the last few months. In particular, the append issues with files >2GB have been fixed, and 32/64 b

Re: [gmx-users] Scaling problems in 8-cores nodes with GROMACS 4.0x

2009-09-04 Thread Erik Lindahl
Hi, On Sep 3, 2009, at 4:52 AM, Daniel Adriano Silva M wrote: Dear Gromacs users, (all related to GROMACS ver 4.0.x) I am facing a very strange problem on a recently acquired supermicro 8 XEON-cores nodes (2.5GHz quad-core/node, 4G/RAM with the four memory channels activated, XEON E5420, 20Gbs

Re: [gmx-users] Nonrepeatable results for gromacs 4.0.5

2009-06-07 Thread Erik Lindahl
4.9% Any ideas why I am seeing this? Here is the initial mdrun printed input info: :-) G R O M A C S (-: Groningen Machine for Chemical Simulation :-) VERSION 4.0.5 (-: Written by David van der Spoel, Erik Lindahl,

Re: [gmx-users] gromacs on itanium2

2009-03-03 Thread Erik Lindahl
Hi, There have been some reports on newer ia64 processors being quite fast with the Fortran kernels instead (even faster than asm!), so I would try that. This has to do with the brain-dead architecture on ia64. The asm kernels were written for original itanium2 timings, but with the reg

[gmx-users] Gromacs-4.0.4 released

2009-02-17 Thread Erik Lindahl
Hi, This is just a maintenance release - with fixes for a bunch of minor things that have been reported on bugzilla. Find it in the usual place: ftp://ftp.gromacs.org:/pub/gromacs/gromacs-4.0.4.tar.gz Cheers, Erik PS: We haven't forgotten those of you that like binaries, but we're rework

[gmx-users] Gromacs 4.0.3 released

2009-01-18 Thread Erik Lindahl
Hi everybody, We've just released a maintenance version 4.0.3 with a bunch of bugfixes and minor enhancements - all issues in bugzilla that had been settled to be bugs have been fixed. You can find it in the usual place: ftp://ftp.gromacs.org/pub/gromacs/gromacs-4.0.3.tar.gz Some of the f

[gmx-users] Gromacs-4.0.2 is out

2008-11-10 Thread Erik Lindahl
Hi, I think we've fixed all minor issues with Gromacs-4.0 now; please download the new release from ftp://ftp.gromacs.org/pub/gromacs/gromacs-4.0.2.tar.gz Now, a friend of order might ask what happened to 4.0.1 that appeared on the ftp site for a couple of hours on friday? Unfortunately

[gmx-users] Last chance to report bugs if you want them fixed in 4.0.1 :-)

2008-10-15 Thread Erik Lindahl
Hi, As expected there was one or two minor issues with 4.0; I'm planning to create a 4.0.1 release over the weekend or early next week, after which we'll also do binary packages. So, you have one or two days left to report 4.0 hickups to bugzilla and have them fixed rapidly! Cheers, Er

Re: [gmx-users] Running MD only for selected part of molecule

2008-10-12 Thread Erik Lindahl
Hi, On Oct 10, 2008, at 12:46 PM, Justin A. Lemkul wrote: I want to run MD over a part of my molecule , for few residues only (not the whole molecule). Can I do it using GROMACS ? I searched for the online documentation and mailing list, but unable to get appropriate information. If somebo

Re: [gmx-users] Continuing a crashed run when running simulation in parts

2008-10-12 Thread Erik Lindahl
Hi, In Gromacs 3.3 and earlier that depended on whether you were writing velocities & coordinates to the full precision trajectory - we can only restart from a frame where we have that. Starting with Gromacs 4, runs are automatically checkpointed every 15 minutes (by default). Then you ca
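The automatic checkpointing described above can be used to continue a crashed run. A minimal sketch for a GROMACS 4.x mdrun; the topology name and -deffnm prefix are illustrative, not from the original message:

```shell
# Original run; by default mdrun writes a checkpoint file every 15 minutes
# (with -deffnm md it is named md.cpt).
mdrun -s topol.tpr -deffnm md

# After a crash, continue from the last checkpoint:
mdrun -s topol.tpr -deffnm md -cpi md.cpt
```

Because the checkpoint holds the full precision state, the continuation does not depend on how often coordinates were written to the trajectory.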

Re: [gmx-users] Installation on iMAC

2008-10-12 Thread Erik Lindahl
And if you hold on for 3-4 days we will have a binary disk image package for Gromacs 4.0 available for macs (there is already one for Gromacs 3.3.x) Cheers, Erik On Oct 10, 2008, at 12:42 PM, Justin A. Lemkul wrote: Kwee Hong wrote: Hi. I'm very new to iMAC as I've just started to use

[gmx-users] Announcing: Gromacs-4.0

2008-10-09 Thread Erik Lindahl
So, First the bad news: from now on we're probably not going to care too much about bugs in gromacs-3.3. The good news is that Gromacs 4.0 is finally & officially _released_. You can download the source code package at ftp://ftp.gromacs.org/pub/gromacs/gromacs-4.0.tar.gz We will put docum

Re: [gmx-users] Is the new version of Gromacs going to be able to do implicit water md?

2008-10-09 Thread Erik Lindahl
Hi, Not 4.0, but it's in the pipeline for 4.1. The issue is not doing it, but doing it clearly faster than explicit water MD, since water interactions are extremely optimized in Gromacs! Cheers, Erik On Oct 9, 2008, at 2:38 PM, Arthur Roberts wrote: Hi, all, I know this question might s

[gmx-users] Open position: Researcher / "Gromacs czar" (up to 5yrs)

2008-10-09 Thread Erik Lindahl
n PhD students. To apply, please submit a CV, possibly list of publications, and a brief letter (1 page) describing your background and interest to Erik Lindahl ([EMAIL PROTECTED]). Evaluation of applications will begin november 1 and continue until the position has been filled. The tentative sta

Re: [gmx-users] Problems installing GMX 4.0 RC4

2008-10-09 Thread Erik Lindahl
Hi, Great. I'll just revert to make static libraries the default for 4.0 too. Cheers, Erik On Oct 9, 2008, at 7:55 AM, Justin A. Lemkul wrote: Erik Lindahl wrote: On Oct 9, 2008, at 5:24 AM, Justin A. Lemkul wrote: To follow up just a bit more - it appears that this probl

Re: [gmx-users] Problems installing GMX 4.0 RC4

2008-10-09 Thread Erik Lindahl
On Oct 9, 2008, at 5:24 AM, Justin A. Lemkul wrote: To follow up just a bit more - it appears that this problem is isolated to the PowerPC architecture. I can compile and install RC4 on my own laptop (a recent Intel Mac) with no problems. I think it's the shared libraries in combination

Re: [gmx-users] Problems installing GMX 4.0 RC4

2008-10-09 Thread Erik Lindahl
Hi Justin, This might be due to shared libraries, which I thought always worked on OS X :-) Could you try with the option --disable-shared? I guess the world still might not be ready for default shared libraries... in that case I'll disable it. Cheers, Erik On Oct 9, 2008, at 4:36 AM,
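The workaround suggested above can be tried as follows; a minimal sketch of the autoconf invocation (the install prefix is illustrative):

```shell
# Build static libraries only, avoiding the PowerPC shared-library issue
./configure --prefix=$HOME/gromacs --disable-shared
make && make install
```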

[gmx-users] Final (?) release candidate (4) before Gromacs-4.0

2008-10-08 Thread Erik Lindahl
Hi, I think we have fixed all issues with the release candidates, and I have just put the rc4 version at ftp://ftp.gromacs.org/pub/beta/gromacs-4.0_rc4.tar.gz Please test this to check whether there are any showstopper bugs :-) Although I'm sure there are still some minor issues, our curre

Re: [gmx-users] GROMACS performance on 10 GBit/s or 20Gbit/s infiniband

2008-10-08 Thread Erik Lindahl
Hi, On Oct 6, 2008, at 6:46 AM, Himanshu Khandelia wrote: We are buying a new cluster with 8-core nodes and InfiniBand, and have a choice between 10 Gbit/s and 20 Gbit/s transfer rates between nodes. I do not immediately see the need for 20 Gbit/s between nodes, but thought it might be wor

Re: [gmx-users] uniform neutralizing plasma for PME

2008-09-30 Thread Erik Lindahl
Yes. Cheers, Erik On Sep 29, 2008, at 3:18 PM, himanshu khandelia wrote: Is there an implementation in gromacs for using a uniform neutralizing plasma with PME, to avoid use of counterions? Thank you -Himanshu ___ gmx-users mailing list gmx-u

Re: [gmx-users] Re: GROMACS and GPGPU

2008-09-30 Thread Erik Lindahl
Hi, Please search the list - all this stuff will be available in Gromacs once we have neighborlists and PME working, but that's non-trivial work ;-) Cheers, Erik On Sep 30, 2008, at 10:17 AM, Jose Duarte wrote: Hi Tiago I think this is precisely what the [EMAIL PROTECTED] people are do

[gmx-users] Gromacs-4.0, release candidate 2

2008-09-26 Thread Erik Lindahl
Hi, We've corrected some minor things, and one major: there was a pointer not always being updated in the PME loop (new optimization stuff we introduced very late, so CVS versions have been fine until 20080918) which could lead to force errors. So, please point your browsers/ftp-clients t

Re: [gmx-users] Announcing: Gromacs 4.0, release candidate 1

2008-09-22 Thread Erik Lindahl
ou in advance. --- On Sun, 9/21/08, Erik Lindahl <[EMAIL PROTECTED]> wrote: From: Erik Lindahl <[EMAIL PROTECTED]> Subject: [gmx-users] Announcing: Gromacs 4.0, release candidate 1 To: "Discussion list for GROMACS users" Date: Sunday, September 21, 2008, 11:46 PM Stockholm, Se

Re: [gmx-users] Announcing: Gromacs 4.0, release candidate 1

2008-09-22 Thread Erik Lindahl
ort bugs to us. Don't complain if your trajectories are eaten after you've moved them in the middle of a simulation, though ;-) Cheers, Erik On Sep 22, 2008, at 8:46 AM, Erik Lindahl wrote: Stockholm, September 22 2008 In a bold move today, the Gromacs developers finally decided not

[gmx-users] Announcing: Gromacs 4.0, release candidate 1

2008-09-21 Thread Erik Lindahl
Stockholm, September 22 2008 In a bold move today, the Gromacs developers finally decided not to wait for Duke Nukem Forever before releasing Gromacs 4.0, and just put out release candidate 1 together with a new manual at ftp://ftp.gromacs.org:/pub/beta/ "We realize it could be a big disa

[gmx-users] Re: CHARMM FF

2008-07-17 Thread Erik Lindahl
Hi Roland, We have it working with pdb2gmx, but not CMAP yet, but we're working on integrating that. I'll see what I can do about pushing things into CVS when I'm back from vacation in two weeks! Cheers, Erik On Thu, Jul 17, 2008 at 6:57 PM, Roland Schulz <[EMAIL PROTECTED]> wrote: > Hi all, >

Re: [gmx-users] Talk slides from the Gromacs Stanford 2008 workshop

2008-04-25 Thread Erik Lindahl
Hi, Skip the final slash (my bad): http://wiki.gromacs.org/index.php/Stanford_2008_Workhop Or just click your way from the main page. Cheers, Erik On Apr 25, 2008, at 11:54 AM, Mu Yuguang (Dr) wrote: Sorry Eric, I cannot see anything. Regards Yuguang ___

[gmx-users] Talk slides from the Gromacs Stanford 2008 workshop

2008-04-25 Thread Erik Lindahl
Hi, I'm still waiting for a couple of speakers, but I've finally put up PDF copies of the slides from workshop talks. Have a look at http://wiki.gromacs.org/index.php/Stanford_2008_Workhop/ Cheers, Erik ___ gmx-users mailing list gmx-users@grom

Re: [gmx-users] SGI installation problem

2008-03-19 Thread Erik Lindahl
lease search the archive at http://www.gromacs.org/search before posting! Please don't post (un)subscribe requests to the list. Use the www interface or send it to [EMAIL PROTECTED] Can't post? Read http://www.gromacs.org/mailing_lists/users.php Erik Lindahl <[EMA

Re: [gmx-users] SGI installation problem

2008-03-16 Thread Erik Lindahl
Hi, One option would simply be to remove make_edi from Makefile.am. Unfortunately this tool has proven to be a bit buggy in the compile stage, and you won't need it for any normal simulations. The other fix would be to create separate static variables to use in the list for reading option

[gmx-users] Postdoc/staff scientist opportunities: Gromacs & free energy

2008-03-16 Thread Erik Lindahl
-term staff scientist (5 years) or even assistant professorship appointments. Feel free to drop me a line if you're a finishing PhD student or postdoc who might be interested, and I'd be happy if you want to spread the word to your local students! Cheers

Re: [gmx-users] memory requirements

2008-02-20 Thread Erik Lindahl
e" to simulation data would be a factor ~3 higher in the latter case. Cheers, Erik Erik Lindahl <[EMAIL PROTECTED]> Backup: <[EMAIL PROTECTED]> Assistant Professor, Computational Structural Biology Center for Biomembrane Research, Dept. Biochemistry & Bioph

[gmx-users] Registration open for Stanford workshop April 7 & 8

2008-02-17 Thread Erik Lindahl
Hi, Oops - I happened to only post this to the gmx-developers list first: I have put up a somewhat primitive information & registration page about the Stanford workshop at http://www.gromacs.org/stanford2008/ As I mentioned on friday, the space is somewhat limited (30-35 participants),

Re: [gmx-users] Stanford workshop april 7-8

2008-02-17 Thread Erik Lindahl
there is any, about the workshop that will be held in Göttingen? I am outside of the countries you mentioned :) -Original Message- From: Erik Lindahl <[EMAIL PROTECTED]> To: gmx-users@gromacs.org Date: Fri, 15 Feb 2008 20:13:19 +0100 Subject: [gmx-users] Stanford workshop april 7

[gmx-users] Stanford workshop april 7-8

2008-02-15 Thread Erik Lindahl
Göttingen soon :-) From wednesday any possible remaining slots will be filled on a first- come, first-served, basis. Cheers, Erik -------- Erik Lindahl <[EMAIL PROTECTED]> Backup: <[EMAIL PROTECTED]> Assistant Professor, Computational Structural Biology Center for

Re: [gmx-users] Re: installation problem for MAC OS

2008-02-08 Thread Erik Lindahl
Hi Warner, I think this is a bug in the assembler/linker shipped with Leopard since the instructions are perfectly valid. In any case, I've worked around it in the release branch of CVS. Cheers, Erik On Feb 7, 2008, at 8:52 PM, Warner Yuen wrote: My apologies. I should have known better, b

[gmx-users] Pre-announcement: advanced Gromacs workshop at Stanford April 7-8

2008-02-08 Thread Erik Lindahl
Hi, We've thought of doing an advanced simulation/modeling workshop at Stanford for a while, and finally settled on a date, although with a bit short notice :-) In any case, we will do a semi-advanced workshop at Stanford University in the US on April 7 & 8 with room for 30-35 people. The

Re: [gmx-users] opls parameters in distribution

2007-12-09 Thread Erik Lindahl
Hi David, On Dec 9, 2007, at 3:50 AM, David Mobley wrote: All, This may be more of an OPLS question than a GROMACS question, but I am trying to determine where the OPLS parameters for alkynes (opls_925 through opls_932 in ffoplsaa.atp of the gromacs distribution) come from, and especially the

Re: [gmx-users] stop mailing me

2007-11-14 Thread Erik Lindahl
On Nov 13, 2007, at 11:56 AM, Henry O Ify wrote: please stop mailing me, i am tired of all this mail First, it was you and not we who subscribed you to the list :-) Second, you're more than welcome to unsubscribe. However, you would probably have had better luck if you had read the footer

Re: [gmx-users] Re: Where is the GROMOS87 force field when using pdb2gmx

2007-11-08 Thread Erik Lindahl
Hi, On Nov 7, 2007, at 5:40 PM, maria goranovic wrote: Thanks for the help, David. Actually, I just realized I was trying to decide based on mailing list archives which, in some cases, are 5 years old. My mistake. I will use the 43a2 field for my protein. Is there any standard procedure to com

Re: [gmx-users] time constants for the Parrinello-Rahman/nose-hoover coupling

2007-10-21 Thread Erik Lindahl
Hi, Yes, you should. As I think we write somewhere in the manual or mdp file documentation, for Berendsen coupling the time constant refers to the exponential relaxation time, while for N-H or P-R they correspond to the period of the fluctuations between the system and reservoir. Typicall
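As an illustration of the distinction above, two alternative .mdp fragments (use one or the other; the numerical values are illustrative starting points, not recommendations from the original message):

```
; Berendsen: tau_t is an exponential relaxation time
tcoupl = berendsen
tau_t  = 0.1

; Nose-Hoover: tau_t is the period of the fluctuations between
; system and reservoir, so a larger value is typical
tcoupl = nose-hoover
tau_t  = 0.5
```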

Re: [gmx-users] install problem on IBM powerpc AIX 5.3

2007-10-17 Thread Erik Lindahl
MPICC is the environment variable (upper case important). It should be set to the name of your MPI-enabled C compiler. Cheers, Erik On Oct 18, 2007, at 4:40 AM, liu xin wrote: Hi Erick you mean export MPICC=mpcc? ok, I will try that On 10/18/07, Erik Lindahl <[EMAIL PROTECTED]>
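The advice above can be sketched as follows; the compiler name and install prefix are taken loosely from the quoted thread and are illustrative:

```shell
# Upper case matters: MPICC, not mpicc.
export MPICC=mpcc          # name of the MPI-enabled C compiler on AIX (assumption)
./configure --enable-mpi --prefix=/hpc/gromacsmpi
make && make install
```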

Re: [gmx-users] install problem on IBM powerpc AIX 5.3

2007-10-17 Thread Erik Lindahl
Hi, On Oct 17, 2007, at 7:13 PM, liu xin wrote: Thanks for your quick comment David but if I tried ./configure --enable-mpi --prefix=/hpc/gromacsmpi it will complain about cant find MPI compiler, but I've already export mpcc=mpicc Try setting MPICC instead :-) Cheers, Erik

[gmx-users] Postdoc position available (docking, free energy, drug design)

2007-10-15 Thread Erik Lindahl
the opportunity to invite everybody else to post your gromacs-, or at least MD-, related job ads on the list. Most of us have been in the job-hunting stage even if it was a while ago! Cheers, Erik Lindahl ___ gmx-users mailing list g

Re: [gmx-users] errores during compilation of GROMACS 3.3.2

2007-10-15 Thread Erik Lindahl
] on behalf of Erik Lindahl Sent: Mon 10/15/2007 19:08 To: Discussion list for GROMACS users Subject: Re: [gmx-users] errores during compilation of GROMACS 3.3.2 Hi, Could you try without optimization flags? Go to src/gmxlib/nonbonded/ nb_kernel_ia64_double and type make

Re: [gmx-users] errores during compilation of GROMACS 3.3.2

2007-10-15 Thread Erik Lindahl
blems with the intel one as well. Best, Itamar -Original Message- From: [EMAIL PROTECTED] on behalf of Erik Lindahl Sent: Mon 10/15/2007 17:02 To: Discussion list for GROMACS users Subject: Re: [gmx-users] errores during compilation of GROMACS 3.3.2 Hi Itamar, Not immediately. Did this

Re: [gmx-users] errores during compilation of GROMACS 3.3.2

2007-10-15 Thread Erik Lindahl
Hi Itamar, Not immediately. Did this work fine with 3.3.1? As far as I know we haven't changed anything in the ia64 assembly routines. Cheers, Erik On Oct 15, 2007, at 2:16 AM, Itamar Kass wrote: Dear all, I am trying to compile GROMACS 3.3.2 on Altix 3700 BX2 (Itanium2 1.6GHz). I am d

Re: [gmx-users] Tool for visualizing disulphide bonds

2007-10-14 Thread Erik Lindahl
Hi, On Oct 13, 2007, at 6:12 PM, Yang Ye wrote: Program like VMD can draw extra bond. You need to get its manual and code a line or two in TCL. Regards, Yang Ye On 10/14/2007 11:38 PM, van Bemmelen wrote: Hi Ozge, I'm not sure what you mean by "visualization". But what about using VMD f

Re: [gmx-users] ngmx error messages

2007-10-14 Thread Erik Lindahl
Hi, On Oct 13, 2007, at 6:04 PM, Sagittarius wrote: I already have some kind of cygwin. Could you please provide me with more details on "install cygwin with some kind of X server" Thank you in advance Actually, if you don't absolutely need ngmx you can just disable the X11-dependent parts

Re: [gmx-users] tabulated potential - why table length is rc+1nm

2007-10-12 Thread Erik Lindahl
Hi, On Oct 12, 2007, at 7:54 PM, Nickle Fan wrote: Dear gmx-users: I am working on incorporating a customized non-bonded interaction into my simulation. I am trying to implementing it through the tabulated interaction potential. The manual says that the potential should be tabulated up t

Re: [gmx-users] 3.3.2 Error when using the mpi option

2007-10-09 Thread Erik Lindahl
Hi, On Oct 9, 2007, at 6:21 PM, Triguero, Luciano O wrote: Hi, I am trying to compile version 3.3.2 in the parallel mode. I use the following options to configure: ./configure --prefix=dir --disable-float -enable-mpi and receive the following error message: checking size of int... configu

Re: [gmx-users] IO problem on Cray XT3 MPP

2007-10-05 Thread Erik Lindahl
HI, On Oct 5, 2007, at 10:04 PM, David van der Spoel wrote: vijaya subramanian wrote: Hi I have been running GROMACS jobs on a Cray XT3 MPP machine. I get limited time, 12 hr, for any given run and find that the .edr file and .log files do not get written to continuously. This is ok if

Re: [gmx-users] New single thread performance numbers

2007-09-17 Thread Erik Lindahl
Hi Jones, On Sep 17, 2007, at 6:25 PM, Jones de Andrade wrote: Still under NDA? Haven't it been lift last sunday? Well, I guess it's because this was pre-release hardware, so it didn't have any expiry date :-) Anyway: I don't know exactly the terms of the NDA, So if you can't answer me

Re: [gmx-users] New single thread performance numbers

2007-09-17 Thread Erik Lindahl
Hi, Yes, the SPEC numbers aren't representative for typical gromacs performance. Technically I'm still on NDA for the benchmarks I ran on Barcelona so I can't share the exact numbers with you, but on average the performance will be pretty much the same _per_clock_ as Intel Clovertown. A

Re: [gmx-users] How to Install GMX V3.3.1 with Intel fortran compiler?

2007-09-13 Thread Erik Lindahl
Hi, Well, I have a much simpler solution - there is no reason whatsoever to use fortran on ia32 or x86-64 since Gromacs will always use the much faster SSE assembly loops. Cheers, Erik On Sep 13, 2007, at 8:44 AM, Diego Enry wrote: This works for me: % vim /etc/ld.conf /opt/intel/mkl

[gmx-users] Re: Measure contour area of bilayer undulation

2007-09-01 Thread Erik Lindahl
Hi, On Aug 31, 2007, at 11:02 PM, Hwankyu Lee wrote: Thank you very much for your suggestion in the gromacs forum, but I may not understand your explanation completely. In the small bilayer, there is no fluctuation, so area/lipid can be calculated based on the cell dimensions. But, in

Re: [gmx-users] Measure contour area of bilayer undulation

2007-08-31 Thread Erik Lindahl
Hi, In principle you can calculate this from equations e.g. in Safran's book "Statistical Thermodynamics of Surfaces, Interfaces, and Membranes". However, when we worked with this a few years ago we ended up in the conclusion that for the properties we were interested in, the "effective

Re: [gmx-users] Clovertown vs Woodcrest gromacs experiences?

2007-08-27 Thread Erik Lindahl
, Ansgar Esztermann wrote: On Fri, Aug 24, 2007 at 09:56:32AM +0200, Erik Lindahl wrote: Hi, Quad-core is the way to go. I recently benchmarked the scaling of the CVS version, and with 8 independent jobs we get 85-97% throughput scaling, depending on the type of simulation. And you get

Re: [gmx-users] Clovertown vs Woodcrest gromacs experiences?

2007-08-24 Thread Erik Lindahl
Hi, Quad-core is the way to go. I recently benchmarked the scaling of the CVS version, and with 8 independent jobs we get 85-97% throughput scaling, depending on the type of simulation. And you get essentially the same performance if you run two jobs using 4 cores each. Even for a single

[gmx-users] Re: Gromacs for AMD Opteron

2007-08-21 Thread Erik Lindahl
Hi, We've supported x86-64 since 2002 or so. There are already handcoded x86-64 kernels in the corresponding x86_64_sse directory. This is just a matter of your system not being recognized correctly as x86-64, but ia32 for some reason. First make sure that you are really setting -m64 in C
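Setting the flags as suggested above might look like this before re-running configure (the exact optimization flags are illustrative assumptions):

```shell
# Force 64-bit compilation so configure detects the system as x86-64
export CFLAGS="-O3 -m64"
export CXXFLAGS="-O3 -m64"
./configure
```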

Re: [gmx-users] nstlist with vdw only

2007-08-21 Thread Erik Lindahl
Hi, On Aug 21, 2007, at 8:08 AM, Nicolas Schmidt wrote: Wouldn't this mean that molecules could suddenly appear and lead to a discontinuity in the LJ-potential? Or in what way should I modify the r_vdw according to what rlist? I want to end with a cut-off of the LJ-potential around 1.75nm.

Re: [gmx-users] PME or SPME?

2007-08-21 Thread Erik Lindahl
And the reason for this is that nobody has really used Lagrangian-interpolation PME since smooth PME appeared in 1995 :-) Cheers, Erik On Aug 21, 2007, at 4:47 AM, Mark Abraham wrote: Jones de Andrade wrote: Hi all. Well, I have some kind of a "didactic" doubt: Ok, from what I can underst

Re: [gmx-users] nstlist with vdw only

2007-08-20 Thread Erik Lindahl
Hi Nicolas, nstlist=1 means you recalculate the neighborlist every single step, which will be quite expensive. Depending on the type of system and temperature you are simulating you probably want to start somewhere around nstlist=10. Cheers, Erik On Aug 20, 2007, at 10:26 PM, Nicolas
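A minimal .mdp sketch of the advice above (the values other than nstlist are illustrative assumptions, not from the original message):

```
; Update the neighborlist every 10 steps instead of every step
nstlist  = 10
ns_type  = grid
rlist    = 1.0   ; neighborlist cut-off (nm); must cover the interaction cut-offs
```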

[gmx-users] Re: gromacs-gaussian interface compilation - HOW-TO?

2007-08-20 Thread Erik Lindahl
Hi, Please post questions like this to the mailing list, so the answers make it to the archive. For QM/MM, have a look at http://www.mpibpc.mpg.de/groups/grubmueller/start/people/ggroenh/ qmmm.html Unfortunately we are not even allowed to distribute a "diff" file with modifications to G

Re: [gmx-users] how to build up fixed connections? how to simulate no electrostatics?

2007-08-15 Thread Erik Lindahl
Hi Jeroen, Just set the charges to zero - then the program will automatically detect it and use LJ-only loops everywhere. Cheers, Erik On Aug 15, 2007, at 8:12 PM, van Bemmelen wrote: Hi all, As a side note: although I also have some trouble figuring out what Nicolas Schmidt is exactly t
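As an illustration of the advice above, a hypothetical topology fragment; atom and residue names are made up, and only the zeroed charge column reflects the actual suggestion:

```
; With every charge set to 0.000, mdrun detects the absence of
; electrostatics and uses LJ-only kernels automatically.
[ atoms ]
;  nr  type  resnr  residue  atom  cgnr  charge   mass
    1  CH4       1      MOL    C1     1   0.000  16.043
```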

Re: [gmx-users] Suggestion needed for new workstation

2007-08-10 Thread Erik Lindahl
Hi, On Aug 10, 2007, at 4:37 PM, Florian Haberl wrote: option if you're doing really short (minutes ~ hours) simulations. The configuration: *Two 3.0GHz Quad-Core Intel Xeon * *8GB (4 x 2GB) * *500GB 7200-rpm Serial ATA 3Gb/s RAM size shouldn't be an issue, except if you're simulating huge

Re: [gmx-users] publishing with development version of Gromacs

2007-08-09 Thread Erik Lindahl
Hi, As long as you have the exact date you checked it out from CVS it is possible to obtain the same version again. However, bear in mind that CVS _does_ break periodically, and we don't check it like the released version. Before I do anything production-related with CVS I always check th

Re: [gmx-users] Continuation diferent machine.

2007-08-07 Thread Erik Lindahl
t??? So I could continue the job in the new machine??? Best Regards, Anthony On Tuesday 07 August 2007 5:25 pm, Erik Lindahl wrote: Hi Anthony, As long as the version you're continuing with is the same or more recent than the one you started with it should work fine; all gromacs output files ar

Re: [gmx-users] Continuation diferent machine.

2007-08-07 Thread Erik Lindahl
Hi Anthony, As long as the version you're continuing with is the same or more recent than the one you started with it should work fine; all gromacs output files are stored in portable formats and can be read by newer versions. You are not guaranteed _binary_ identical results, though

Re: [gmx-users] Timestep

2007-08-07 Thread Erik Lindahl
Hi, In principle Gromacs should never just crash with a segmentation fault, but at least give you a (perhaps cryptic) error message and exit somewhat gracefully. As far as I know there is only one exception to this: If you are using tabulated interactions the table can only be of finite s
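The finite table size Erik mentions is controlled from the .mdp file; a hedged sketch (the parameter name is the one documented in the GROMACS manual, the value is illustrative):

```
; extend the non-bonded interaction tables 1 nm beyond the longest
; cutoff, so pairs that drift slightly past the cutoff between
; neighborlist updates still fall inside the tabulated range
table-extension = 1
```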

Re: [gmx-users] how to run parallel jobs in gromacs

2007-08-06 Thread Erik Lindahl
Consult the documentation for your MPI library and queue system. This is not a Gromacs error message - mpirun is complaining that there aren't enough processors to start 4 jobs. Cheers, Erik On Aug 6, 2007, at 1:14 PM, Anupam Nath Jha wrote: i have tried that also but still same error i
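The division of labour Erik describes — the process count goes to mpirun, and GROMACS detects it — might look like this for a 4-core run (binary and file names follow the GROMACS 3.x conventions used in this thread and are assumptions about the local setup):

```
# ask the MPI library for 4 processes; mdrun picks up the count itself
mpirun -np 4 mdrun -v -deffnm topol
```

If mpirun complains that there aren't enough processors, the fix is in the MPI hostfile or the queue-system request, not in the mdrun command line.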

Re: [gmx-users] how to run parallel jobs in gromacs

2007-08-06 Thread Erik Lindahl
The -np flag should be specified to the mpirun command (gromacs determines it automatically). It could also be a problem with your MPI library or queue system - I have no idea why it is trying to start 3 processes... Cheers, Erik On Aug 6, 2007, at 12:46 PM, Anupam Nath Jha wrote: Dear

Re: [gmx-users] Parallel Gromacs Benchmarking with Opteron Dual-Core & Gigabit Ethernet

2007-07-24 Thread Erik Lindahl
Hi, I agree with you about the GbE sharing between 4 cores degrades the performance. Fortunately, every Cluster node has two GbE ports. I want to know, can I configure lamd in such a manner that every processor on every node (with two cores) uses one of these ports for its communication purpo

Re: [gmx-users] Parallel Gromacs Benchmarking with Opteron Dual-Core & Gigabit Ethernet

2007-07-23 Thread Erik Lindahl
Hi, I read at Gmx site that the DPPC system composed of 121,856 atoms. I saw the gmx topology files, it seems that Gmx makes data decomposition on input data to run in parallel (in our simulation case using "-np 12" for execution on 3 nodes, the data space for every process is about 10156 ato

Re: [gmx-users] Parallel Gromacs Benchmarking with Opteron Dual-Core & Gigabit Ethernet

2007-07-22 Thread Erik Lindahl
Hi, On Jul 22, 2007, at 6:08 PM, Kazem Jahanbakhsh wrote: mpirun -np 8 mdrun_d -v -deffnm grompp First, when you run in double precision you will communicate exactly twice as much data. Since gigabit ethernet is usually both latency and bandwidth-limiting, you might get better scaling (a

Re: [gmx-users] CVS code needed

2007-07-22 Thread Erik Lindahl
I put a recent copy (friday) at http://lindahl.sbc.su.se/outgoing/aatto/ Cheers, Erik And yes... we should get a nightly-snapshot system running... :-) On Jul 22, 2007, at 4:08 PM, Kumar V wrote: Hai all, Our campus network is behind firewall because of which I can't use CVS pse

Re: [gmx-users] FW: rvdw

2007-07-20 Thread Erik Lindahl
Hi Jian, The cutoff is based on the center of geometry of the "charge group" definitions in your topology. Since we typically use PME nowadays they don't have to be neutral, so "neighborsearching group" would probably be a better name. If each atom is a separate charge group the cutoff is
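The "charge group" Erik refers to is the cgnr column of the topology's [ atoms ] section; a hypothetical fragment putting each atom in its own group, so the cutoff is effectively applied per atom (numbers shown are SPC-like and purely illustrative):

```
[ atoms ]
;  nr  type  resnr  res  atom  cgnr  charge    mass
    1  OW    1      SOL  OW    1    -0.834   16.000
    2  HW    1      SOL  HW1   2     0.417    1.008
    3  HW    1      SOL  HW2   3     0.417    1.008
; giving several atoms the same cgnr would instead group them, and the
; cutoff would be measured from the group's center of geometry
```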

Re: [gmx-users] Implicit solvent simulation

2007-07-19 Thread Erik Lindahl
Hi, On Jul 19, 2007, at 10:15 AM, Edgar Luttmann wrote: I can't find any usage of the GB parameters as read from the mdp file - neither in the latest release nor the HEAD revision in the CVS. Is there another branch in the CVS you got that code in? No, at least not publicly available atm.

Re: [gmx-users] Re: Install problem

2007-07-19 Thread Erik Lindahl
Hi, On Jul 19, 2007, at 12:52 AM, Ibrahim M. Moustafa wrote: Hi Jhon, I have the same issue with GROMACS on Mac PPC 10.4. The program executes only with the full path specified. I posted a similar question before, you can check the archive, but got no solution for the problem. What I di
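A program that only runs with its full path spelled out usually means the install directory is missing from PATH; a hedged sketch of the usual shell fix (the prefix is an assumption — substitute wherever GROMACS was actually installed):

```
# append the GROMACS binary directory (path is illustrative) to PATH;
# put this in ~/.profile or ~/.bashrc to make it permanent
export PATH="$PATH:/usr/local/gromacs/bin"
```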

Re: [gmx-users] Implicit solvent simulation

2007-07-19 Thread Erik Lindahl
Although I realize it is the holiday season, at least in Europe, I would like to once more recommend the development wiki for documenting this kind of discussion, and for publishing specifications for algorithms, like (totally unrelated, but came up earlier today): http://wiki.gromacs.

Re: [gmx-users] Re: Parallel Use of gromacs - parallel run is 20 times SLOWER than single node run (Jim Kress)

2007-07-18 Thread Erik Lindahl
Hi Jim, That system should scale quite well with a fast network (infiniband) since you don't seem to be using PME, and the CVS 3.99 version is even better. However, as David already mentioned, the gigabit network is likely limiting you. If you test the CVS code there are timing reports th
