Re: [gmx-users] GPU-gromacs

2013-10-25 Thread Carsten Kutzner

Re: [gmx-users] Output pinning for mdrun

2013-10-24 Thread Carsten Kutzner
tools that check and output the process placement for a dummy parallel job, or environment variables like MP_INFOLEVEL for LoadLeveler. Thanks! Carsten

[gmx-users] Output pinning for mdrun

2013-10-24 Thread Carsten Kutzner
Hi, can one output how mdrun threads are pinned to CPU cores? Thanks, Carsten
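
A minimal sketch of checking this from the shell on a Linux node while mdrun is running (ps/taskset are generic Linux tools, not GROMACS output; the process name mdrun and a single running instance are assumptions):

# list every mdrun thread and the core (PSR) it last ran on
ps -Lp $(pidof mdrun) -o pid,lwp,psr,comm
# show the affinity mask actually set for the process
taskset -cp $(pidof mdrun)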

Re: [gmx-users] parallelization

2013-10-17 Thread Carsten Kutzner

Re: [gmx-users] MPI runs on a local computer

2013-09-20 Thread Carsten Kutzner

Re: [gmx-users] performance issue with the parallel implementation of gromacs

2013-09-19 Thread Carsten Kutzner
[fragment of the g_tune_pme benchmark table] Best performance was achieved with 16 PME nodes (see line 2) and original PME settings.

Re: [gmx-users] question about installation parameters

2013-09-16 Thread Carsten Kutzner
Hi, On Sep 16, 2013, at 11:23 AM, mjyang wrote: > Dear GMX users, > > > I have a question about the combination of the installation parameters. I > compiled the fftw lib with --enable-sse2 and configured the gromacs with > "cmake .. -DGMX_CPU_ACCELERATION=SSE4.1". I'd like to know if it
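
For reference, a minimal build sketch matching the combination described above (install paths and -j values are placeholders; --enable-float assumes a single-precision GROMACS build):

# FFTW with SSE2 kernels, single precision
./configure --enable-sse2 --enable-float --prefix=$HOME/opt/fftw
make -j 4 && make install
# GROMACS 4.6 with SSE4.1 acceleration, pointed at that FFTW installation
cmake .. -DGMX_CPU_ACCELERATION=SSE4.1 -DCMAKE_PREFIX_PATH=$HOME/opt/fftw
make -j 4 && make install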

Re: [gmx-users] Umbrella sampling simulations using make_edi with -restrain and -harmonic

2013-09-10 Thread Carsten Kutzner

Re: [gmx-users] Rotation Constraints - PMF - external potential

2013-07-26 Thread Carsten Kutzner
s" structures), and I'd like make an Umbrella Sampling >>> calculation in order to study the PMF varying the distance between A and B. >>>>> >>>>> My problem is that I'd like fix the orientation of the axis of each >>> structure A a

Re: [gmx-users] Rotation Constraints - PMF - external potential

2013-07-25 Thread Carsten Kutzner
e in the z direction. Is this possible? If so, could you tell me how to do that? Thank you very much, Anna

Re: [gmx-users] Rotation Constraints - PMF + rerun

2013-07-24 Thread Carsten Kutzner
Is this possible? If so, could you tell me how to do that? Thank you very much, Anna

Re: [gmx-users] Rotation Constraints - PMF

2013-07-23 Thread Carsten Kutzner

Re: [gmx-users] Intel compiling failed

2013-04-05 Thread Carsten Kutzner
On Apr 5, 2013, at 4:21 PM, Albert wrote: > On 04/05/2013 12:38 PM, Carsten Kutzner wrote: >> Hi Albert, >> >> one reason for the error you see could be that you are using a non-Intel >> MPI compiler wrapper. I think you need to specify MPICC=mpiicc as well. >>

Re: [gmx-users] Intel compiling failed

2013-04-05 Thread Carsten Kutzner
On Apr 5, 2013, at 12:52 PM, Justin Lemkul wrote: > > > On 4/5/13 6:38 AM, Carsten Kutzner wrote: >> Hi Albert, >> >> one reason for the error you see could be that you are using a non-Intel >> MPI compiler wrapper. I think you need to specify MPICC=mpii

Re: [gmx-users] Intel compiling failed

2013-04-05 Thread Carsten Kutzner

Re: [gmx-users] 4.6.1 support double precision GPU now?

2013-04-02 Thread Carsten Kutzner
On Apr 2, 2013, at 5:47 PM, Albert wrote: > Hello: > > I am wondering is double precision supported in current 4.6.1 GPU version? > Otherwise it would be very slow to use CPU version running free energy > calculations…. Hi Albert, no, GPU calculations can be done only in single precision. B

Re: [gmx-users] g_tune_pme can't be executed

2013-03-21 Thread Carsten Kutzner

Re: [gmx-users] Mismatching number of PP MPI processes and GPUs per node

2013-03-11 Thread Carsten Kutzner

Re: [gmx-users] Problem with OpenMP+MPI

2013-02-27 Thread Carsten Kutzner

Re: [gmx-users] compiling on different architecture than the compute nodes architecture

2013-02-06 Thread Carsten Kutzner
Hi, On Feb 6, 2013, at 6:03 PM, Richard Broadbent wrote: > Dear All, > > I would like to compile gromacs 4.6 to run with the correct acceleration on > the compute nodes on our local cluster. Some of the nodes have intel > sandy-bridge whilst others only have sse4.1 and some (including the lo
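
One common approach is to build one binary per instruction set and select the matching one in the job script; a sketch (install prefixes are placeholders, and AVX_256 is an assumption for the Sandy Bridge nodes):

# build for the SSE4.1-only nodes
cmake .. -DGMX_CPU_ACCELERATION=SSE4.1 -DCMAKE_INSTALL_PREFIX=$HOME/gromacs-sse41
make -j 8 install
# separate build tree for the Sandy Bridge nodes
cmake .. -DGMX_CPU_ACCELERATION=AVX_256 -DCMAKE_INSTALL_PREFIX=$HOME/gromacs-avx
make -j 8 install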

Re: [gmx-users] voltage for membrane?

2012-12-24 Thread Carsten Kutzner
On Dec 23, 2012, at 11:23 PM, Martin Hoefling wrote: > You can have a look at http://www.ncbi.nlm.nih.gov/pubmed/21843471 , > maybe that does what you want. On http://www.mpibpc.mpg.de/grubmueller/compel you will find installation instructions for the special gromacs version that support the ab

Re: [gmx-users] Essential dynamics (ED) sampling using make_edi

2012-12-12 Thread Carsten Kutzner

Re: [gmx-users] g_tune_pme for multiple nodes

2012-12-04 Thread Carsten Kutzner
mally fastest unless you could have all PME processes exclusively on a single node. Carsten

Re: [gmx-users] g_tune_pme for multiple nodes

2012-12-04 Thread Carsten Kutzner
> tuned for 24 ppn spanning across the two nodes. > Chandan

Re: [gmx-users] g_tune_pme for multiple nodes

2012-11-29 Thread Carsten Kutzner
e > $g_tune_pme_4.5.5 -np 12 -s md0-200.tpr -c tune.pdb -x tune.xtc -e tune.edr -g tune.log -nice 0 > g_tune_pme executes on the head node and writes various files. > Kindly let me know what am I missing when I submit through qsub. > Thanks > Chandan
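
A sketch of a batch script that runs the tuning on the compute nodes instead of the head node (the PBS directives, node counts, and the MDRUN/MPIRUN variables are assumptions about the local setup; g_tune_pme is expected to pick up the mdrun and mpirun commands from those variables):

#!/bin/bash
#PBS -l nodes=2:ppn=12
#PBS -l walltime=02:00:00
cd $PBS_O_WORKDIR
export MPIRUN=$(which mpirun)      # assumed to be read by g_tune_pme
export MDRUN=$(which mdrun_mpi)
g_tune_pme_4.5.5 -np 24 -s md0-200.tpr -launch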

Re: [gmx-users] Re: Question about scaling

2012-11-13 Thread Carsten Kutzner
> 1) Forces > 2) Neighbor search (ok, going from 2 cores to 4 cores does not make a big difference, but from 1 core to 2 or 4 saves much time) > For GMX 4.0.7 it looks similar, whereas the difference between 2 and 4 cores is not as high as for GMX 4.5.5 > Is there an

Re: [gmx-users] Question about scaling

2012-11-12 Thread Carsten Kutzner

Re: [gmx-users] GROMACS with different gcc and FFT versions but one unique *tpr file

2012-11-08 Thread Carsten Kutzner

Re: [gmx-users] Regarding g_tune_pme optimization

2012-11-01 Thread Carsten Kutzner

Re: [gmx-users] too much warnings and notes

2012-10-29 Thread Carsten Kutzner
Hi, On Oct 29, 2012, at 3:57 PM, Albert wrote: > On 10/29/2012 03:56 PM, Carsten Kutzner wrote: >> Hi, >> >> find the reason for the warnings in your mdp file settings >> and adjust them accordingly. >> >> You can also override the warnings
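
A minimal example of overriding grompp warnings (file names are placeholders; only raise -maxwarn after verifying that the warnings are harmless for your system):

grompp -f md.mdp -c conf.gro -p topol.top -o md.tpr -maxwarn 3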

Re: [gmx-users] too much warnings and notes

2012-10-29 Thread Carsten Kutzner

Re: [gmx-users] Ion conduction through a protein-membrane system

2012-10-02 Thread Carsten Kutzner

Re: [gmx-users] how to optimize performance of IBM Power 775?

2012-09-03 Thread Carsten Kutzner

Re: [gmx-users] g_tune_pme for multiple nodes

2012-09-03 Thread Carsten Kutzner

Re: [gmx-users] g_tune_pme cannot be executed

2012-09-03 Thread Carsten Kutzner

Re: [gmx-users] how to run g_tune_pme in cluster?

2012-04-26 Thread Carsten Kutzner
h > On 04/26/2012 09:53 AM, Carsten Kutzner wrote: >> Hi, what output does g_tune_pme provide? What is in "log" and in "perf.out"? >> Can it find the correct mdrun / mpirun executables? >> Carsten

Re: [gmx-users] how to run g_tune_pme in cluster?

2012-04-26 Thread Carsten Kutzner
Hi, what output does g_tune_pme provide? What is in "log" and in "perf.out"? Can it find the correct mdrun / mpirun executables? Carsten On Apr 26, 2012, at 9:28 AM, Albert wrote: > Hello: > Does anybody have any idea how to run g_tune_pme in a cluster? I tried many > times with following co

Re: [gmx-users] Error: 4095 characters, fgets2 has size 4095

2012-04-10 Thread Carsten Kutzner
Hi Steven, you might have to remove files with weird names (such as ._name) in the directory where you run grompp or in your forcefield directory. Carsten On Apr 10, 2012, at 11:52 AM, Steven Neumann wrote: > Dear Gmx Users, > > It is 1st time I came across such problem. While preparing my
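
A quick way to spot such AppleDouble leftovers from the shell (pointing at the force-field directory via $GMXLIB is an assumption; adjust the path to wherever your force-field files live):

# look for hidden ._* files in the working directory and the force-field directory
find . -maxdepth 1 -name '._*'
find "$GMXLIB" -name '._*'
# remove them once you are sure they are only metadata leftovers
find . -maxdepth 1 -name '._*' -delete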

Re: [gmx-users] another g_tune_pme problem

2012-04-02 Thread Carsten Kutzner
On Apr 1, 2012, at 8:01 PM, Albert wrote: > Hello: > I am trying to test g_tune_pme on a workstation with the command: > g_tune_pme_d -v -s md.tpr -o bm.trr -cpi md.cpt -cpo bm.cpt -g bm.log -launch -nt 16 & > but it stopped immediately with the following logs. I compiled gromacs with a -d > in e

Re: [gmx-users] Scaling/performance on Gromacs 4

2012-02-20 Thread Carsten Kutzner
Hi Sara, my guess is that 1500 steps are not at all sufficient for a benchmark on 64 cores. The dynamic load balancing will need more time to adapt the domain sizes for optimal balance. It is also important that you reset the timers when the load is balanced (to get clean performance numbers);
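
A sketch of a benchmark invocation that resets the cycle counters once load balancing has settled (step number and file names are placeholders, assuming the -resethway/-resetstep counter-reset options of mdrun 4.5+):

# reset the timers halfway through the run ...
mdrun -s bench.tpr -resethway -noconfout -g bench.log
# ... or at an explicit step, once DLB has converged
mdrun -s bench.tpr -resetstep 5000 -noconfout -g bench.log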

Re: [gmx-users] Is there a way to omit particles with, q=0, from Coulomb-/PME-calculations?

2012-01-17 Thread Carsten Kutzner
Hi Thomas, Am Jan 17, 2012 um 10:29 AM schrieb Thomas Schlesier: > But would there be a way to optimize it further? > In my real simulation i would have a charged solute and the uncharged solvent > (both have nearly the same number of particles). If i could omit the > uncharged solvent from the

Re: [gmx-users] modify the gromacs4.5.5 code: using cout

2011-11-30 Thread Carsten Kutzner
Use fprintf(stdout, "…"); Carsten On Nov 30, 2011, at 12:27 PM, 杜波 wrote: > Dear teacher, > I want to modify the gromacs 4.5.5 code; can I use the function "cout" which is introduced in C++? > I added the code > #include > #include > at the head of md.c > but when I mak

Re: [gmx-users] do_dssp segmentation fault

2011-11-23 Thread Carsten Kutzner
, do a ./bootstrap before the configure step. Best, Carsten > Bests > From: Carsten Kutzner > To: Alex Jemulin; Discussion list for GROMACS users > Sent: Tuesday, 22 November 2011 13:25 > Subject: Re: [gmx-users] do_dssp segmentation fault > Dear Alex,

Re: [gmx-users] do_dssp segmentation fault

2011-11-22 Thread Carsten Kutzner
Dear Alex, On Nov 22, 2011, at 9:28 AM, Alex Jemulin wrote: > Dear all > I'm experiencing the following error in Gromacs 4.5 with do_dssp > > Here is the command > do_dssp -f md.xtc -s md.tpr -o secondary-structure.xpm -sc > secondary-structure.xvg -dt 10 > > give me the following error > se

Re: [gmx-users] suggestion that mdrun should ensure npme < the numberof processes

2011-08-17 Thread Carsten Kutzner
Hi, On Aug 17, 2011, at 1:24 AM, wrote: > Currently, gromacs4.5.4 gives a segfault if one runs mpirun -np 8 mdrun_mpi > -npme 120 with no warning of the source of the problem. > > Obviously npme>nnodes is a bad setup, but a check would be nice. cr->npmenodes is set in mdrun.c right after the

Re: [gmx-users] g_tune_pme

2011-07-28 Thread Carsten Kutzner

Re: [gmx-users] GROMACS 4.5.1 mdrun re-compile for MPI

2011-03-24 Thread Carsten Kutzner

Re: [gmx-users] New maintenance release: gromacs-4.5.4

2011-03-22 Thread Carsten Kutzner
Hi, try to add the --disable-shared flag to your invocation of ./configure. Carsten On Mar 22, 2011, at 3:26 PM, Ye MEI wrote: > Thank you for the new version of gromacs. > But the compilation of gromacs failed on my computer. The commands are as follows: > make distclean > export CC=icc > e
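
In the context of the icc build quoted above, a minimal sketch of where the flag goes (the compiler choice beyond CC=icc and the install prefix are assumptions):

make distclean
export CC=icc
export CXX=icpc                    # assumed matching Intel C++ compiler
./configure --disable-shared --prefix=$HOME/opt/gromacs
make && make install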

Re: [gmx-users] Performance in ia64 and x86_64

2011-02-25 Thread Carsten Kutzner
Hello Ignacio, On Feb 25, 2011, at 10:25 AM, Ignacio Fernández Galván wrote: > Well, I've compiled mdrun with MPI (with fortran kernels in the ia64), and > run > my test system in both machines, with a single processor. The results are > still > worrying (to me). This is a 50 time step (0.

Re: [gmx-users] GROMACS installation query

2011-02-23 Thread Carsten Kutzner
On Feb 23, 2011, at 6:16 AM, Tom Dupree wrote: > Greetings all, > > I am new to Linux and wish to confirm some facts before I press on with the > installation. > > In the installation guide, > http://www.gromacs.org/Downloads/Installation_Instructions > There is a line saying “...Where as

Re: [gmx-users] configure: error: cannot compute sizeof (off_t)

2011-02-21 Thread Carsten Kutzner

Re: [gmx-users] Performance in ia64 and x86_64

2011-02-11 Thread Carsten Kutzner

Re: [gmx-users] Re:General MD question

2011-02-02 Thread Carsten Kutzner
On Feb 2, 2011, at 11:48 AM, lloyd riggs wrote: > Dear Carsten Kutzner, > > First off, thanks. I did not specify it in the input md.mdp file, but when I > looked at the generated out.mdp it had a Linear center of mass removal for > the groups [system]. > > When I add

Re: [gmx-users] segmentation fault: g_velacc

2011-02-01 Thread Carsten Kutzner
Hi Vigneshwar, the problem is fixed now in the release-4-0-patches branch. Carsten On Feb 1, 2011, at 2:00 PM, Carsten Kutzner wrote: > Hi, > > apparently this bug fix made it to 4.5, but not to 4.0. > I will apply the fix also there. > > Carsten > > On Feb 1, 20

Re: [gmx-users] segmentation fault: g_velacc

2011-02-01 Thread Carsten Kutzner

Re: [gmx-users] General MD question

2011-02-01 Thread Carsten Kutzner

Re: [gmx-users] Re: Gromacs + GPU: Problems running dppc example in ftp://ftp.gromacs.org/pub/benchmarks/gmxbench-3.0.tar.gz

2011-01-27 Thread Carsten Kutzner
Hi Camilo, On Jan 27, 2011, at 7:19 AM, Camilo Andrés Jimenez Cruz wrote: > Sorry, abrupt sending, > > the coulombtype is the same > > coulombtype = cut-off Is your cut-off actually 0.0 then? Carsten > > and constraints = all-bonds is the same. Any idea? > > 2011/1/27 Cami

Re: [gmx-users] 4 x Opteron 12-core or 4 x Xeon 8-core ?

2011-01-20 Thread Carsten Kutzner

Re: [gmx-users] gromacs 4.5.3 with threads instead of MPI

2011-01-17 Thread Carsten Kutzner

Re: [gmx-users] g_tune_pme big standard deviation in perf.out output

2011-01-01 Thread Carsten Kutzner
if the questions are not made clear. > Thank you. > Best, > Yanbin

Re: [gmx-users] stopping mdrun without error massage

2010-12-22 Thread Carsten Kutzner

Re: [gmx-users] how to add a electric field to the water box when do a simulation

2010-12-20 Thread Carsten Kutzner
On Dec 20, 2010, at 12:09 PM, 松啸天 wrote: > Dear: > I would like to use the electric field inside a box defined by gromacs. > So I added E_x 1 10 0 in the .mdp file; is it the right approach? Yes, this will add an electric field of strength 10 V/nm acting in the x-direction. Carsten > I hope p
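
A minimal sketch of how that setting ends up in a run input (file names are placeholders; the field values are the ones from the post, i.e. a 10 V/nm static field along x):

cat >> md.mdp << 'EOF'
; one field component in x: amplitude 10 V/nm, phase 0
E_x = 1 10 0
EOF
grompp -f md.mdp -c conf.gro -p topol.top -o efield.tpr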

Re: [gmx-users] pullx.xvg / pullf.xvg

2010-12-16 Thread Carsten Kutzner
On Dec 16, 2010, at 2:23 PM, Poojari, Chetan wrote: > Hi, > > Following were the commands which i used in the umbrella sampling simulations: > > grompp -f md_umbrella.mdp -c conf500.gro -p topol.top -n index.ndx -o > umbrella500.tpr > > mdrun -v -deffnm umbrella500 > > > Output:umbrella5

Re: [gmx-users] g-WHAM

2010-12-14 Thread Carsten Kutzner

Re: [gmx-users] problem building Gromacs 4.5.3 using the Intel compiler

2010-12-13 Thread Carsten Kutzner
Hi, you might also need to use the mpiicc compiler wrapper instead of the mpicc to enforce using icc instead of gcc. Carsten On Dec 13, 2010, at 2:20 PM, Mark Abraham wrote: > > > On 12/13/10, "Miah Wadud Dr (ITCS)" wrote: >> Hello, >> >> I am trying to build Gromacs 4.5.3 using the Intel c

[gmx-users] Re: trouble parallelizing a simulation over a cluster

2010-12-08 Thread Carsten Kutzner
On Dec 8, 2010, at 1:03 PM, Hassan Shallal wrote: > Thanks a lot Justin for the very helpful answers concerning the pressure > equilibration. Using Berendsen Barostat over 200 ps has lead to the correct > average pressure... > > I have another issue to discuss with you and with the Gromacs mai

Re: [gmx-users] How to suppress the error "X particles communicated to PME node Y are more than a cell length out of the domain decomposition cell of their charge group"

2010-12-02 Thread Carsten Kutzner

Re: [gmx-users] Re: Failed to lock: pre.log (Gromacs 4.5.3)

2010-11-26 Thread Carsten Kutzner
Hi, as a workaround you could run with -noappend and later concatenate the output files. Then you should have no problems with locking. Carsten On Nov 25, 2010, at 9:43 PM, Baofu Qiao wrote: > Hi all, > > I just recompiled GMX4.0.7. Such error doesn't occur. But 4.0.7 is about 30% > slower t
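
A sketch of that workaround (the .partNNNN file names follow mdrun's -noappend numbering; -deffnm md and the continuation from md.cpt are assumptions about the run setup):

mdrun -deffnm md -cpi md.cpt -noappend
# later, join the pieces (adjust the file list to what was actually written)
trjcat -f md.part*.xtc -o md_joined.xtc
eneconv -f md.part*.edr -o md_joined.edr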

Re: [gmx-users] installation

2010-11-24 Thread Carsten Kutzner
On Nov 24, 2010, at 4:04 PM, Rossella Noschese wrote: > Hi all, I'm trying to install gromacs.4.5.3 on fedora 13. > I followed the instruction on the website, I completed my make install and > this was the output: > GROMACS is installed under /usr/local/gromacs. > Make sure to update your PATH

Re: [gmx-users] unexpexted stop of simulation

2010-11-03 Thread Carsten Kutzner
Hi, there was also an issue with the locking of the general md.log output file which was resolved for 4.5.2. An update might help. Carsten On Nov 3, 2010, at 3:50 PM, Florian Dommert wrote: > On 11/03/2010 03:38 PM, Hong, Liang wrote: >> Dea

Re: [gmx-users] Fwd: -pbc nojump

2010-10-27 Thread Carsten Kutzner
On Oct 27, 2010, at 10:05 AM, leila karami wrote: > Hi Carsten > Thanks for your answer. You understood my case very well. > I understand your meaning as follows: > 1) trjconv -f a.xtc -s a.tpr -o b.xtc -pbc mol (output group=water) > 2) trjconv -f a.xtc -s a.tpr -o c.xtc -pbc noju

Re: [gmx-users] Fwd: -pbc nojump

2010-10-27 Thread Carsten Kutzner

Re: [gmx-users] g_dipole ? =>salt-molecule => Does Gromacs consider counter ions?

2010-10-22 Thread Carsten Kutzner
On Oct 22, 2010, at 4:14 PM, Chih-Ying Lin wrote: > > Hi > Sorry, I ask the same question again because i am not a decent person in this > field. > If possible, someone can give me a quick answer while i am trying to get > understanding the source codes. > My basic understanding is that Gromacs

Re: [gmx-users] Gromacs 4.5.1 on 48 core magny-cours AMDs

2010-10-21 Thread Carsten Kutzner
On Oct 21, 2010, at 4:44 PM, Sander Pronk wrote: > > Thanks for the information; the OpenMPI recommendation is probably because > OpenMPI goes to great lengths trying to avoid process migration. The numactl > doesn't prevent migration as far as I can tell: it controls where memory gets > alloc

Re: [gmx-users] Gromacs 4.5.1 on 48 core magny-cours AMDs

2010-10-21 Thread Carsten Kutzner
[fragment of a benchmark table] OK. > Sander > On 21 Oct 2010, at 12:03, Carsten Kutzner wrote: >> Hi, does anyone have experience with AMD's 12-core Magny-Cours processors? With 48 cores on a node it is essential that the proc

[gmx-users] Gromacs 4.5.1 on 48 core magny-cours AMDs

2010-10-21 Thread Carsten Kutzner
still seem to be migrating around. Carsten

Re: [gmx-users] Error on install Gromacs 4

2010-10-20 Thread Carsten Kutzner
On Oct 20, 2010, at 5:17 AM, Son Tung Ngo wrote: > Dear experts, > > I have just install gromacs 4.5.1 on my cluster (using CentOS that was > install openmpi1.5, Platform MPI, fftw3, g77, gcc , g++) but I have problem > with size of int : > > [r...@icstcluster gromacs-4.5.1]# ./configure --pr

Re: [gmx-users] MPI and dual-core laptop

2010-09-28 Thread Carsten Kutzner

[gmx-users] Re: efficient use of pme with gromacs

2010-09-22 Thread Carsten Kutzner

Re: [gmx-users] git gromacs

2010-09-07 Thread Carsten Kutzner
Hi Alan, 'bleeding edge' gromacs development is as always in the 'master' branch. The latest bugfixes for the 4.5.x versions you are going to find in the 'release-4-5-patches' branch. Carsten On Sep 7, 2010, at 12:09 PM, Alan wrote: > Hi there, > > Now that gromacs 4.5.1 is released I was won
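
A minimal sketch of fetching that branch (the repository URL is an assumption about the hosting at the time; branch names as above):

git clone git://git.gromacs.org/gromacs.git    # assumed official repository URL
cd gromacs
git checkout -b release-4-5-patches origin/release-4-5-patches
# or stay on 'master' for bleeding-edge development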

Re: [gmx-users] Problem with installing mdrun-gpu in Gromacs-4.5

2010-09-02 Thread Carsten Kutzner
>> -- Checking for MSVC x86 inline asm >> -- Checking for MSVC x86 inline asm - not supported >> -- Checking for system XDR support >> -- Checking for system XDR support - present >> -- Using internal FFT library - fftpack >> CMake Warning (dev) at CMakeLists.txt:6

Re: [gmx-users] help with git

2010-08-24 Thread Carsten Kutzner

Re: [gmx-users] mpi installation problems

2010-07-29 Thread Carsten Kutzner
/3.1/lib64/libmpigf.so: undefined reference to > `PMPI_Add_error_code' > /opt/intel/impi/3.1/lib64/libmpigf.so: undefined reference to > `PMPI_Comm_call_errhandler' > /opt/intel/impi/3.1/lib64/libmpigf.so: undefined reference to > `PMPI_Type_get_extent' > /opt/

Re: [gmx-users] decomposition

2010-07-26 Thread Carsten Kutzner
Hi Jacopo, from somewhere the information about the 7 nodes must have come. What are the exact commands you used? What MPI are you using? Carsten On Jul 26, 2010, at 12:35 PM, Jacopo Sgrignani wrote: > Dear all > i'm trying to run a MD simulation using domain decomposition but after two > days

Re: [gmx-users] Installing gromacs from git

2010-07-10 Thread Carsten Kutzner

Re: [gmx-users] P4_error for extending coarse grained MD simulations

2010-07-09 Thread Carsten Kutzner
s, but there must be some more diagnostic information from mdrun about what has gone wrong. Please check stderr / stdout output files as well as md.log. Carsten > Justin Zhang > On Jul 9, 2010, at 4:00 PM, Carsten Kutzner wrote: > Hi Justin, > what kind of error message do you ge

Re: [gmx-users] P4_error for extending coarse grained MD simulations

2010-07-09 Thread Carsten Kutzner

Re: [gmx-users] mdrun_mpi: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory

2010-07-08 Thread Carsten Kutzner

Re: [gmx-users] mpi run

2010-07-08 Thread Carsten Kutzner

[gmx-users] Fwd: missing atom

2010-07-03 Thread Carsten Kutzner
Dear Abdul, please keep all Gromacs-related questions on the mailing list. Best, Carsten Begin forwarded message: > From: "abdul wadood" > Date: July 3, 2010 8:40:29 AM GMT+02:00 > To: > Subject: missing atom > > Dear Carsten > > I am running simulation using gromacs with amber forcefie

Re: [gmx-users] the job is not being distributed

2010-06-30 Thread Carsten Kutzner

Re: [gmx-users] ED sampling

2010-06-30 Thread Carsten Kutzner

Re: [gmx-users] the job is not being distributed

2010-06-28 Thread Carsten Kutzner

Re: [gmx-users] the job is not being distributed

2010-06-28 Thread Carsten Kutzner
On Jun 28, 2010, at 2:34 PM, Syed Tarique Moin wrote: > hello, > > I am running a simulation on dual core processor using the following command > > mpirun -np 8 mdrun_mpi -s top > > The job is running but it is not distributed on other node, i mean i cant see > the process on other nodes as we
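
If the eight ranks are meant to be spread over several machines, a hostfile sketch for Open MPI (node names and slot counts are assumptions about the cluster; other MPI implementations use different options):

cat > hosts.txt << 'EOF'
node01 slots=2
node02 slots=2
node03 slots=2
node04 slots=2
EOF
mpirun -np 8 -hostfile hosts.txt mdrun_mpi -s topol.tpr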

Re: [gmx-users] Should I use separate PME nodes

2010-06-25 Thread Carsten Kutzner

Re: [gmx-users] Re: gmx-users Digest, Vol 74, Issue 134

2010-06-24 Thread Carsten Kutzner
"Re: Contents of gmx-users digest..." > Today's Topics: > 1. (no subject) (Amin Arabbagheri) > 2. Re: (no subject) (Justin A. Lemkul) > 3. Re: (no subject) (Linus Östberg) > 4. Re: (no subject) (Carsten Kutzner) > 5. Help with defining new

Re: [gmx-users] (no subject)

2010-06-21 Thread Carsten Kutzner

[gmx-users] Re: Parallel instalation: gmx-users Digest, Vol 74, Issue 76

2010-06-18 Thread Carsten Kutzner
the lincs warning threshold > in your mdp file > 3: or set the environment variable GMX_MAXCONSTRWARN to -1, > 3: but normally it is better to fix the problem > 3: --- > 3: > Maybe your system is not well equilibrated, or your time step is too long. > > Carsten
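
For completeness, the environment-variable route mentioned above, to be used only when the warnings are known to be harmless (the mpiexec line repeats the command used elsewhere in this thread):

export GMX_MAXCONSTRWARN=-1    # lifts the LINCS warning limit; fixing the underlying instability is preferable
mpiexec -l -np 4 /usr/local/gromacs/bin/mdrun_mpi -s topol.tpr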

[gmx-users] Re: Parallel instalation: gmx-users Digest, Vol 74, Issue 76

2010-06-17 Thread Carsten Kutzner
On Jun 17, 2010, at 3:42 PM, abdul wadood wrote: > Dear Carsten > > the command which i give is > > mpiexec -l -np 4 /usr/local/gromacs/bin/mdrun_mpi -s topol.tpr > > with this command the same error come which is > > Can not open file: > 3: topol.tpr > 3: --

Re: [gmx-users] Parallel installation

2010-06-14 Thread Carsten Kutzner
