Re: {Spam?} Re: [gmx-users] Re: wierd behavior of mdrun

2009-09-08 Thread Justin A. Lemkul
am Sent: August 27, 2009 3:32 PM To: Discussion list for GROMACS users Subject: Re: [gmx-users] Re: wierd behavior of mdrun Vitaly V. Chaban wrote: Then I believe you have problems with MPI. Previously I experienced something similar on our old system - the serial version worked OK but the parallel one failed. The

Re: {Spam?} Re: [gmx-users] Re: wierd behavior of mdrun

2009-09-08 Thread Paymon Pirzadeh
>>> ;epsilon_r = 1 >>> ; Vdw >>> vdwtype = cut-off >>> rvdw = 1.2 >>> DispCorr = EnerPres >>> ;Ewald >>> fourierspacing =

RE: {Spam?} Re: [gmx-users] Re: wierd behavior of mdrun

2009-09-05 Thread Payman Pirzadeh
[mailto:gmx-users-boun...@gromacs.org] On Behalf Of Justin A. Lemkul Sent: Friday, September 04, 2009 5:29 PM To: Gromacs Users' List Subject: {Spam?} Re: [gmx-users] Re: wierd behavior of mdrun Have you tried my suggestion from the last message of setting frequent output? Could your system ju
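Justin's suggestion above amounts to raising the output frequency in the run .mdp so that the .log and trajectory reveal whether mdrun is making any progress at all before the job dies. A minimal sketch, assuming GROMACS 4.x-era tools; the file names (md.mdp, conf.gro, topol.top, md.tpr) are illustrative:

    # append frequent-output settings to the run parameters
    cat >> md.mdp <<'EOF'
    nstlog  = 10     ; write energies to the .log every 10 steps
    nstxout = 100    ; write coordinates every 100 steps
    EOF
    # regenerate the .tpr so the new settings take effect
    grompp -f md.mdp -c conf.gro -p topol.top -o md.tpr

If even with these settings nothing appears in the .log, the run is almost certainly never starting, which points back at the parallel launch rather than at the physics of the system.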

Re: [gmx-users] Re: wierd behavior of mdrun

2009-09-05 Thread Justin A. Lemkul
Behalf Of Mark Abraham Sent: August 27, 2009 3:32 PM To: Discussion list for GROMACS users Subject: Re: [gmx-users] Re: wierd behavior of mdrun Vitaly V. Chaban wrote: Then I believe you have problems with MPI. Previously I experienced something similar on our old system - the serial version worked OK but para

Re: [gmx-users] Re: wierd behavior of mdrun

2009-09-04 Thread Mark Abraham
August 27, 2009 3:32 PM To: Discussion list for GROMACS users Subject: Re: [gmx-users] Re: wierd behavior of mdrun Vitaly V. Chaban wrote: Then I believe you have problems with MPI. Previously I experienced something similar on our old system - the serial version worked OK but the parallel one failed. The same

Re: [gmx-users] Re: wierd behavior of mdrun

2009-09-04 Thread Paymon Pirzadeh
= 5.5e-5 > ;ref_p = 1.0 > gen_vel = yes > gen_temp = 275 > gen_seed = 173529 > constraint-algorithm = Lincs > constraints = all-bonds > lincs-order = 4 > Regards,

Re: [gmx-users] Re: wierd behavior of mdrun

2009-08-28 Thread Justin A. Lemkul
ers Subject: Re: [gmx-users] Re: wierd behavior of mdrun Vitaly V. Chaban wrote: Then I believe you have problems with MPI. Previously I experienced something similar on our old system - the serial version worked OK but the parallel one failed. The same issue occurred with CPMD, by the way. Other programs worked

RE: [gmx-users] Re: wierd behavior of mdrun

2009-08-28 Thread Payman Pirzadeh
...@gromacs.org] On Behalf Of Mark Abraham Sent: August 27, 2009 3:32 PM To: Discussion list for GROMACS users Subject: Re: [gmx-users] Re: wierd behavior of mdrun Vitaly V. Chaban wrote: > Then I believe you have problems with MPI. > > Previously I experienced something similar on our old system

Re: [gmx-users] Re: wierd behavior of mdrun

2009-08-28 Thread Mark Abraham
Vitaly V. Chaban wrote: Then I believe you have problems with MPI. Previously I experienced something similar on our old system - the serial version worked OK but the parallel one failed. The same issue occurred with CPMD, by the way. Other programs worked fine. I never resolved that problem... On Thu, Aug 27, 2

[gmx-users] Re: wierd behavior of mdrun

2009-08-27 Thread Vitaly V. Chaban
Then I believe you have problems with MPI. Previously I experienced something similar on our old system - the serial version worked OK but the parallel one failed. The same issue occurred with CPMD, by the way. Other programs worked fine. I never resolved that problem... On Thu, Aug 27, 2009 at 7:14 PM, Paymon Pir
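A quick way to test Vitaly's hypothesis is to run the same .tpr once in serial and once with a small MPI launch, outside the queueing system, and compare. A minimal sketch, assuming GROMACS 4.x binaries; the names md.tpr and mdrun_mpi are illustrative and depend on how the local installation was built:

    # serial run: with -v it should start printing steps immediately
    mdrun -v -s md.tpr -deffnm serial_test
    # small parallel run: if this hangs or dies silently, the MPI layer is suspect
    mpirun -np 2 mdrun_mpi -v -s md.tpr -deffnm mpi_test

If the serial run steps normally while the MPI run produces nothing, the problem lies in the MPI installation or launch, not in the GROMACS input files.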

[gmx-users] Re: wierd behavior of mdrun

2009-08-26 Thread Vitaly V. Chaban
> I made a .tpr file for my MD run without any problems (using the mdp file at the bottom). My job submission script is also the same one I used for other jobs, which had no problems. But now when I submit this .tpr file, only an empty log file is generated! The qstat of the cluster shows that th
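When the scheduler reports the job as running but the .log stays empty, the failure is usually in the MPI launch rather than in grompp or the .tpr itself. A minimal sketch of things to check, with a hypothetical job id, output file names, and node name:

    qstat -f 12345                                  # which nodes did the scheduler assign?
    cat myjob.o12345 myjob.e12345                   # PBS stdout/stderr often capture the MPI launch error
    ssh node001 'tail -n 20 /path/to/run/md.log'    # is anything at all being written on the compute node?

Running the mpirun/mdrun command from the submission script by hand on one of the assigned nodes, as sketched in the earlier example, is usually the fastest way to surface the error message that the queueing system is hiding.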