From: gmx-users-boun...@gromacs.org [mailto:gmx-users-boun...@gromacs.org]
On Behalf Of Mark Abraham
Sent: August 27, 2009 3:32 PM
To: Discussion list for GROMACS users
Subject: Re: [gmx-users] Re: weird behavior of mdrun

Vitaly V. Chaban wrote:
> Then I believe you have problems with MPI.
>
> Earlier I experienced something similar on our old system - the serial
> version worked OK but the parallel one failed. The same issue occurred
> with CPMD, by the way. Other programs worked fine. I never did fix that
> problem...
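
A note on narrowing this down: if the serial build runs but the MPI
build fails, it is worth testing the MPI installation on its own, with
GROMACS out of the picture. A minimal sketch (the file name and the
build/run lines are illustrative, not from this thread):

    /* mpi_check.c - minimal MPI sanity test, independent of GROMACS.
     * Build: mpicc mpi_check.c -o mpi_check
     * Run:   mpirun -np 4 ./mpi_check
     */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, len;
        char host[MPI_MAX_PROCESSOR_NAME];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Get_processor_name(host, &len);

        /* Each rank prints one line; a hang, a crash, or missing
         * ranks here points at the MPI setup, not at mdrun. */
        printf("rank %d of %d on %s\n", rank, size, host);

        MPI_Finalize();
        return 0;
    }

If this test misbehaves across nodes, the problem lies in the MPI layer
or the queue environment rather than in the .tpr file.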
> >>> ;epsilon_r     = 1
> >>>
> >>> ; Vdw
> >>> vdwtype        = cut-off
> >>> rvdw           = 1.2
> >>> DispCorr       = EnerPres
> >>>
> >>> ; Ewald
> >>> fourierspacing =
From: gmx-users-boun...@gromacs.org [mailto:gmx-users-boun...@gromacs.org]
On Behalf Of Justin A. Lemkul
Sent: Friday, September 04, 2009 5:29 PM
To: Gromacs Users' List
Subject: Re: [gmx-users] Re: weird behavior of mdrun

Have you tried my suggestion from the last message of setting frequent
output? Could your system ju...
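
For reference, "frequent output" here means raising the output
frequencies in the .mdp file so the step at which the run dies can be
seen. These are standard .mdp options; the every-step values below are
illustrative and only sensible for short debugging runs:

    nstlog    = 1    ; write energies to the .log file every step
    nstenergy = 1    ; write energies to the .edr file every step
    nstxout   = 1    ; write coordinates to the .trr file every step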
> > ...                  = 5.5e-5
> > ;ref_p               = 1.0
> > gen_vel              = yes
> > gen_temp             = 275
> > gen_seed             = 173529
> > constraint-algorithm = Lincs
> > constraints          = all-bonds
> > lincs-order          = 4
> >
> > Regards,
> > ...
On Thu, Aug 27, 2009 at 7:14 PM, Paymon Pir... wrote:
> I made a .tpr file for my MD run without any problems (using the mdp
> file at the bottom). My job submission script is also the same one I
> used for other jobs, which ran without problems. But now when I submit
> this .tpr file, only an empty log file is generated! The qstat output
> on the cluster shows that th...