Mark Abraham wrote:
On 12/07/2011 4:51 AM, Justin A. Lemkul wrote:

Hi All,

I'm having a strange problem with some implicit solvent systems and I'm wondering if anyone has experienced the same thing or if I've stumbled upon a bug. I'm testing some methodology with a robust system (everyone's favorite, 1AKI lysozyme). Running simulations with finite cutoffs works fine, in serial or in parallel using DD, but if I invoke the all-vs-all kernels (i.e., infinite cutoffs), the simulations crash instantly (at step 0) with LINCS warnings. Running all-vs-all in serial, however, produces perfectly stable trajectories, though the simulations are very slow.
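
For clarity, the all-vs-all path is selected when all of the cutoffs are set to zero in an implicit solvent run; the lines below are only an illustrative sketch of that kind of setup (representative values, not the actual .mdp appended at the end of this message):

  ; illustrative implicit-solvent settings that select the all-vs-all kernels
  ; (representative values only; the real .mdp is at the end of this message)
  integrator       = md        ; sd shows the same behavior
  pbc              = no
  implicit_solvent = GBSA
  gb_algorithm     = OBC
  nstlist          = 0         ; no neighbor searching
  rlist            = 0         ; zero cutoffs = infinite cutoffs = all-vs-all
  rcoulomb         = 0
  rvdw             = 0
  rgbradii         = 0         ; kept equal to rlist
  constraints      = all-bonds ; handled by LINCS, where the step-0 warnings appear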

The effect is independent of the integrator (tested sd and md) and of the hardware and compiler. The problem is reproducible on two very different systems:

1. x86_64 with OpenMPI 1.4.2, compiled with CMake and gcc-4.3.4

2. Mac OS X with threads, compiled via autoconf with gcc-4.4.4

All force fields that I've tested (OPLS-AA, CHARMM27, and AMBER03) give the same result.

The problem is present in version 4.5.4 and in the current release-4-5-patches branch. My .mdp file and a gdb backtrace are provided at the end of this message.

Is there anything else I should try, or should I just file a Redmine issue with a .tpr file?

Judging by the .mdp file, I have similar systems working just fine in parallel. If you upload the .tpr to Redmine, I'll try it out.


Thanks Mark.  I've created issue #777 for this.

-Justin

--
========================================

Justin A. Lemkul
Ph.D. Candidate
ICTAS Doctoral Scholar
MILES-IGERT Trainee
Department of Biochemistry
Virginia Tech
Blacksburg, VA
jalemkul[at]vt.edu | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin

========================================