Hi,
Ah well, we sometimes get reports of strange behaviour with PLUMED. We have
no idea how well things work after they've changed the source code, so you
might have better luck enquiring on their fora.
Mark
On Fri, Jun 22, 2018 at 4:14 PM Stefano Guglielmo <stefano.guglie...@unito.it> wrote:
Actually this is not a cluster but a single machine with two CPUs and 16
cores/CPU, and in fact the non-MPI (tMPI) version works fine; I needed to
switch to the MPI version because I patched gromacs with plumed, which does
not recognize tMPI and needs gromacs to be compiled with the same MPI used
for it.
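For reference, a minimal sketch of the build sequence involved (directory
names, the engine name, and the compiler wrappers are assumptions; running
plumed patch -p with no engine argument lets you pick one interactively):

  # Patch the GROMACS source tree with PLUMED first; the available
  # engine names depend on your PLUMED release.
  cd gromacs-2016.5
  plumed patch -p -e gromacs-2016.5

  # Then configure GROMACS against the same MPI library PLUMED was
  # compiled with (mpicc/mpicxx are the usual wrapper names).
  mkdir build && cd build
  cmake .. -DGMX_MPI=on -DCMAKE_C_COMPILER=mpicc -DCMAKE_CXX_COMPILER=mpicxx
  make -j 16 && make install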
Hi,
Just exiting without a simulation crash suggests different problems, e.g.
that your cluster lost the network for MPI to use, etc. Talk to your system
administrators about their experience and thoughts there.
Mark
On Fri, Jun 22, 2018, 15:57 Stefano Guglielmo
wrote:
Thanks Mark. I must say that I tried reducing the timestep (as low as 0.5 fs)
and the temperature as well (5 K, both in NVT and NPT), but the simulation
crashed in any case with warnings about the cutoff, LINCS or bad water. The
previous day I had run the same simulation on the same machine but with the
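As a rough sketch, the reduced settings described above would look like this
in the .mdp file (only the relevant lines are shown; the rest of the file
and the coupling-group setup are assumptions):

  ; reduced timestep and bath temperature used while hunting the crash
  integrator = md
  dt         = 0.0005   ; 0.5 fs
  tcoupl     = v-rescale
  tc-grps    = System
  tau_t      = 0.1
  ref_t      = 5        ; 5 K, tried in both NVT and NPT runs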
Hi,
Recompiling won't fix anything relevant. Molecular frustration isn't just
about long bonds. Rearrangement of side-chain groupings can do similar
things despite looking happy. The sudden injection of KE means collisions
can be more violent than normal, and the timestep is now too large. But you
Dear Mark,
thanks for your answer. The version is 2016.5, but I apparently solved the
problem by recompiling gromacs: now the simulation is running quite stably. I
had minimized and gradually equilibrated the system and I could not see any
weird bonds or contacts. So in the end, as a last resort, I
Hi,
This could easily be that your system is actually not yet well equilibrated
(e.g. something was trapped in a high energy state that eventually relaxed
sharply). Or it could be a code bug. What version were you using?
Mark
On Thu, Jun 21, 2018 at 2:36 PM Stefano Guglielmo <stefano.guglie...@unito.it> wrote:
Dear users,
I have installed gromacs with MPI instead of its native tMPI and I am
encountering the following error:
"Fatal error:
4 particles communicated to PME rank 5 are more than 2/3 times the cut-off
out
of the domain decomposition cell of their charge group in dimension y.
This usually
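For context, a PLUMED-patched MPI run of this kind is typically launched
along these lines (filenames and rank counts are illustrative assumptions,
not from the thread; -plumed is the option the PLUMED patch adds to mdrun):

  # Illustrative only: 16 MPI ranks, 4 of them dedicated to PME.
  # Adjust rank counts and filenames to your own hardware and setup.
  mpirun -np 16 gmx_mpi mdrun -deffnm md -plumed plumed.dat -npme 4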