Re: [gmx-users] Vacuum simulation in Gromacs 5

2015-06-16 Thread Jan Jirsák
Hi Szilárd,
thank you for your response!

I prepared a simple input which reproduces the behavior, see

https://owncloud.cesnet.cz/public.php?service=files&t=9e8fc3924fd109f6d48a552e20ceaf11

With -nt 1 it successfully finishes in ca. 12 seconds; with -nt 2 it
gets stuck after the SETTLE citation.
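
(For the record, the two runs are, with topol.tpr standing in for the
actual .tpr name in the archive:

    mdrun -s topol.tpr -nt 1    # finishes in ~12 s
    mdrun -s topol.tpr -nt 2    # hangs right after the SETTLE citation
)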

Regards, Jan

2015-06-15 21:51 GMT+02:00 Szilárd Páll :
> That's probably a hang, which should not happen, and I'd consider that
> a bug. Can you share your input files?

Re: [gmx-users] Vacuum simulation in Gromacs 5

2015-06-15 Thread Szilárd Páll
On Sun, Jun 14, 2015 at 6:54 PM, Jan Jirsák  wrote:
> Hi,
>
> I did the test and found out that -nt 8 is even slower than -nt 1!

FYI: "-nt" is mostly backward compatibility option and for clarity
it's best to use either -ntmpi or -ntomp, depending on what you mean.
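
For example (a rough sketch -- adjust the counts to your machine):

    mdrun -ntmpi 8           # 8 thread-MPI ranks
    mdrun -ntomp 8           # 8 OpenMP threads on one rank (Verlet scheme only)
    mdrun -ntmpi 2 -ntomp 4  # 2 ranks with 4 OpenMP threads each

With -nt N, mdrun decides the split between the two itself.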

> However, I think the simulation hasn't even properly started with 8
> threads and got stuck somewhere at the beginning.
>
> Details:
> I used a short run (1000 steps) for testing. mdrun -nt 1 finished
> after ca. 11 hours, whereas not a single checkpoint or run-log record
> has been saved for mdrun -nt 8 (and the simulation is still running). Not
> even the starting energies were displayed; the last record in the log file
> is the SETTLE citation.

That's probably a hang, which should not happen, and I'd consider that
a bug. Can you share your input files?

> In top I noticed that both processes used up to 100% CPU but differed
> considerably in memory use. Relevant lines:
>
>    PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND
> 117710 jan       20   0 5313776 4.642g   9940 R  98.9  3.7  52:47.19 mdrun
> 117686 jan       20   0  547344  24456   8712 R  92.7  0.0  54:48.92 mdrun

That definitely does not look good.

BTW, in contrast with Mark's opinion, I do think top is indicative. If
you wanted to use 8 cores and told mdrun to do so, but you get load on
only one core, you can be (almost) certain that something is not
right.
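
(To look closer: pressing '1' in top shows the per-CPU breakdown, and
something like

    top -H -p <pid of the mdrun process>

lists the individual threads, so you can see directly whether they are all
running or all but one are idle.)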

--
Szilárd

> The first listed process (117710) is the single-thread simulation (-nt 1).
>
> Thank you for any insight,
> Jan

Re: [gmx-users] Vacuum simulation in Gromacs 5

2015-06-14 Thread Jan Jirsák
OK, thanks -- I see I will have to go back to version 4.5 or 4.6, where
my code worked flawlessly (particle decomposition is still available there).
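
(If I remember correctly, particle decomposition is requested there with the
-pd flag, e.g.

    mdrun -pd -s topol.tpr

with topol.tpr standing in for the actual input.)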

Regarding "flying away" -- sometimes I NEED my system to disintegrate
to fragments which fly away;) Momentum is of course conserved, cause
they fly in opposite directions.

Thank you for the links, I will have a look.
With kind regards
Jan

2015-06-14 19:32 GMT+02:00 V.V.Chaban :
> There are just two options for such simulations: either one uses a
> code supporting particle decomposition, or one adds a vacuum layer.
>
> The fragments will not fly away provided that momentum is conserved.
>
> Some comments on why and what to select may be found here:
>
> http://pubs.acs.org/doi/abs/10.1021/jz500563q
> http://pubs.acs.org/doi/abs/10.1021/jz300405q
> http://pubs.acs.org/doi/abs/10.1021/jz201190j

Re: [gmx-users] Vacuum simulation in Gromacs 5

2015-06-14 Thread V.V.Chaban
There are just two options for such simulations: either one uses a
code supporting particle decomposition, or one adds a vacuum layer.

The fragments will not fly away provided that momentum is conserved.

Some comments on why and what to select may be found here:

http://pubs.acs.org/doi/abs/10.1021/jz500563q

http://pubs.acs.org/doi/abs/10.1021/jz300405q

http://pubs.acs.org/doi/abs/10.1021/jz201190j

On Sun, Jun 14, 2015 at 2:24 PM, Jan Jirsák  wrote:
> Thank you for the reply - I could try that if everything else fails, but:
> (i) my system can disintegrate, and its fragments can fly really long
> distances apart, so the layer of vacuum would have to be considerably
> (and unpredictably) thick - the same applies to cutoffs;
> (ii) in principle I don't see why one should bother with periodic
> images in a simulation which is inherently non-periodic.
> Regards, Jan

Re: [gmx-users] Vacuum simulation in Gromacs 5

2015-06-14 Thread Jan Jirsák
Thank you for the reply - I could try that if everything else fails, but:
(i) my system can disintegrate, and its fragments can fly really long
distances apart, so the layer of vacuum would have to be considerably
(and unpredictably) thick - the same applies to cutoffs;
(ii) in principle I don't see why one should bother with periodic
images in a simulation which is inherently non-periodic.
Regards, Jan


2015-06-14 19:01 GMT+02:00 V.V.Chaban :
> Why don't you simply surround your molecule with a layer of vacuum and
> use the full periodic setup as usual? This is what people do in
> plane-wave codes when the so-called 'cluster representation' is
> desired.

Re: [gmx-users] Vacuum simulation in Gromacs 5

2015-06-14 Thread V.V.Chaban
Why don't you simply surround your molecule with a layer of vacuum and
use the full periodic setup as usual? This is what people do in
plane-wave codes when the so-called 'cluster representation' is
desired.

On Sun, Jun 14, 2015 at 1:54 PM, Jan Jirsák  wrote:
> I did the test and found out that -nt 8 is even slower than -nt 1!
> However, I think the simulation hasn't even properly started with 8
> threads and got stuck somewhere at the beginning.

Re: [gmx-users] Vacuum simulation in Gromacs 5

2015-06-14 Thread Jan Jirsák
Hi,

I did the test and found out that -nt 8 is even slower than -nt 1!
However, I think the simulation hasn't even properly started with 8
threads and got stuck somewhere at the beginning.

Details:
I used a short run (1000 steps) for testing. mdrun -nt 1 finished
after ca. 11 hours, whereas not a single checkpoint or run-log record
has been saved for mdrun -nt 8 (and the simulation is still running). Not
even the starting energies were displayed; the last record in the log file
is the SETTLE citation.

In top I noticed that both processes used up to 100% CPU but differed
considerably in memory use. Relevant lines:

   PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND
117710 jan       20   0 5313776 4.642g   9940 R  98.9  3.7  52:47.19 mdrun
117686 jan       20   0  547344  24456   8712 R  92.7  0.0  54:48.92 mdrun

The first listed process (117710) is the single-thread simulation (-nt 1).

Thank you for any insight,
Jan

2015-06-12 14:35 GMT+02:00 Mark Abraham :
> Hi,
>
> Top need not be indicative. Run the same with -nt 1 and observe whether the
> performance changes. And don't run anything else on these cores.
>
> Mark


Re: [gmx-users] Vacuum simulation in Gromacs 5

2015-06-12 Thread Mark Abraham
Hi,

Top need not be indicative. Run the same with -nt 1 and observe whether the
performance changes. And don't run anything else on these cores.
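
(It can also help to pin the threads, e.g.

    mdrun -nt 8 -pin on

so the OS does not migrate them between cores while you watch the load.)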

Mark

On Fri, 12 Jun 2015 14:20 Jan Jirsák  wrote:

> Hi,
>
> I have one more problem with running this system with thread-MPI
> (tested in both 5.0.4 and 5.0.5 on two different machines). When I set
> everything as you advised, it runs; however, top shows only 100% load --
> i.e., only a single CPU is used (and it is really very, very slow) -- even
> though the log files confirm that the desired number of MPI threads was
> used.
>
> Thanks, Jan

Re: [gmx-users] Vacuum simulation in Gromacs 5

2015-06-12 Thread Jan Jirsák
Hi,

I have one more problem with running this system with thread-MPI
(tested in both 5.0.4 and 5.0.5 on two different machines). When I set
everything as you advised, it runs; however, top shows only 100% load --
i.e., only a single CPU is used (and it is really very, very slow) -- even
though the log files confirm that the desired number of MPI threads was
used.
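
(I checked with something along the lines of

    grep -i "MPI thread" md.log

and the log does report the requested thread count.)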

Thanks, Jan


Re: [gmx-users] Vacuum simulation in Gromacs 5

2015-06-11 Thread Mark Abraham
Hi,

That particular optimization hasn't been implemented in the multi-domain
case. It's on the table for post-5.1, however.

Mark

On Thu, 11 Jun 2015 17:45 Jan Jirsák  wrote:

> David van der Spoel  writes:
> > Use grid search in any case. It supports vacuum.
>
> Thank you very much, it seems to work. I wonder, does the nstlist variable
> have any relevance in this case? I mean, here all particles interact with
> one another, so it should be sufficient to build the neighbor list just
> once, but still Gromacs does not allow me to set nstlist=0.
>
> Thanks, Jan

Re: [gmx-users] Vacuum simulation in Gromacs 5

2015-06-11 Thread Jan Jirsák
Justin Lemkul  writes:
> 
> Use mdrun -nt 1
> 

Thank you for the quick reply -- however, I really need to parallelize; a
single-CPU run would take ages ;)
Regards, Jan



Re: [gmx-users] Vacuum simulation in Gromacs 5

2015-06-11 Thread Jan Jirsák
David van der Spoel  writes:
> Use grid search in any case. It supports vacuum.

Thank you very much, it seems to work. I wonder, does the nstlist variable
have any relevance in this case? I mean, here all particles interact with
one another, so it should be sufficient to build the neighbor list just
once, but still Gromacs does not allow me to set nstlist=0.

Thanks, Jan


Re: [gmx-users] Vacuum simulation in Gromacs 5

2015-06-11 Thread Justin Lemkul



On 6/11/15 5:53 AM, Jan Jirsák wrote:

Hello everyone,

what is the correct setup for simulations with no PBC and no cutoffs in
Gromacs 5.0.4?

In versions 4.5 and 4.6 I used
nstlist = 0
ns_type = simple
pbc = no

This no longer works, as I get the error:
"Domain decomposition does not support simple neighbor searching, use grid
searching or run with one MPI rank"
(... and particle decomposition is no longer available.)

However, when I run

mdrun -nt 16 -ntmpi 1 (is this the correct specification for 1 MPI rank?)

I get the error that OpenMP can be used only with cutoff-scheme=Verlet, which
in turn is not available for pbc = no.

Grid searching is nonsense in this situation, as cutoffs are infinite.

I seem to be at a dead end. Does anybody know a solution?



Use mdrun -nt 1

-Justin

--
==

Justin A. Lemkul, Ph.D.
Ruth L. Kirschstein NRSA Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 629
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441
http://mackerell.umaryland.edu/~jalemkul

==


Re: [gmx-users] Vacuum simulation in Gromacs 5

2015-06-11 Thread David van der Spoel

On 11/06/15 11:53, Jan Jirsák wrote:

Hello everyone,

what is the correct setup for simulations with no PBC and no cutoffs in
Gromacs 5.0.4?

In versions 4.5 and 4.6 I used
nstlist = 0
ns_type = simple
pbc = no

This no longer works, as I get the error:
"Domain decomposition does not support simple neighbor searching, use grid
searching or run with one MPI rank"
(... and particle decomposition is no longer available.)

However, when I run

mdrun -nt 16 -ntmpi 1 (is this the correct specification for 1 MPI rank?)

I get the error that OpenMP can be used only with cutoff-scheme=Verlet, which
in turn is not available for pbc = no.

Grid searching is nonsense in this situation, as cutoffs are infinite.

I seem to be at a dead end. Does anybody know a solution?


Use grid search in any case. It supports vacuum.
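
Something along these lines should work -- a minimal sketch of the settings
I have in mind, untested here, so check it against the manual:

    integrator    = md
    cutoff-scheme = group   ; Verlet is not available with pbc = no
    pbc           = no
    ns_type       = grid
    nstlist       = 10
    rlist         = 0       ; with pbc = no, a cutoff of 0 means infinite
    rcoulomb      = 0
    rvdw          = 0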


Thank you,
Jan

--
David van der Spoel, Ph.D., Professor of Biology
Dept. of Cell & Molec. Biol., Uppsala University.
Box 596, 75124 Uppsala, Sweden. Phone:  +46184714205.
sp...@xray.bmc.uu.se    http://folding.bmc.uu.se