Hi,

I have now solved the problem: adding a list of the 1-4 pairs under the [ pairs ] directive gives the correct equilibrium bond length. For reference, the force field I am using is described in this paper: http://pubs.acs.org/doi/abs/10.1021/cm500365c

Cheers,
Amanda
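P.S. For the archives, the directive looks roughly like this in my topology; the atom indices below are illustrative only, not from my actual system:

  [ pairs ]
  ; ai   aj   funct   ; funct 1 = apply the force field's 1-4 LJ/Coulomb parameters
     1    4    1
     5    8    1

Each line lists the indices of two atoms separated by three bonds that should interact with the 1-4 pair parameters.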
On 30/06/2017, 01:46, gromacs.org_gmx-users-requ...@maillist.sys.kth.se wrote:

Today's Topics:

   1. Re: NoPbc (Mark Abraham)
   2. Re: RemovingDummyAtoms (Mark Abraham)
   3. Re: gromacs.org_gmx-users Digest, Vol 158, Issue 186 (Thanh Le)
   4. Re: gromacs.org_gmx-users Digest, Vol 158, Issue 186 (Mark Abraham)
   5. Re: Anybody using Silica InterfaceFF on Gromacs? (Alex)

----------------------------------------------------------------------

Message: 1
Date: Thu, 29 Jun 2017 22:12:46 +0000
From: Mark Abraham <mark.j.abra...@gmail.com>
To: gmx-us...@gromacs.org
Subject: Re: [gmx-users] NoPbc

On Thu, Jun 29, 2017 at 7:29 PM Mostafa Javaheri <javaheri.grom...@gmail.com> wrote:

> Dear gmx users
>
> I'm doing a QM/MM simulation using the ORCA-GROMACS interface. The whole
> simulation will take only 1 ps, and I need to set pbc off. Over such a short
> simulation, does the pressure of the box change?

There's no box if pbc is off. What would its walls be made of? Thus, no external pressure.

> Does the water vaporize after 500 steps? Any help will be appreciated.
>
> Regards
>
> M.Javaheri

Water will start to evaporate, but probably not noticeably in that time frame. Run a classical MD simulation with the same setup for 1 ps on just a globule of water to see.

Mark

------------------------------

Message: 2
Date: Thu, 29 Jun 2017 22:13:36 +0000
From: Mark Abraham <mark.j.abra...@gmail.com>
To: gmx-us...@gromacs.org
Subject: Re: [gmx-users] RemovingDummyAtoms

Hi,

gmx trjconv can apply a selection (e.g. generated with gmx select) to your trajectory frames. gmx convert-tpr can do the same to your tpr, which you might need for later analysis tools to work.

Mark

On Thu, Jun 29, 2017 at 10:57 PM Mostafa Javaheri <javaheri.grom...@gmail.com> wrote:

> Dear gmx users
>
> I would really appreciate it if anyone could tell me how to remove dummy
> atoms from my trajectory. I made them before running the simulation to add
> a repulsive potential, and for analysing the results there is no need
> for them.
>
> Regards
>
> M.Javaheri
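A minimal sketch of that workflow, assuming the dummy atoms share the atom name DUM and the hypothetical file names topol.tpr and traj.xtc:

  gmx select -s topol.tpr -select 'not atomname DUM' -on keep.ndx
  gmx trjconv -s topol.tpr -f traj.xtc -n keep.ndx -o traj_nodum.xtc
  gmx convert-tpr -s topol.tpr -n keep.ndx -o topol_nodum.tpr

The first command writes an index group containing everything except the dummies; the other two apply that group to the trajectory and to the run input file, so the analysis tools see a consistent system.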
------------------------------

Message: 3
Date: Thu, 29 Jun 2017 15:20:44 -0700
From: Thanh Le <thanh.q...@sjsu.edu>
To: gromacs.org_gmx-users@maillist.sys.kth.se
Subject: Re: [gmx-users] gromacs.org_gmx-users Digest, Vol 158, Issue 186

Hi Mr. Abraham,

My system is quite small, only about 8000 atoms. I have run this system for 100 ns, which took roughly two days, so a run of 1 microsecond would take about 20 days. I am trying to shorten that to two days by using more than one node.

Thanks,
Thanh Le

------------------------------

Message: 4
Date: Thu, 29 Jun 2017 22:37:54 +0000
From: Mark Abraham <mark.j.abra...@gmail.com>
To: gmx-us...@gromacs.org, gromacs.org_gmx-users@maillist.sys.kth.se
Subject: Re: [gmx-users] gromacs.org_gmx-users Digest, Vol 158, Issue 186

Hi,

Don't even consider running on more than one node. You can see for yourself by comparing the performance of even just, e.g.,

gmx mdrun -nt 1 -pin on
gmx mdrun -nt 2 -pin on
gmx mdrun -nt 14 -pin on
gmx mdrun -nt 28 -pin on

... to run on different numbers of cores. Parallel efficiency drops off as you approach 100 atoms per core.

Further, the factor of seven in the core count is a surefire way to be inefficient, because the domain decomposition will have to partition into seven domains in one direction. I would instead consider running three simulations per node, with 9, 10, and 9 cores per simulation, using gmx mdrun -nt x -pin on -pinoffset y for suitable x and y. But try the above experiment first.

Mark
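A sketch of that three-runs-per-node layout on a 28-core node; the directory names sim1-sim3 are hypothetical, and each directory is assumed to hold its own prepared tpr:

  (cd sim1 && gmx mdrun -nt 9  -pin on -pinoffset 0)  &
  (cd sim2 && gmx mdrun -nt 10 -pin on -pinoffset 9)  &
  (cd sim3 && gmx mdrun -nt 9  -pin on -pinoffset 19) &

Each run pins its threads to a disjoint block of cores (0-8, 9-18, and 19-27), so the three simulations do not compete for the same hardware.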
------------------------------

Message: 5
Date: Thu, 29 Jun 2017 18:46:02 -0600
From: Alex <nedoma...@gmail.com>
To: Discussion list for GROMACS users <gmx-us...@gromacs.org>
Subject: Re: [gmx-users] Anybody using Silica InterfaceFF on Gromacs?

> He he, childish :)

David, no offense intended. I just think that when applied to solids, the entire concept of what works so well for biomolecular systems becomes a bit of a joke. And vice versa, to be fair. Spoken from experience, really -- we here used Gromacs to simulate things that I keep telling people not to simulate with Gromacs, and it got published! :)

In any case, I second what was said above re: the number of exclusions. Solid-state potentials use smooth drop-offs to exclude long-range interactions between close neighbors, so looking into David's suggestion may in fact fix the issues immediately.

Alex
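For context, the number of exclusions in a GROMACS topology is set by the nrexcl field of [ moleculetype ] and can be extended per atom with an [ exclusions ] directive; a minimal sketch, with a hypothetical molecule name and atom indices:

  [ moleculetype ]
  ; name   nrexcl   ; exclude nonbonded interactions between atoms up to nrexcl bonds apart
  SIL      3

  [ exclusions ]
  ; ai   aj ...     ; atom ai additionally ignores nonbonded interactions with each aj
    1    5   9
    2    6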
------------------------------

End of gromacs.org_gmx-users Digest, Vol 158, Issue 187
*******************************************************

--
Gromacs Users mailing list

* Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a mail to gmx-users-requ...@gromacs.org.