[gmx-users] Core dumped in NVT
Hey all,

I'm trying to simulate a peptide-protein docked complex. I docked the peptide using the pepATTRACT package. After analysis, I selected the best pose and tried to simulate it. This yielded an error and a core dump during NVT equilibration. I'm using the GROMOS 54a7 force field. The error is pasted below:

GROMACS: gmx mdrun, VERSION 5.1.4
Executable: /usr/local/gromacs/bin/gmx
Data prefix: /usr/local/gromacs
Command line:
  gmx mdrun -v -deffnm nvt

Running on 1 node with total 2 cores, 4 logical cores
Hardware detected:
  CPU info:
    Vendor: GenuineIntel
    Brand: Intel(R) Core(TM) i3-4030U CPU @ 1.90GHz
    SIMD instructions most likely to fit this hardware: AVX2_256
    SIMD instructions selected at GROMACS compile time: AVX2_256

Reading file nvt.tpr, VERSION 5.1.4 (single precision)
Changing nstlist from 10 to 25, rlist from 1.4 to 1.434
Using 1 MPI thread
Using 4 OpenMP threads

Step 0, time 0 (ps)  LINCS WARNING
relative constraint deviation after LINCS:
rms 0.056487, max 0.946381 (between atoms 2400 and 2402)
bonds that rotated more than 30 degrees:
 atom 1 atom 2  angle  previous, current, constraint length
Wrote pdb files with previous and current coordinates

Step 0, time 0 (ps)  LINCS WARNING
relative constraint deviation after LINCS:
rms 0.220257, max 3.677571 (between atoms 94 and 95)
bonds that rotated more than 30 degrees:
 atom 1 atom 2  angle  previous, current, constraint length
     55     56   31.8    0.1015   0.0956      0.1000
     55     57   52.1    0.2349   0.2349      0.1470
     57     58   48.0    0.2224   0.2228      0.1530
     57     65   89.8    0.2299   0.3183      0.1530
     65     66   69.6    0.1477   0.2354      0.1230
     65     67   83.7    0.1946   0.2569      0.1330
     67     68   68.2    0.1047   0.1408      0.1000
     67     69   50.9    0.2300   0.2789      0.1470
     69     70   63.0    0.2227   0.3403      0.1530
     69     82   67.3    0.2495   0.3697      0.1530
     70     71   54.1    0.1729   0.2028      0.1530
     71     72   36.8    0.1991   0.2163      0.1530
     82     83   55.4    0.1253   0.1783      0.1230
     82     84   63.5    0.2539   0.2903      0.1330
     84     85   62.5    0.1027   0.1596      0.1000
     84     86   62.5    0.2388   0.4224      0.1470
     86     87   79.9    0.1946   0.1769      0.1530
     86     90   93.5    0.2100   0.2881      0.1530
     90     91   37.7    0.1291   0.3171      0.1230
     92     93  151.5    0.1057   0.1022      0.1000
     92     94  157.5    0.1820   0.2864      0.1470
     95     96  171.9    0.1609   0.1718      0.1530
     95     97  169.2    0.1610   0.1759      0.1530
     98     99  155.3    0.1402   0.1854      0.1230
     98    100  162.4    0.1610   0.2476      0.1330
   2386   2388  169.4    0.1544   0.2630      0.1470
   2388   2389  163.7    0.1629   0.3605      0.1530
   2392   2394  159.8    0.1330   0.2451      0.1330
   2394   2395  150.3    0.0977   0.1349      0.1000
   2396   2400  156.7    0.1776   0.1941      0.1530
   2397   2398  166.4    0.1396   0.0462      0.1430
   2400   2401  133.5    0.1069   0.1620      0.1230
   2402   2403   48.6    0.1129   0.1661      0.1000
   2402   2404  102.9    0.2098   0.2774      0.1470
   2404   2411   34.6    0.2530   0.4458      0.1530
   2405   2406  170.9    0.1595   0.1029      0.1530
   2411   2412   46.5    0.1265   0.0735      0.1230
   2411   2413   63.4    0.1713   0.1585      0.1330
   2413   2414   35.3    0.1044   0.1095      0.1000
   2413   2415   52.2    0.1872   0.0940      0.1470
   2420   2421   88.5    0.1262   0.0936      0.1230
   2420   2422   95.5    0.1656   0.2391      0.1330
   2422   2423   49.7    0.1164   0.1810      0.1000
   2422   2424   57.5    0.2148   0.3306      0.1470
   2424   2425   54.2    0.1857   0.1855      0.1530
   2424   2430   45.8    0.1815   0.1585      0.1530

Back Off! I just backed up step0b.pdb to ./#step0b.pdb.1#
Back Off! I just backed up step0c.pdb to ./#step0c.pdb.1#
Wrote pdb files with previous and current coordinates

starting mdrun 'Protein in water'
25 steps, 500.0 ps.

Step 0, time 0 (ps)  LINCS WARNING
relative constraint deviation after LINCS:
rms 3.611257, max 94.899414 (between atoms 57 and 65)
bonds that rotated more than 30 degrees:
 atom 1 atom 2  angle  previous, current, constraint length
     50     52   69.3    0.1533   0.0680      0.1470
     52     53   32.5    0.1728   0.2443      0.1530
     53     54   41.0    0.1367   0.2362      0.1230
     53     55  115.4    0.1420   1.1222      0.1330
     55     56  100.2    0.1015   1.1348      0.1000
     55     57  160.7    0.2349   3.2040      0.1470
     57     58  130.5    0.2224   3.9408      0.1530
     57     65  159.2    0.2299  14.6726      0.1530
     58     59  111.8    0.1708   0.5409      0.1530
     65     66  168.4    0.1477  11.3585      0.1230
     65     67   76.0    0.1946   9.0458      0.1330
     67     68   76.4    0.1047   0.8620      0.1000
     67     69  148.6
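[Editor's note: LINCS warnings at step 0 usually point to bad starting geometry — here, likely atomic clashes at the docked peptide-protein interface. A common approach is a more thorough energy minimization before NVT and a gentler start to equilibration. The fragment below is only a sketch: the file names and parameter values are assumptions, not taken from this thread.]

```
; minim.mdp - steepest-descent minimization before any equilibration
integrator = steep
emtol      = 1000.0   ; stop when Fmax < 1000 kJ mol^-1 nm^-1
emstep     = 0.01
nsteps     = 50000

; nvt.mdp - first NVT stage: restrained and with a short timestep
define     = -DPOSRES ; position-restrain heavy atoms (as set up by pdb2gmx)
dt         = 0.001    ; 1 fs until the interface has relaxed
```

If minimization converges but NVT still blows up at step 0, the step0b.pdb/step0c.pdb files that mdrun wrote can be inspected to locate the offending atoms.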
Re: [gmx-users] Error in gmx trjcat
> Hi,
>
> Probably something like an upgrade of the OS or compiler version changed
> the libraries that it linked to at run time and somehow the checks to
> prevent such problems didn't work. If you would recompile GROMACS on that
> machine, it would likely be fine.

Hey Mark,

Thanks for the suggestion. I'll try recompiling the package.

Best,
Azeem

> Mark
>
> On Wed, Apr 18, 2018, 05:37 Syed Azeem <syedazeemullah...@gmail.com> wrote:
>
>> Hey Mark,
>>
>> Still the same problem persists.
>>
>> > Hi,
>> >
>> > That looks like a broken build of GROMACS, rather than a bug in the code or
>> > a broken file. Can you try those gmx check commands on another installation
>> > somewhere else? (Also gmx check -m doesn't do anything useful any more, so
>> > don't buy trouble by using it!)
>>
>> Hey,
>>
>> I tried concatenating those trajectory files in another machine (same
>> version: gromacs 5.1.4). It worked well without any error.
>>
>> What is wrong with my own build?
>>
>> Best,
>> Azeem
>>
>> > Mark

--
Gromacs Users mailing list

* Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a mail to gmx-users-requ...@gromacs.org.
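[Editor's note: for readers hitting the same problem, a minimal rebuild along the lines Mark suggests might look like the sketch below. The version, paths, and -j count are assumptions; the GROMACS 5.1 install guide is the authoritative reference.]

```shell
tar xfz gromacs-5.1.4.tar.gz
cd gromacs-5.1.4
mkdir build && cd build
cmake .. -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON
make -j4
make check        # run the regression tests to catch a broken build early
sudo make install
source /usr/local/gromacs/bin/GMXRC
```

Running `make check` before installing would have caught a build linked against mismatched libraries, which is the failure mode suspected in this thread.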
Re: [gmx-users] Error in gmx trjcat
>> Hey Mark,
>>
>> Still the same problem persists.
>>
> Hi,
>
> That looks like a broken build of GROMACS, rather than a bug in the code or
> a broken file. Can you try those gmx check commands on another installation
> somewhere else? (Also gmx check -m doesn't do anything useful any more, so
> don't buy trouble by using it!)

Hey,

I tried concatenating those trajectory files in another machine (same
version: gromacs 5.1.4). It worked well without any error.

What is wrong with my own build?

Best,
Azeem

> Mark
Re: [gmx-users] Error in gmx trjcat
0136b000 08:07 2646399 /usr/local/gromacs/lib/libgromacs.so.1.4.0 7fa4f2eb4000-7fa4f2ebe000 rw-p 00:00 0 7fa4f2ebe000-7fa4f2ee4000 r-xp 08:07 2623394 /lib/x86_64-linux-gnu/ld-2.23.so 7fa4f30ba000-7fa4f30c3000 rw-p 00:00 0 7fa4f30e1000-7fa4f30e3000 rw-p 00:00 0 7fa4f30e3000-7fa4f30e4000 r--p 00025000 08:07 2623394 /lib/x86_64-linux-gnu/ld-2.23.so 7fa4f30e4000-7fa4f30e5000 rw-p 00026000 08:07 2623394 /lib/x86_64-linux-gnu/ld-2.23.so 7fa4f30e5000-7fa4f30e6000 rw-p 00:00 0 7fff72b83000-7fff72ba4000 rw-p 00:00 0 [stack] 7fff72bc5000-7fff72bc8000 r--p 00:00 0 [vvar] 7fff72bc8000-7fff72bca000 r-xp 00:00 0 [vdso] ff60-ff601000 r-xp 00:00 0 [vsyscall] Aborted (core dumped) > Hi, > > I meant to run > > gmx check -f 100cat.xtc > > gmx check -f new2.xtc > > on the separate files, to find out e.g. whether they are trajectories that > have matching numbers of particles, etc. > > Mark > > On Tue, Apr 17, 2018 at 1:13 PM Syed Azeem <syedazeemullah...@gmail.com> > wrote: > >> > Hi, >> > >> > What does gmx check report on the two input .xtc files? >> >> Hey Mark, >> >> Thanks for the reply. I did gmx check on both the .xtc files. Now, >> even this command is yielding the same error. 
>> >> My command: gmx check -f 100cat.xtc -f2 new2.xtc -m report.tex >> >> error: >> >> x[10172] ( 2.76900e+00 2.10100e+00 4.7e-01) - ( 1.20700e+00 >> 2.27200e+00 2.63000e-01) >> x[10173] ( 3.96200e+00 2.91800e+00 2.63000e-01) - ( 5.89000e-01 >> 4.41200e+00 4.86000e-01) >> box[0] ( 4.72250e+00 0.0e+00 0.0e+00) - ( 4.72142e+00 >> 0.0e+00 0.0e+00) >> box[1] ( 0.0e+00 4.72250e+00 0.0e+00) - ( 0.0e+00 >> 4.72142e+00 0.0e+00) >> box[2] ( 0.0e+00 0.0e+00 4.72250e+00) - ( 0.0e+00 >> 0.0e+00 4.72142e+00) >> *** Error in `gmx': munmap_chunk(): invalid pointer: 0x02139380 >> *** >> === Backtrace: = >> /lib/x86_64-linux-gnu/libc.so.6(+0x777e5)[0x7fce776767e5] >> /lib/x86_64-linux-gnu/libc.so.6(cfree+0x1a8)[0x7fce77683698] >> >> /usr/local/gromacs/bin/../lib/libgromacs.so.1(xdr3dfcoord+0x1387)[0x7fce786891d7] >> >> /usr/local/gromacs/bin/../lib/libgromacs.so.1(read_next_xtc+0x1bd)[0x7fce786ab39d] >> >> /usr/local/gromacs/bin/../lib/libgromacs.so.1(read_next_frame+0x267)[0x7fce786a7977] >> >> /usr/local/gromacs/bin/../lib/libgromacs.so.1(comp_trx+0x127)[0x7fce78a0c9e7] >> >> /usr/local/gromacs/bin/../lib/libgromacs.so.1(gmx_check+0x42e)[0x7fce78a016be] >> >> /usr/local/gromacs/bin/../lib/libgromacs.so.1(_ZN3gmx24CommandLineModuleManager3runEiPPc+0x334)[0x7fce784204c4] >> gmx(main+0x8c)[0x40c98c] >> /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf0)[0x7fce7761f830] >> gmx(_start+0x29)[0x40cb19] >> === Memory map: >> 0040-0043d000 r-xp 08:07 2646598 >> /usr/local/gromacs/bin/gmx >> 0063c000-0063d000 r--p 0003c000 08:07 2646598 >> /usr/local/gromacs/bin/gmx >> 0063d000-0063e000 rw-p 0003d000 08:07 2646598 >> /usr/local/gromacs/bin/gmx >> 0063e000-0063f000 rw-p 00:00 0 >> 020dc000-0219b000 rw-p 00:00 0 >> [heap] >> 7fce76db4000-7fce76dd5000 r-xp 08:07 2098798 >> /usr/lib/x86_64-linux-gnu/libgomp.so.1.0.0 >> 7fce76dd5000-7fce76fd4000 ---p 00021000 08:07 2098798 >> /usr/lib/x86_64-linux-gnu/libgomp.so.1.0.0 >> 7fce76fd4000-7fce76fd5000 r--p 0002 08:07 2098798 >> 
/usr/lib/x86_64-linux-gnu/libgomp.so.1.0.0 >> 7fce76fd5000-7fce76fd6000 rw-p 00021000 08:07 2098798 >> /usr/lib/x86_64-linux-gnu/libgomp.so.1.0.0 >> 7fce76fd6000-7fce76fee000 r-xp 08:07 2623397 >> /lib/x86_64-linux-gnu/libpthread-2.23.so >> 7fce76fee000-7fce771ed000 ---p 00018000 08:07 2623397 >> /lib/x86_64-linux-gnu/libpthread-2.23.so >> 7fce771ed000-7fce771ee000 r--p 00017000 08:07 2623397 >> /lib/x86_64-linux-gnu/libpthread-2.23.so >> 7fce771ee000-7fce771ef000 rw-p 00018000 08:07 2623397 >> /lib/x86_64-linux-gnu/libpthread-2.23.so >> 7fce771ef000-7fce771f3000 rw-p 00:00 0 >> 7fce771f3000-7fce771fa000 r-xp 08:07 2639498 >> /lib/x86_64-linux-gnu/librt-2.23.so >> 7fce771fa000-7fce773f9000 ---p 7000 08:07 2639498 >> /lib/x86_64-linux-gnu/librt-2.23.so >> 7fce773f9000-7fce773fa000 r--p 6000 08:07 2639498 >> /lib/x86_64-linux-gnu/librt-2.23.so >> 7fce773fa000-7fce773fb000 rw-p 7000 08:07 2639
Re: [gmx-users] Error in gmx trjcat
so.6.0.21 7fce78266000-7fce7826a000 rw-p 00:00 0 7fce7826a000-7fce795b7000 r-xp 08:07 2646399 /usr/local/gromacs/lib/libgromacs.so.1.4.0 7fce795b7000-7fce797b6000 ---p 0134d000 08:07 2646399 /usr/local/gromacs/lib/libgromacs.so.1.4.0 7fce797b6000-7fce797d5000 r--p 0134c000 08:07 2646399 /usr/local/gromacs/lib/libgromacs.so.1.4.0 7fce797d5000-7fce797f7000 rw-p 0136b000 08:07 2646399 /usr/local/gromacs/lib/libgromacs.so.1.4.0 7fce797f7000-7fce79801000 rw-p 00:00 0 7fce79801000-7fce79827000 r-xp 08:07 2623394 /lib/x86_64-linux-gnu/ld-2.23.so 7fce799fd000-7fce79a06000 rw-p 00:00 0 7fce79a24000-7fce79a26000 rw-p 00:00 0 7fce79a26000-7fce79a27000 r--p 00025000 08:07 2623394 /lib/x86_64-linux-gnu/ld-2.23.so 7fce79a27000-7fce79a28000 rw-p 00026000 08:07 2623394 /lib/x86_64-linux-gnu/ld-2.23.so 7fce79a28000-7fce79a29000 rw-p 00:00 0 7ffc72305000-7ffc72326000 rw-p 00:00 0 [stack] 7ffc7233b000-7ffc7233e000 r--p 00:00 0 [vvar] 7ffc7233e000-7ffc7234 r-xp 00:00 0 [vdso] ff60-ff601000 r-xp 00:00 0 [vsyscall] Aborted (core dumped) > > Mark > > On Tue, Apr 17, 2018 at 12:06 PM Syed Azeem <syedazeemullah...@gmail.com> > wrote: > >> Hey all, >> >> I'm experiencing an error while using trjcat command. Initially, I ran >> a simulation for 50 ns and then extended till 100ns. After extension, >> I concatenated the trajectories of both the runs using gmx trjcat. >> >> Then, again I extended the run from 100 ns to 500 ns. Finally, I tried >> to concatenate the previously concatenated 100 ns trajectory file with >> the final (500 ns) trajectory. This step yielded an error (pasted >> below). 
>> >> GROMACS: gmx trjcat, VERSION 5.1.4 >> Executable: /usr/local/gromacs/bin/gmx >> Data prefix: /usr/local/gromacs >> Command line: >> gmx trjcat -f 100cat.xtc new2.xtc -o cat500.xtc >> >> Reading frame 1 time 100010.000 >> >> Summary of files and start times used: >> >> FileStart time Time step >> - >>100cat.xtc0.000 ps 10.000 ps >> new2.xtc 10.000 ps 10.000 ps >> >> Reading frame 0 time0.000 >> >> Continue writing frames from 100cat.xtc t=0 ps, frame=0 >> Last frame 1 time 10.000 -> frame 9000 time 9.000 >> ps >> Reading frame 0 time 10.000 >> lasttime 0 >> >> Continue writing frames from new2.xtc t=10 ps, frame=1 >> *** Error in `gmx': munmap_chunk(): invalid pointer: >> 0x01455380 ***s >> === Backtrace: = >> /lib/x86_64-linux-gnu/libc.so.6(+0x777e5)[0x7f929a9a07e5] >> /lib/x86_64-linux-gnu/libc.so.6(cfree+0x1a8)[0x7f929a9ad698] >> >> /usr/local/gromacs/bin/../lib/libgromacs.so.1(xdr3dfcoord+0x1387)[0x7f929b9b31d7] >> >> /usr/local/gromacs/bin/../lib/libgromacs.so.1(read_next_xtc+0x1bd)[0x7f929b9d539d] >> >> /usr/local/gromacs/bin/../lib/libgromacs.so.1(read_next_frame+0x267)[0x7f929b9d1977] >> >> /usr/local/gromacs/bin/../lib/libgromacs.so.1(gmx_trjcat+0x32c2)[0x7f929bb64842] >> >> /usr/local/gromacs/bin/../lib/libgromacs.so.1(_ZN3gmx24CommandLineModuleManager3runEiPPc+0x334)[0x7f929b74a4c4] >> gmx(main+0x8c)[0x40c98c] >> /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf0)[0x7f929a949830] >> gmx(_start+0x29)[0x40cb19] >> === Memory map: >> 0040-0043d000 r-xp 08:07 2646598 >> /usr/local/gromacs/bin/gmx >> 0063c000-0063d000 r--p 0003c000 08:07 2646598 >> /usr/local/gromacs/bin/gmx >> 0063d000-0063e000 rw-p 0003d000 08:07 2646598 >> /usr/local/gromacs/bin/gmx >> 0063e000-0063f000 rw-p 00:00 0 >> 013f8000-014b7000 rw-p 00:00 0 >> [heap] >> 7f929a0de000-7f929a0ff000 r-xp 08:07 2098798 >> /usr/lib/x86_64-linux-gnu/libgomp.so.1.0.0 >> 7f929a0ff000-7f929a2fe000 ---p 00021000 08:07 2098798 >> /usr/lib/x86_64-linux-gnu/libgomp.so.1.0.0 >> 7f929a2fe000-7f929a2ff000 
r--p 0002 08:07 2098798 >> /usr/lib/x86_64-linux-gnu/libgomp.so.1.0.0 >> 7f929a2ff000-7f929a30 rw-p 00021000 08:07 2098798 >> /usr/lib/x86_64-linux-gnu/libgomp.so.1.0.0 >> 7f929a30-7f929a318000 r-xp 08:07 2623397 >> /lib/x86_64-linux-gnu/libpthread-2.23.so >> 7f929a318000-7f929a517000 ---p 00018000 08:07 2623397 >> /lib/x86_64-linux-gnu/libpthread-2.23.so >> 7f929a517000-7
[gmx-users] Error in gmx trjcat
Hey all, I'm experiencing an error while using trjcat command. Initially, I ran a simulation for 50 ns and then extended till 100ns. After extension, I concatenated the trajectories of both the runs using gmx trjcat. Then, again I extended the run from 100 ns to 500 ns. Finally, I tried to concatenate the previously concatenated 100 ns trajectory file with the final (500 ns) trajectory. This step yielded an error (pasted below). GROMACS: gmx trjcat, VERSION 5.1.4 Executable: /usr/local/gromacs/bin/gmx Data prefix: /usr/local/gromacs Command line: gmx trjcat -f 100cat.xtc new2.xtc -o cat500.xtc Reading frame 1 time 100010.000 Summary of files and start times used: FileStart time Time step - 100cat.xtc0.000 ps 10.000 ps new2.xtc 10.000 ps 10.000 ps Reading frame 0 time0.000 Continue writing frames from 100cat.xtc t=0 ps, frame=0 Last frame 1 time 10.000 -> frame 9000 time 9.000 ps Reading frame 0 time 10.000 lasttime 0 Continue writing frames from new2.xtc t=10 ps, frame=1 *** Error in `gmx': munmap_chunk(): invalid pointer: 0x01455380 ***s === Backtrace: = /lib/x86_64-linux-gnu/libc.so.6(+0x777e5)[0x7f929a9a07e5] /lib/x86_64-linux-gnu/libc.so.6(cfree+0x1a8)[0x7f929a9ad698] /usr/local/gromacs/bin/../lib/libgromacs.so.1(xdr3dfcoord+0x1387)[0x7f929b9b31d7] /usr/local/gromacs/bin/../lib/libgromacs.so.1(read_next_xtc+0x1bd)[0x7f929b9d539d] /usr/local/gromacs/bin/../lib/libgromacs.so.1(read_next_frame+0x267)[0x7f929b9d1977] /usr/local/gromacs/bin/../lib/libgromacs.so.1(gmx_trjcat+0x32c2)[0x7f929bb64842] /usr/local/gromacs/bin/../lib/libgromacs.so.1(_ZN3gmx24CommandLineModuleManager3runEiPPc+0x334)[0x7f929b74a4c4] gmx(main+0x8c)[0x40c98c] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf0)[0x7f929a949830] gmx(_start+0x29)[0x40cb19] === Memory map: 0040-0043d000 r-xp 08:07 2646598 /usr/local/gromacs/bin/gmx 0063c000-0063d000 r--p 0003c000 08:07 2646598 /usr/local/gromacs/bin/gmx 0063d000-0063e000 rw-p 0003d000 08:07 2646598 /usr/local/gromacs/bin/gmx 0063e000-0063f000 
rw-p 00:00 0 013f8000-014b7000 rw-p 00:00 0 [heap] 7f929a0de000-7f929a0ff000 r-xp 08:07 2098798 /usr/lib/x86_64-linux-gnu/libgomp.so.1.0.0 7f929a0ff000-7f929a2fe000 ---p 00021000 08:07 2098798 /usr/lib/x86_64-linux-gnu/libgomp.so.1.0.0 7f929a2fe000-7f929a2ff000 r--p 0002 08:07 2098798 /usr/lib/x86_64-linux-gnu/libgomp.so.1.0.0 7f929a2ff000-7f929a30 rw-p 00021000 08:07 2098798 /usr/lib/x86_64-linux-gnu/libgomp.so.1.0.0 7f929a30-7f929a318000 r-xp 08:07 2623397 /lib/x86_64-linux-gnu/libpthread-2.23.so 7f929a318000-7f929a517000 ---p 00018000 08:07 2623397 /lib/x86_64-linux-gnu/libpthread-2.23.so 7f929a517000-7f929a518000 r--p 00017000 08:07 2623397 /lib/x86_64-linux-gnu/libpthread-2.23.so 7f929a518000-7f929a519000 rw-p 00018000 08:07 2623397 /lib/x86_64-linux-gnu/libpthread-2.23.so 7f929a519000-7f929a51d000 rw-p 00:00 0 7f929a51d000-7f929a524000 r-xp 08:07 2639498 /lib/x86_64-linux-gnu/librt-2.23.so 7f929a524000-7f929a723000 ---p 7000 08:07 2639498 /lib/x86_64-linux-gnu/librt-2.23.so 7f929a723000-7f929a724000 r--p 6000 08:07 2639498 /lib/x86_64-linux-gnu/librt-2.23.so 7f929a724000-7f929a725000 rw-p 7000 08:07 2639498 /lib/x86_64-linux-gnu/librt-2.23.so 7f929a725000-7f929a728000 r-xp 08:07 2623462 /lib/x86_64-linux-gnu/libdl-2.23.so 7f929a728000-7f929a927000 ---p 3000 08:07 2623462 /lib/x86_64-linux-gnu/libdl-2.23.so 7f929a927000-7f929a928000 r--p 2000 08:07 2623462 /lib/x86_64-linux-gnu/libdl-2.23.so 7f929a928000-7f929a929000 rw-p 3000 08:07 2623462 /lib/x86_64-linux-gnu/libdl-2.23.so 7f929a929000-7f929aae9000 r-xp 08:07 2623402 /lib/x86_64-linux-gnu/libc-2.23.so 7f929aae9000-7f929ace9000 ---p 001c 08:07 2623402 /lib/x86_64-linux-gnu/libc-2.23.so 7f929ace9000-7f929aced000 r--p 001c 08:07 2623402 /lib/x86_64-linux-gnu/libc-2.23.so 7f929aced000-7f929acef000 rw-p 001c4000 08:07 2623402 /lib/x86_64-linux-gnu/libc-2.23.so 7f929acef000-7f929acf3000 rw-p 00:00 0 7f929acf3000-7f929ad09000 r-xp 08:07 2626206 /lib/x86_64-linux-gnu/libgcc_s.so.1 7f929ad09000-7f929af08000 ---p 
00016000 08:07 2626206 /lib/x86_64-linux-gnu/libgcc_s.so.1 7f929af08000-7f929af09000 rw-p 00015000 08:07 2626206 /lib/x86_64-linux-gnu/libgcc_s.so.1 7f929af09000-7f929b011000 r-xp 08:07 2621457 /lib/x86_64-linux-gnu/libm-2.23.so 7f929b011000-7f929b21 ---p 00108000 08:07 2621457 /lib/x86_64-linux-gnu/libm-2.23.so 7f929b21-7f929b211000 r--p 00107000 08:07 2621457 /lib/x86_64-linux-gnu/libm-2.23.so 7f929b211000-7f929b212000 rw-p 00108000 08:07 2621457 /lib/x86_64-linux-gnu/libm-2.23.so
[gmx-users] gmx gpu error gmx: error libnvidia-ml.so.1
Hi all,

I tried installing GROMACS 2016.4 on new hardware with an Nvidia Quadro K420 (2 GB). The installation completed successfully, following the quick-and-dirty method, but gmx is not working. The error says:

gmx: error while loading shared libraries: libnvidia-ml.so.1: cannot open shared object file: No such file or directory

My cmake command:

cmake '/home/tt635kkb/Documents/gromacs-2016.4' -DGMX_BUILD_OWN_FFTW=ON -DGMX_SIMD=AVX2_256 -DGMX_GPU=on -DCUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda-9.0

For the toolkit, I did a runfile installation from Nvidia.

Please help me out with this error.

Thanks in advance,
Azeem
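[Editor's note: libnvidia-ml.so.1 ships with the NVIDIA driver, not with the CUDA toolkit, so a runfile toolkit install alone does not provide it. A first diagnostic might look like this sketch; the install paths are assumptions and vary by distribution.]

```shell
# Show which shared-library dependencies the gmx binary cannot resolve
ldd /usr/local/gromacs/bin/gmx | grep "not found"

# Locate the library; it is installed by the NVIDIA driver package
find /usr -name 'libnvidia-ml.so.1' 2>/dev/null

# Assuming it turned up in /usr/lib/nvidia (hypothetical path),
# make the runtime linker see it
export LD_LIBRARY_PATH=/usr/lib/nvidia:$LD_LIBRARY_PATH
```

If `find` turns up nothing, the driver itself is missing or was not installed by the runfile, which would explain the error.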
Re: [gmx-users] New files after simulation run
> Hi All,
>
> I have run an extended simulation for 90 ns like this (my previous run was
> for 10 ns):
>
> grompp -f new.mdp -c old.tpr -o new.tpr
> mdrun -s new.tpr -cpi old.cpt
>
> I output mdrun STDOUT to an output file and it looks like this:
>
> ##
> Back Off! I just backed up traj.trr to ./#traj.trr.2#
> Back Off! I just backed up traj_comp.xtc to ./#traj_comp.xtc.2#
> Back Off! I just backed up ener.edr to ./#ener.edr.2#
> starting mdrun 'Protein in water'
> 9000 steps, 9.0 ps (continuing from step 1000, 1.0 ps).
>
> NOTE: Turning on dynamic load balancing
>
> Writing final coordinates.
>
> Average load imbalance: 2.3 %
> Part of the total run time spent waiting due to load imbalance: 0.8 %
> Steps where the load balancing was limited by -rdd, -rcon and/or -dds: X 0 % Y 0 %
> Average PME mesh/force load: 0.716
> Part of the total run time spent waiting due to PP/PME imbalance: 3.3 %
>
>                Core t (s)   Wall t (s)      (%)
>        Time: 9377991.988   379495.043   2471.2
>                        4d09h24:55
>                  (ns/day)    (hour/ns)
> Performance:       18.214        1.318
> ##
>
> I was expecting the new files to be generated with names "new.xtc",
> "new.edr", etc., but I see in the output file the files generated are:
> (moreover, there is no "new.log" file after the run finished)

You didn't pass the 'default filename' flag (-deffnm) in the mdrun command.

> traj_comp.xtc, traj.trr and ener.edr.
>
> Can someone please tell me why this happened or if something wrong has
> happened during the run?
>
> Thanks,
> DK
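[Editor's note: to make the -deffnm point concrete, the corrected commands would look like the sketch below, reusing the file names from the post. With -deffnm, every output file takes the given stem.]

```shell
grompp -f new.mdp -c old.tpr -o new.tpr
mdrun -s new.tpr -cpi old.cpt -deffnm new   # writes new.log, new.xtc, new.trr, new.edr, ...
```

Without the flag, mdrun falls back to its built-in default names (traj.trr, traj_comp.xtc, ener.edr, md.log), which is exactly what DK observed.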
Re: [gmx-users] Related to fluctuation in Pressure
> Dear all,
>
> I have done an MD simulation at 1 bar pressure, but when I plot the
> pressure after the simulation, the plot shows fluctuations ranging from
> -200 to 200 bar. Is this fluctuation acceptable?

Hey,

This is quite normal. See:
http://www.gromacs.org/Documentation/Terminology/Pressure

Do look at the average pressure over the equilibrated part of the run rather than the instantaneous values.

Azeem

> Thanks in advance
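[Editor's note: the running average is easy to extract with gmx energy; selecting the Pressure term prints its average and RMSD over the analyzed interval, which is the number to compare against the 1 bar reference. The .edr file name and the -b value below are assumptions.]

```shell
# Non-interactive selection of the Pressure term from the energy file;
# -b 1000 skips the first 1000 ps as equilibration (adjust to your run)
echo Pressure | gmx energy -f ener.edr -b 1000 -o pressure.xvg
```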
Re: [gmx-users] Slow MDRUN performance
> > Hi, > > On Wed, Jun 21, 2017 at 12:47 PM Syed Azeem <syedazeemullah...@gmail.com> > wrote: > >> Hi all, >> >> I installed gromacs 5.1.2 on a new machine. >> Config: Intel Core i5 6500 3.20 GHz Quad Core Skylake with 8 GB DDR4. >> >> Then I started a NVT simulation for 500ps with ~25 atom system. >> Surprisingly, the estimated time was 4 days and 3 hours. >> >> There was a note stating consider rebuilding gmx with the >> GMX_USE_RDTSCP=ON CMake option, as follows: >> >> GROMACS: gmx mdrun, VERSION 5.1.2 >> Executable: /usr/local/gromacs/bin/gmx >> Data prefix: /usr/local/gromacs >> Command line: >> gmx mdrun -v -deffnm nvt >> >> >> Running on 1 node with total 4 cores, 4 logical cores >> Hardware detected: >> CPU info: >> Vendor: GenuineIntel >> Brand: Intel(R) Core(TM) i5-6500 CPU @ 3.20GHz >> SIMD instructions most likely to fit this hardware: AVX2_256 >> SIMD instructions selected at GROMACS compile time: None >> >> Compiled SIMD instructions: None, GROMACS could use AVX2_256 on this >> machine, which is better >> >> >> The current CPU can measure timings more accurately than the code in >> gmx was configured to use. This might affect your simulation >> speed as accurate timings are needed for load-balancing. >> Please consider rebuilding gmx with the GMX_USE_RDTSCP=ON CMake option. >> >> Reading file em.tpr, VERSION 5.1.2 (single precision) >> Using 1 MPI thread >> Using 4 OpenMP threads >> >> >> WARNING: Using the slow plain C kernels. This should >> not happen during routine usage on supported platforms. >> >> Then, I rebuilt gromacs with the CMAKE GMX_USE_RDTSCP=ON, as suggested. >> >> Again the performance is slow with the same note being generated. >> >> PS: For comparison, I ran 500 ps NVT simulation of the same ~25 >> atom system on another machine of the same config and finished within >> 7 hours. >> >> What mistake am I making in the installation? >> > > We can't tell, because you didn't tell us about how the installation was > done. 
;-) But somehow everything useful got turned off, so don't do that.
> You should use the quick and dirty guide at
> http://manual.gromacs.org/documentation/5.1.4/install-guide/index.html,
> configure on the node you'll run on, and install the latest in the 5.1
> series, not an older one with extra bugs ;-)
>
> Mark

Thanks for the reply, Mark.

I followed the quick and dirty installation from the GROMACS installation
documentation. Now I have rectified the problem by rebuilding GROMACS with
GMX_USE_RDTSCP=ON and -DGMX_SIMD=AVX2_256, which GROMACS suggested would be
better for my machine.

Azeem

>> Thanks in advance,
>>
>> Azeem
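[Editor's note: a configure line matching what the mdrun output suggested might look like this sketch. It assumes a clean build directory under the unpacked source tree; note the SIMD value is AVX2_256.]

```shell
cmake .. -DGMX_SIMD=AVX2_256 -DGMX_USE_RDTSCP=ON -DGMX_BUILD_OWN_FFTW=ON
make -j4
sudo make install
```

A stale CMake cache from a previous configure can silently keep SIMD disabled, which is one way "everything useful" ends up turned off; starting from an empty build directory avoids that.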
[gmx-users] Slow MDRUN performance
Hi all,

I installed gromacs 5.1.2 on a new machine.
Config: Intel Core i5 6500 3.20 GHz Quad Core Skylake with 8 GB DDR4.

Then I started an NVT simulation for 500 ps with a ~25 atom system.
Surprisingly, the estimated time was 4 days and 3 hours.

There was a note stating to consider rebuilding gmx with the
GMX_USE_RDTSCP=ON CMake option, as follows:

GROMACS: gmx mdrun, VERSION 5.1.2
Executable: /usr/local/gromacs/bin/gmx
Data prefix: /usr/local/gromacs
Command line:
  gmx mdrun -v -deffnm nvt

Running on 1 node with total 4 cores, 4 logical cores
Hardware detected:
  CPU info:
    Vendor: GenuineIntel
    Brand: Intel(R) Core(TM) i5-6500 CPU @ 3.20GHz
    SIMD instructions most likely to fit this hardware: AVX2_256
    SIMD instructions selected at GROMACS compile time: None

Compiled SIMD instructions: None, GROMACS could use AVX2_256 on this
machine, which is better

The current CPU can measure timings more accurately than the code in
gmx was configured to use. This might affect your simulation
speed as accurate timings are needed for load-balancing.
Please consider rebuilding gmx with the GMX_USE_RDTSCP=ON CMake option.

Reading file em.tpr, VERSION 5.1.2 (single precision)
Using 1 MPI thread
Using 4 OpenMP threads

WARNING: Using the slow plain C kernels. This should
not happen during routine usage on supported platforms.

Then, I rebuilt gromacs with the CMake option GMX_USE_RDTSCP=ON, as suggested.

Again the performance is slow, with the same note being generated.

PS: For comparison, I ran a 500 ps NVT simulation of the same ~25
atom system on another machine of the same config and it finished within
7 hours.

What mistake am I making in the installation?

Thanks in advance,

Azeem
Re: [gmx-users] Reg: creation of separate chains in .pdb file
> On 5/29/17 8:00 AM, Syed Azeem wrote:
>> Hey all,
>>
>> I simulated a protein-peptide docked complex. Post simulation, I
>> created an index file selecting only the Protein group
>> (protein-peptide complex). Then using editconf, I created a .pdb file
>> for the same.
>>
>> When I view the prtn.pdb file, only the protein is available but not
>> the peptide. Still, the prtn.pdb file has coordinates for the peptide as
>> well. The pdb file also lacks a chain identifier, which was present in
>
> If you're having trouble viewing the coordinates, that's a problem with the
> viewer itself. You say that the coordinates are there (as they should be) so
> there's no reason they can't be visualized if rendered properly.
>
>> the initial structure.
>>
>> How to overcome this?
>
> Add suitable chain identifiers back into the coordinate file.

Hey Justin,

I tried adding chain identifiers named A & B to the file. It creates
another linear peptide chain away from the main complex when I view it in
PyMOL.

> -Justin
>
> --
> ==
>
> Justin A. Lemkul, Ph.D.
> Ruth L. Kirschstein NRSA Postdoctoral Fellow
>
> Department of Pharmaceutical Sciences
> School of Pharmacy
> Health Sciences Facility II, Room 629
> University of Maryland, Baltimore
> 20 Penn St.
> Baltimore, MD 21201
>
> jalem...@outerbanks.umaryland.edu | (410) 706-7441
> http://mackerell.umaryland.edu/~jalemkul
>
> ==
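[Editor's note: stamping chain identifiers into column 22 of the ATOM records can be scripted. Below is a minimal self-contained sketch: the two sample lines stand in for the real prtn.pdb, and the residue cutoff (2) is an assumption — substitute the residue number where your peptide starts. Separately, a peptide drawn far away from the complex can also be a periodic-image effect, for which gmx trjconv with -pbc mol is the usual remedy, independent of chain labels.]

```shell
# Two sample ATOM records standing in for prtn.pdb (chain column 22 is blank)
printf 'ATOM      1  N   MET     1      11.104   6.134  -6.504\n' >  prtn.pdb
printf 'ATOM     25  N   GLY     3       1.104   2.134   0.504\n' >> prtn.pdb

# Residues 1..cut -> chain A (protein), the rest -> chain B (peptide)
awk -v cut=2 '
  /^ATOM|^HETATM/ {
    res = substr($0, 23, 4) + 0                 # residue number, cols 23-26
    ch  = (res <= cut) ? "A" : "B"              # choose the chain letter
    $0  = substr($0, 1, 21) ch substr($0, 23)   # overwrite column 22
  }
  { print }
' prtn.pdb > prtn_chains.pdb
cat prtn_chains.pdb
```

Because the PDB format is fixed-width, the script rewrites column 22 in place rather than splitting on whitespace, so the coordinate columns are left untouched.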
[gmx-users] Reg: creation of separate chains in .pdb file
Hey all,

I simulated a protein-peptide docked complex. Post simulation, I created an index file selecting only the Protein group (protein-peptide complex). Then, using editconf, I created a .pdb file for the same.

When I view the prtn.pdb file, only the protein is available but not the peptide. Still, the prtn.pdb file has coordinates for the peptide as well. The pdb file also lacks a chain identifier, which was present in the initial structure.

How to overcome this?

Thanks in advance,
Azeem
Re: [gmx-users] Grompp Fatal Error (Syed Azeem)
Hey Mark,

Thanks for the reply. I even tried other force fields, which yielded the same errors. For the GROMOS 53a6 ff, "No default Ryckaert-Bell types" was included in the error.

> Hi,
>
> grompp will always need angle type definitions for the angles in the .itp
> file - they have to either be in the force field already, or added to it.
> If they are not present in amber99sb-ildn, then you need to understand how
> the topologies were generated and how they are intended to be used in order
> to make an appropriate choice.
>
> Mark
>
> On Fri, May 5, 2017 at 12:48 PM Syed Azeem <syedazeemullah...@gmail.com> wrote:
>
>> Hi all,
>>
>> I'm trying to simulate a Protein-Peptide docked complex.
>> I used Amber99sb-ildn ff with TIP3P. Since mine is a docked complex,
>> two separate topology and position restraint files were created as
>> chains A & B, automatically.
>>
>> When I tried to generate the .tpr file for adding ions to the system,
>> I ended up with 3 errors in the topology file of chain A. The errors
>> corresponded to [angles] and [dihedrals] of chain A.
>>
>> Error message:
>>
>> ERROR 1 [file topol_Protein_chain_A.itp, line 46901]:
>>   No default Angle types
>>
>> ERROR 2 [file topol_Protein_chain_A.itp, line 65796]:
>>   No default Proper Dih. types
>>
>> ERROR 3 [file topol_Protein_chain_A.itp, line 65798]:
>>   No default Proper Dih. types
>>
>> How to overcome this error?
>>
>> Thanks in advance
>>
>> Azeem
[gmx-users] grompp Fatal Error
Hi all, I'm trying to simulate a Protein-Peptide docked complex. I used Amber99sb-ildn ff with TIP3P. Since mine is a docked complex, two separate topology and position restraint files were created as chains A & B, automatically. When I tried to generate the .tpr file for adding ions to the system, I ended up with 3 errors in the topology file of chain A. The errors corresponded to [angles] and [dihedrals] of chain A. Error message: ERROR 1 [file topol_Protein_chain_A.itp, line 46901]: No default Angle types ERROR 2 [file topol_Protein_chain_A.itp, line 65796]: No default Proper Dih. types ERROR 3 [file topol_Protein_chain_A.itp, line 65798]: No default Proper Dih. types How can I overcome this error? Thanks in advance Azeem
Re: [gmx-users] PDB2GMX Fatal Error
>>> Use -ignh in pdb2gmx. >> Hi Reza, >> >> I tried -ignh, it's working fine. >> >> But I need to calculate H-bonds after docking the same peptide. Then >> the same error will crop up and I won't be able to calculate H-bonds. >> > > The use of -ignh has nothing to do with whether or not you can calculate > hydrogen bonds. > > The more complete explanation about -ignh is that H atoms in the input > coordinates are ignored, then rebuilt according to the force field's .hdb > file. > Your problem was that your H atom nomenclature was non-standard, or at least > did > not conform to the force field's expectations, so you got a fatal error. > You > can easily confirm in the topology and coordinates produced when using -ignh > > that in fact all of the H you need are there. pdb2gmx just rebuilt them so > you > didn't have to waste time renaming all your non-conforming H atoms. > > -Justin Now I understand the -ignh option. Thanks for the explanation. -Azeem >>>> On Mar 1, 2017, at 10:22, Syed Azeem <syedazeemullah...@gmail.com> >>>> wrote: >>>> >>>> Hi all, >>>> >>>> I tried passing a predicted peptide (16-mer) into GMX and ended up >>>> with a fatal error regarding hydrogen. I tried ignoring the hydrogens >>>> using the -ignh command. But I'll need to calculate H-bonds for the next >>>> analysis, as I'll dock this peptide into a target protein. >>>> >>>> Fatal error: >>>> Atom HB3 in residue ASN 1 was not found in rtp entry NASN with 16 atoms >>>> while sorting atoms. >>>> >>>> For a hydrogen, this can be a different protonation state, or it >>>> might have had a different number in the PDB file and was rebuilt >>>> (it might for instance have been H3, and we only expected H1 & H2). >>>> Note that hydrogens might have been added to the entry for the >>>> N-terminus. >>>> Remove this hydrogen or choose a different protonation state to solve >>>> it. >>>> Option -ignh will ignore all hydrogens in the input. >>>> >>>> How to solve this issue?
>>>> >>>> Thanks in advance >>>> >>>> Azeem > > -- > == > > Justin A. Lemkul, Ph.D. > Ruth L. Kirschstein NRSA Postdoctoral Fellow > > Department of Pharmaceutical Sciences > School of Pharmacy > Health Sciences Facility II, Room 629 > University of Maryland, Baltimore > 20 Penn St. > Baltimore, MD 21201 > > jalem...@outerbanks.umaryland.edu | (410) 706-7441 > http://mackerell.umaryland.edu/~jalemkul > > ==
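For the archive, a sketch of the rebuilt-hydrogen workflow described above (file names are placeholders):

```shell
# -ignh discards the H atoms in the input; pdb2gmx then rebuilds them
# from the force field's .hdb definitions with conforming names
gmx pdb2gmx -f peptide.pdb -ignh -o processed.gro -p topol.top

# The rebuilt hydrogens are ordinary atoms, so H-bond analysis on a
# later trajectory is unaffected
gmx hbond -f md.xtc -s md.tpr -num hbnum.xvg
```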
Re: [gmx-users] PDB2GMX Fatal Error
> > Use -ignh in pdb2gmx. Hi Reza, I tried -ignh, it's working fine. But I need to calculate H-bonds after docking the same peptide. Then the same error will crop up and I won't be able to calculate H-bonds. >> On Mar 1, 2017, at 10:22, Syed Azeem <syedazeemullah...@gmail.com> wrote: >> >> Hi all, >> >> I tried passing a predicted peptide (16-mer) into GMX and ended up >> with a fatal error regarding hydrogen. I tried ignoring the hydrogens >> using the -ignh command. But I'll need to calculate H-bonds for the next >> analysis, as I'll dock this peptide into a target protein. >> >> Fatal error: >> Atom HB3 in residue ASN 1 was not found in rtp entry NASN with 16 atoms >> while sorting atoms. >> >> For a hydrogen, this can be a different protonation state, or it >> might have had a different number in the PDB file and was rebuilt >> (it might for instance have been H3, and we only expected H1 & H2). >> Note that hydrogens might have been added to the entry for the >> N-terminus. >> Remove this hydrogen or choose a different protonation state to solve it. >> Option -ignh will ignore all hydrogens in the input. >> >> How to solve this issue? >> >> Thanks in advance >> >> Azeem
[gmx-users] PDB2GMX Fatal Error
Hi all, I tried passing a predicted peptide (16-mer) into GMX and ended up with a fatal error regarding hydrogen. I tried ignoring the hydrogens using the -ignh command. But I'll need to calculate H-bonds for the next analysis, as I'll dock this peptide into a target protein. Fatal error: Atom HB3 in residue ASN 1 was not found in rtp entry NASN with 16 atoms while sorting atoms. For a hydrogen, this can be a different protonation state, or it might have had a different number in the PDB file and was rebuilt (it might for instance have been H3, and we only expected H1 & H2). Note that hydrogens might have been added to the entry for the N-terminus. Remove this hydrogen or choose a different protonation state to solve it. Option -ignh will ignore all hydrogens in the input. How to solve this issue? Thanks in advance Azeem
Re: [gmx-users] Implicit and Explicit solvation
Hi Justin, Thank you so much for the info. I'll perform the simulation with explicit solvent and calculate the free energy. Azeem > MM/PBSA is a post-processing (analysis) method in which you provide existing > snapshots and the MM energy is combined with a PBSA estimate of solvation > free energy to give the free energy of the system. The (unofficial) g_mmpbsa > tool does this. It does not require additional simulations. > > -Justin > >> >>> From: Timofey Tyugashev <tyugas...@niboch.nsc.ru> >>> To: gromacs.org_gmx-users@maillist.sys.kth.se >>> Subject: Re: [gmx-users] Implicit and Explicit solvation >>> Message-ID: <17ce8acf-3566-f23f-3931-77ed00c37...@niboch.nsc.ru> >>> Content-Type: text/plain; charset=UTF-8; format=flowed >>> >>> Implicit solvation treats solvent as a continuous medium (a kind of >>> modifier for , while explicit one treats it as a set of explicit >>> (surprise!) particles. I guess you can start reading from the relevant >>> Wikipedia articles. >>> Implicit is way faster to compute, and the system itself also settles far >>> quicker, but it's considered more inaccurate and harder to optimize for >>> GPU. It's also possible to have a hybrid approach, where the whole system >>> is solvated implicitly, but select solvent molecules are explicitly >>> included in the model. >>> >>> In GROMACS implicit solvent is deprecated, so you should either use >>> explicit solvent, or pick a different MD engine if you want implicit.
>>> 23.12.2016 18:00, gromacs.org_gmx-users-requ...@maillist.sys.kth.se >>> wrote: >>> Message: 1 >>> Date: Fri, 23 Dec 2016 16:25:33 +0530 >>> From: Syed Azeem<syedazeemullah...@gmail.com> >>> To: "gromacs.org_gmx-users" >>> <gromacs.org_gmx-users@maillist.sys.kth.se> >>> Subject: [gmx-users] Implicit and Explicit solvation >>> Message-ID: >>> >>> <cajno2sbr39h+pgx0cb-fcfbvrgisrcgqe0ddxdj4_f1ph2e...@mail.gmail.com> >>> Content-Type: text/plain; charset=UTF-8 >>> >>> Hi all, >>> >>> What's the difference between Implicit and Explicit solvation? >>> Is there any difference in setting up the system for simulation? >>> Which is computationally efficient? >>> >>> Thanks in advance >>> >>> Azeem
Re: [gmx-users] Implicit and Explicit solvation
Thanks Tyugashev for your response. It was a reviewer suggestion to run our protein-peptide docked complex in a continuum solvent and compute their free energies of association. After this, I came across MM/PBSA and its GROMACS tool g_mmpbsa. Is this not based on implicit solvation? How can I perform it to validate my docking? Thanks in advance Azeem >From: Timofey Tyugashev <tyugas...@niboch.nsc.ru> >To: gromacs.org_gmx-users@maillist.sys.kth.se >Subject: Re: [gmx-users] Implicit and Explicit solvation >Message-ID: <17ce8acf-3566-f23f-3931-77ed00c37...@niboch.nsc.ru> >Content-Type: text/plain; charset=UTF-8; format=flowed > >Implicit solvation treats solvent as a continuous medium (a kind of >modifier for , while explicit one treats it as a set of explicit >(surprise!) particles. I guess you can start reading from the relevant >Wikipedia articles. >Implicit is way faster to compute, and the system itself also settles far >quicker, but it's considered more inaccurate and harder to optimize for >GPU. It's also possible to have a hybrid approach, where the whole system >is solvated implicitly, but select solvent molecules are explicitly >included in the model. > >In GROMACS implicit solvent is deprecated, so you should either use >explicit solvent, or pick a different MD engine if you want implicit. >23.12.2016 18:00, gromacs.org_gmx-users-requ...@maillist.sys.kth.se wrote: > Message: 1 > Date: Fri, 23 Dec 2016 16:25:33 +0530 > From: Syed Azeem<syedazeemullah...@gmail.com> > To: "gromacs.org_gmx-users" > <gromacs.org_gmx-users@maillist.sys.kth.se> > Subject: [gmx-users] Implicit and Explicit solvation > Message-ID: > <cajno2sbr39h+pgx0cb-fcfbvrgisrcgqe0ddxdj4_f1ph2e...@mail.gmail.com> > Content-Type: text/plain; charset=UTF-8 > > Hi all, > > What's the difference between Implicit and Explicit solvation? > Is there any difference in setting up the system for simulation? > Which is computationally efficient?
> > Thanks in advance > > Azeem
[gmx-users] Implicit and Explicit solvation
Hi all, What's the difference between Implicit and Explicit solvation? Is there any difference in setting up the system for simulation? Which is computationally efficient? Thanks in advance Azeem
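For context, setting up an explicit-solvent system in GROMACS typically involves the steps below (a sketch; file names and the 1.0 nm box margin are illustrative, not recommendations). With implicit solvation, these steps are skipped entirely, which is part of why it is cheaper:

```shell
# Put the solute in a cubic box with >= 1.0 nm to every edge
gmx editconf -f processed.gro -o boxed.gro -c -d 1.0 -bt cubic

# Fill the box with explicit water molecules
gmx solvate -cp boxed.gro -cs spc216.gro -o solvated.gro -p topol.top

# Replace some waters with counter-ions to neutralize the system
gmx grompp -f ions.mdp -c solvated.gro -p topol.top -o ions.tpr
gmx genion -s ions.tpr -o solv_ions.gro -p topol.top -pname NA -nname CL -neutral
```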
Re: [gmx-users] gromacs.org_gmx-users Digest, Vol 152, Issue 93
Hi Micholas, Thanks for your consideration. As you said, I'd be lucky if I found such a reference :-) None of the articles I came across explained the reasoning behind their selection of force fields and other parameters. Azeem > Hi Azeem, > > At this point in time, it is pretty much a trial-and-error kind of method. You > may have some luck with a literature search to see if anyone has performed a > benchmark on your system, or a similar one, and that may guide you a bit. > > -Micholas > > === > Micholas Dean Smith, PhD. > Post-doctoral Research Associate > University of Tennessee/Oak Ridge National Laboratory > Center for Molecular Biophysics > > >> From: gromacs.org_gmx-users-boun...@maillist.sys.kth.se >> <gromacs.org_gmx-users-boun...@maillist.sys.kth.se> on behalf of Syed Azeem >> <syedazeemullah...@gmail.com> >> Sent: Thursday, December 22, 2016 8:51 AM >> To: gromacs.org_gmx-users >> Subject: [gmx-users] Force Fields Selection >> >> Hi all, >> >> What are the criteria for choosing a force field? >> >> Is choosing a force field for a particular protein a matter of >> trial and error? >> >> Thanks in advance >> >> Azeem
[gmx-users] Force Fields Selection
Hi all, What are the criteria for choosing a force field? Is choosing a force field for a particular protein a matter of trial and error? Thanks in advance Azeem
Re: [gmx-users] Energy minimization steps
Thank you Alex and Mark for your replies. :-) Azeem > Hi, > > How many steps does it take you to walk down a mountain to a certain > village? :-) Depends where you are on the mountain, and what's in the way. > But if all you need is to be somewhere near the valley floor to start > equilibration, anything goes! > > Mark > > On Wed, 21 Dec 2016 17:11 Alex <nedoma...@gmail.com> wrote: > >> This isn't a Gromacs-specific parameter, or, for that matter, anything >> that straightforwardly depends on the nature of the system >> (solid/fluid/protein/lipid, etc) and its size, aside from maybe trivial >> cases. The max number of minimization steps is something that limits an >> energy minimization attempt, given the minimization algorithm, >> tolerance, the energy step, and, of course, how well-behaved you expect >> your system to be. It is a reasonable trial and error guess aimed at >> computational efficiency, e.g. not exceeding a certain amount of >> computational burden for really bad structures. Alternatively, limiting >> the number of minimization steps can help not produce some kind of a >> freaky structure when you have several local minima nearby. >> >> >> Alex >> >> >> On 12/20/2016 10:51 PM, Syed Azeem wrote: >> > Hi all, >> > >> > What is the basis of inputting the maximum number of energy >> > minimization steps in GROMACS? >> > Does the maximum number of energy minimization steps depend on the number >> > of residues? >> > >> > I came across many articles wherein the authors have described the >> > number of energy minimization steps within which they have minimized >> > their protein. >> > >> > I have a protein of 596 residues to be simulated. >> > >> > Thanks in advance >> > >> > Azeem
Re: [gmx-users] Protein-Peptide Simulation help
[quoted thread truncated in this digest; the original question, lightly cleaned up, follows]
>> I am using the Acpype server to generate a topology file of a drug molecule to perform a protein-drug MD simulation. Acpype generated the .itp file and the other files necessary for MD simulation. I am running GROMACS 5.1.4 with the amber99sb force field. When I ran energy minimization, some unwanted bonds formed in the drug molecule, so the position restraint step would not execute. The problem is due to the unusual bond in the drug atom. When I used the .itp file from the PRODRG server, position restraint ran successfully (I used the gromos54a7 ff), but there is a problem with the charges on the atoms that come from PRODRG.
>> Please tell me how the charges on the atoms can be corrected in the PRODRG .itp file, and how the energy minimization problem with the Acpype .itp can be resolved.
>> Thanks in Advance, With Regards, Tasneem
[reply from Justin Lemkul <jalem...@vt.edu>, Tue, 20 Dec 2016 21:19:05 -0500, Re: [gmx-users] Protein-Peptide Simulation help, truncated]
[gmx-users] Energy minimization steps
Hi all, What is the basis for choosing the maximum number of energy minimization steps in GROMACS? Does the maximum number of energy minimization steps depend on the number of residues? I came across many articles in which the authors describe the number of energy minimization steps within which they minimized their protein. I have a protein of 596 residues to be simulated. Thanks in advance Azeem
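To make the point from the replies above concrete: nsteps in the minimization .mdp is only an upper bound, and the run stops as soon as the force tolerance emtol is satisfied, so it does not scale with residue count. A sketch of the relevant settings (values are illustrative, not recommendations for any particular system):

```
; minim.mdp -- energy minimization (illustrative values)
integrator = steep    ; steepest-descent minimizer
emtol      = 1000.0   ; converged when max force < 1000 kJ mol^-1 nm^-1
emstep     = 0.01     ; initial step size, nm
nsteps     = 50000    ; maximum number of steps, not a target
```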
[gmx-users] Protein-Peptide Simulation help
Hello All, I'm trying to simulate a Protein-Peptide complex using GROMACS 5.1.4. For reference, I followed Justin Lemkul's tutorial for protein-ligand complexes, in which the topology for the ligand (a small molecule) was generated using PRODRG. In my case, do I need to treat my peptide (16-mer) in the same way, or is directly simulating the protein-peptide complex good enough? Thanks in advance Azeem
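Since a 16-mer peptide consists of standard amino acids, pdb2gmx can generate its topology directly, alongside the protein; no PRODRG-style external tool is needed. A sketch, assuming one PDB file holding both chains (file names are placeholders):

```shell
# With chain identifiers present in the PDB, pdb2gmx writes a separate
# topology (.itp) and position-restraint file for each chain automatically
gmx pdb2gmx -f complex.pdb -o complex.gro -p topol.top
```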