Re: [gmx-users] Gromacs parallel run:: getting difference in two trajectories
Hi Carsten,

Thanks for your reply. Actually I am running an MD simulation of a protein with 270 residues (2687 atoms); after adding water the system has 45599 atoms. I am using the recent GROMACS test set available from gromacs.org (gmxtest-3.3.3.tgz).

Following are the entries from the .mdp file I am using:

**md.mdp
title            = trp_drg MD
cpp              = /lib/cpp   ; location of cpp on SGI
constraints      = all-bonds
integrator       = md
dt               = 0.002      ; ps !
nsteps           = 25000      ; total 50 ps.
nstcomm          = 1
nstxout          = 2500       ; output coordinates every 5.0 ps
nstvout          = 0
nstfout          = 0
nstlist          = 5
ns_type          = grid
rlist            = 0.9
coulombtype      = PME
rcoulomb         = 0.9
rvdw             = 1.4
fourierspacing   = 0.12
fourier_nx       = 0
fourier_ny       = 0
fourier_nz       = 0
pme_order        = 6
ewald_rtol       = 1e-5
optimize_fft     = yes
; Berendsen temperature coupling is on in three groups
Tcoupl           = berendsen
tau_t            = 0.1 0.1 0.1
tc-grps          = protein NDP sol
ref_t            = 300 300 300
; Pressure coupling is on
Pcoupl           = berendsen
pcoupltype       = isotropic
tau_p            = 0.5
compressibility  = 4.5e-5
ref_p            = 1.0
; Generate velocities is on at 300 K.
gen_vel          = yes
gen_temp         = 300.0
gen_seed         = 173529

And following are the commands I am using:

grompp_d -np 128 -f md1.mdp -c 1XU9_A_b4em.gro -p 1XU9_A.top -o 1XU9_A_md1_np128.tpr
submit mdrun_d  (arguments for mdrun_d: -s 1XU9_A_md1_np128.tpr -o 1XU9_A_md1_np128.trr -c 1XU9_A_pmd1_np128.gro -g md_np128.log -e md_np128.edr -np 128)

Following is the error I am getting:

Reading file 1XU9_A_md1_np128.tpr, VERSION 3.3.3 (double precision)
starting mdrun 'CORTICOSTEROID 11-BETA-DEHYDROGENASE, ISOZYME 1'
25000 steps, 50.0 ps.
srun: error: n141: task1: Segmentation fault
srun: Terminating job

Is this information helpful in figuring out the problem? Please advise.

With thanks,
Vivek

2008/9/11 Carsten Kutzner [EMAIL PROTECTED]:

vivek sharma wrote:
> Hi there,
> I am running the GROMACS parallel version on a cluster, with different -np options.

Hi, which version of gromacs exactly are you using?

> On analyzing the 5 ns trajectory using ngmx, I find a difference between the trajectories of two similar runs (the only thing varying between the two runs is -np, i.e. 20 vs. 64), where the .mdp file and input files are the same in both cases. I am wondering why I am getting this difference between the two trajectories. Did I do something wrong, or what may be the probable reason for this difference?

There are many reasons why a parallel run does not yield binary-identical results to a run with another number of processors, even if you start from the same tpr file. If you use PME, then FFTW could pick a slightly different algorithm (it selects the fastest for that number of processors; you can turn this feature off by passing --disable-fftw-measure to the gromacs configure script). But you can still get results that are not binary identical if you do FFTs on a varying number of CPUs. Also, because of the limited accuracy inherent to any computer, additions need not be associative, which can show up in parallel additions. Generally, if you run in double precision these effects will be far smaller, but you still won't get binary-identical results. In all cases this leads to trajectories that slowly diverge from each other. However, in the first few hundred time steps you should not see any difference in the first couple of decimals of the variables (positions, velocities, energies, ...).

> Also, I am not able to run gromacs faster by increasing -np.

Please provide the exact command line you used.

> Is there any maximum limit for scaling gromacs on a parallel cluster?

Yes, depending on your MD system and on the cluster you use :)

Carsten

___
gmx-users mailing list    gmx-users@gromacs.org
http://www.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at http://www.gromacs.org/search before posting!
Please don't post (un)subscribe requests to the list. Use the www interface or send it to [EMAIL PROTECTED]
Can't post? Read http://www.gromacs.org/mailing_lists/users.php

--
Dr. Carsten Kutzner
Max Planck Institute for Biophysical Chemistry
Theoretical and Computational Biophysics Department
Am Fassberg 11, 37077 Goettingen, Germany
Tel. +49-551-2012313, Fax: +49-551-2012302
http://www.mpibpc.mpg.de/research/dep/grubmueller/
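Carsten's point that floating-point addition is not associative is easy to demonstrate outside of GROMACS. The following Python sketch (an illustration added here, not part of the thread) sums the same four numbers in two different orders, much as two different processor counts would split a reduction differently:

```python
# Floating-point addition is not associative: summing the same terms
# in a different order (as a different processor count would) can
# change the result.
vals = [1.0e16, 1.0, -1.0e16, 1.0]

# Serial, left-to-right sum.
left_to_right = 0.0
for v in vals:
    left_to_right += v

# Pair the terms the way two "processors" might before a final reduction.
pairwise = (vals[0] + vals[2]) + (vals[1] + vals[3])

print(left_to_right)  # 1.0 (the lone 1.0 is absorbed by 1e16)
print(pairwise)       # 2.0
```

The serial sum loses one of the small terms to rounding, while the pairwise sum keeps both; neither order is "wrong", they are simply different roundings, which is exactly why two parallel runs drift apart.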
Re: [gmx-users] Gromacs parallel run:: getting difference in two trajectories
Sorry, I forgot to mention that I added one cofactor, NDP, to my molecule using the PRODRG server.

2008/9/12 vivek sharma [EMAIL PROTECTED]:
> [my previous message, with the .mdp file, commands, and segmentation-fault error, quoted in full above]
[gmx-users] question regarding genion
Hi everybody,

I want to place my counterions close to the charged amino acids. I am aware that even if the counterions are added randomly, they will eventually settle during equilibration, but I would still like to start with a structure that has counterions close to the charged residues. I have been going through the archive and the manual, but I could not figure out how to disable the default -random option when using the genion command. Can anyone guide me to the command required to add counterions based on potential, so that the counterions are close to the charged residues?

Thanks in advance,
Sarbani
Re: [gmx-users] Gromacs parallel run:: getting difference in two trajectories
Hi Vivek,

I think I'm a bit lost now. We were originally talking about differences between trajectories, but from the mail you just sent I can see that you have a segmentation fault, which is another problem. I can only suggest that if you want to make use of 128 processors, you should download the CVS version of gromacs or wait until 4.0 is out. Since in gromacs 3.3.x the protein has to reside as a whole on one of the processors, this very likely limits your scaling. Also, on 128 processors you will get a PME grid of 128x128xSomething (since nx and ny have to be divisible by the number of CPUs), which is probably far bigger than it needs to be (how big is it on a single CPU?). Together with a PME order of 6, this leads to a large overlap in the charge grid, which has to be communicated among the processors. PME order 4 is better suited to such high parallelization, but in general for Gromacs 3.x you should have at least a few thousand atoms per processor; fewer than 1000 won't give you decent scaling at all.

Carsten

vivek sharma wrote:
> [previous message, with the .mdp file, commands, and segmentation-fault error, quoted in full above]
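Carsten's grid-size argument comes down to a rounding-up constraint. The Python sketch below is an illustration added here (not from the thread): it assumes a hypothetical cubic box edge of 7.7 nm and takes the fourierspacing of 0.12 nm from the .mdp above, then rounds the grid dimension up to a multiple of the processor count, as GROMACS 3.3.x effectively requires for nx and ny. (Real grompp additionally prefers FFT-friendly grid sizes, which is ignored here.)

```python
import math

def pme_grid_dim(box_nm, spacing_nm, np_procs):
    """Smallest grid dimension that resolves the requested spacing and
    is divisible by the number of processors (a sketch of the GROMACS
    3.3.x constraint on nx and ny; FFT-friendly factorization ignored)."""
    minimum = math.ceil(box_nm / spacing_nm)
    return math.ceil(minimum / np_procs) * np_procs

box = 7.7       # nm, hypothetical cubic box edge for illustration
spacing = 0.12  # nm, fourierspacing from the .mdp above

print(pme_grid_dim(box, spacing, 1))    # 65 on a single CPU
print(pme_grid_dim(box, spacing, 128))  # 128 on 128 CPUs, roughly twice as large
```

Under these assumptions the grid dimension nearly doubles at -np 128, and with pme_order 6 each processor must also communicate a six-point-wide overlap region of that larger grid.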
Re: [gmx-users] Gromacs parallel run:: getting difference in two trajectories
Hi Carsten,

Thanks again for your reply, and my apologies for taking the question outside the original discussion. Actually I tried the same command with -np 24 and -np 64, and in both cases I got a different trajectory (when analyzing them using ngmx). Also, can you suggest a tutorial or reference where I can get details on the scalability limitations of gromacs in a parallel environment?

With thanks,
Vivek

2008/9/12 Carsten Kutzner [EMAIL PROTECTED]:
> [previous message quoted in full above]
Re: [gmx-users] Gromacs parallel run:: getting difference in two trajectories
vivek sharma wrote:
> Hi Carsten, thanks again for your reply. Actually I tried the same command with -np 24 and -np 64, and in both cases I got a different trajectory (when analyzing them using ngmx).

If you look at a plot of your data, e.g. the energies, they should slowly diverge with time (start by looking at the first few hundred time steps). This behaviour I would expect to be OK. Long-term averages should not be affected, while the variables at a given point in time will be completely uncorrelated after a while.

> Also, can you suggest a tutorial or reference with details on the scalability limitations of gromacs in a parallel environment?

There is a paper about gromacs scalability on Ethernet from which you can draw some conclusions about the 3.3.x version. For higher processor counts (np > 32), check out the new gromacs 4.0 paper:

- Speeding up parallel GROMACS on high-latency networks, J. Comput. Chem. 28 (12), 2007
- GROMACS 4: Algorithms for Highly Efficient, Load-Balanced, and Scalable Molecular Simulation, J. Chem. Theory Comput. 4 (3), 2008

Hope that helps,
Carsten

2008/9/12 Carsten Kutzner [EMAIL PROTECTED]:
> [previous message quoted in full above]
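The pattern Carsten describes, tiny differences that grow until the trajectories are completely uncorrelated, is generic to chaotic dynamics and can be illustrated without any MD at all. In this Python sketch (an illustration added here, not GROMACS code), two copies of a chaotic iteration start a rounding-error-sized distance apart, the way two parallel runs effectively do:

```python
# Two "runs" of a chaotic iteration (the logistic map at r = 4) that
# differ only by a perturbation on the order of double-precision
# rounding error -- analogous to two parallel MD runs that are not
# binary identical.
x = y = 0.3
y += 1e-15  # mimic a last-bit rounding difference between the runs

diffs = []
for step in range(80):
    x = 4.0 * x * (1.0 - x)
    y = 4.0 * y * (1.0 - y)
    diffs.append(abs(x - y))

print(max(diffs[:10]))  # early on, the runs still agree to many decimals
print(max(diffs))       # later, the difference grows toward order one
```

The early steps agree to many decimal places, just as Carsten says the first few hundred MD steps should; the late-time states are effectively independent, even though both runs sample the same long-term statistics.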
Re: [gmx-users] question regarding genion
Hi Sarbani,

I think you should use genion -random no along with the other options.

Thanks,
Vignesh

On Fri, Sep 12, 2008 at 3:30 PM, sarbani chattopadhyay [EMAIL PROTECTED] wrote:
> [question about placing counterions close to the charged residues, quoted in full above]

--
R. Vigneshwar
Graduate Student, Dept. of Chemical & Biomolecular Engg,
National University of Singapore, Singapore
Strive for Excellence, Never be satisfied with the second Best!!
The rewards of sincere resolves are highs money can never buy!
Re: [gmx-users] extract protein and counter ions from trajectory
sarbani chattopadhyay wrote:
> Hi everybody,
> I want to know whether there is any way to extract the coordinates of both the protein and the counterions from the trajectory while writing the pdb file, i.e. I don't want the water molecules, only the protein and the counter ions. I did not see that option while trying to generate the pdb files. Is there any way of doing this?

As with every Gromacs analysis (and processing) tool, you can create an index file that contains whatever combination of groups you want.

-Justin

--
Justin A. Lemkul
Graduate Research Assistant
Department of Biochemistry
Virginia Tech
Blacksburg, VA
jalemkul[at]vt.edu | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
Re: [gmx-users] extract protein and counter ions from trajectory
On Friday, 12 September 2008, sarbani chattopadhyay wrote:
> [question about extracting the protein and counterions from the trajectory, quoted above]

You can create an index file containing the union of both groups and then provide this to trjconv.

Best,
Martin
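The usual route to such a union group is interactive (make_ndx, selecting something like "Protein | Ion"), but the .ndx format itself is simple enough to write directly. A minimal Python sketch, added here for illustration with made-up atom numbers:

```python
# Write a GROMACS index (.ndx) file containing the union of two groups.
# The atom numbers below are invented for illustration; in a real system
# they would come from the topology. GROMACS index files are 1-based.
protein_atoms = list(range(1, 2688))   # e.g. a 2687-atom protein
ion_atoms = [45590, 45591, 45592]      # hypothetical counterion atoms

def write_ndx_group(fh, name, atoms, per_line=15):
    """Write one named group in .ndx format: a [ name ] header followed
    by whitespace-separated atom numbers, a few per line."""
    fh.write(f"[ {name} ]\n")
    for i in range(0, len(atoms), per_line):
        fh.write(" ".join(str(a) for a in atoms[i:i + per_line]) + "\n")

with open("protein_ions.ndx", "w") as fh:
    write_ndx_group(fh, "Protein_Ions", protein_atoms + ion_atoms)

# The resulting file can then be passed to trjconv, e.g.:
#   trjconv -f traj.trr -s topol.tpr -n protein_ions.ndx -o out.pdb
```

When trjconv is given this index file, selecting the Protein_Ions group writes out only the protein and counterion coordinates, leaving the waters behind.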
Re: [gmx-users] the FF parameter sets
Chih-Ying Lin wrote:
> Hi, I have to ask the same question again.
>
> #define ga_24  120  505  ; H - N - CH3. H, HC-6-ring, H-NT-CHn  90
>
> I am confused by the comma "," and the dash "-".

Please read my previous answer.

> Does it mean the following (the angle between H - N - CH3, right?):
> H - N - H
> H - N - (HC-6-ring)
> H - N - CHn
> (HC-6-ring) - NT - CHn
> CH3 - NT - CHn
>
> Thank you,
> Lin

--
David van der Spoel, Ph.D., Professor of Biology
Molec. Biophys. group, Dept. of Cell & Molec. Biol., Uppsala University.
Box 596, 75124 Uppsala, Sweden. Phone: +46184714205. Fax: +4618511755.
[EMAIL PROTECTED]    http://folding.bmc.uu.se
[gmx-users] system couldn't be run
Dear all,

I made a solution box of organic solvent. It is homogeneous and properly mixed, and it has been energy-minimized. Before putting the protein into this solution I minimized it as well, so I combined two systems, each of which was minimized. After putting the protein into the solution I tried first to minimize and then to run the system, but unfortunately it crashed; it did not work at all. I also tried to run the system without minimization, but that did not work either. I changed the .mdp file (e.g. removed center-of-mass motion removal, or decreased the temperature, ...), but none of that worked. It would be very nice of you to guide me on what I should do.

Thanks,
Morteza
Re: [gmx-users] to add counter ions based on potential
sarbani chattopadhyay wrote:
> Hi everybody,
> I want to add counter ions based on potential and not randomly. I would be very grateful if anyone could guide me through the genion command for this. I tried -random no, but it didn't work.

According to genion -h, you should use genion -norandom.

-Justin
Re: [gmx-users] to add counter ions based on potential
sarbani chattopadhyay wrote:
> [same question about adding counter ions based on potential, quoted above]
> I tried -random no, but it didn't work.

As far as I know, that has not been implemented yet. It is a good idea, though.

Jochen

--
Dr. Jochen Hub
Max Planck Institute for Biophysical Chemistry
Computational biomolecular dynamics group
Am Fassberg 11, D-37077 Goettingen, Germany
Email: jhub[at]gwdg.de
Tel.: +49 (0)551 201-2312
[gmx-users] the FF parameter sets
Hi,

I read your comment:

> This is a comment line to the previous one. The angle is between the three atoms mentioned. It is an index into another array in the GROMOS96 files (not part of gromacs).

But I did not understand it.

#define ga_24  120  505  ; H - N - CH3. H, HC-6-ring, H-NT-CHn  90

I am still confused by the comma "," and the dash "-". Does it mean the following (the angle between H - N - CH3, right?):

H - N - H
H - N - (HC-6-ring)
H - N - CHn
(HC-6-ring) - NT - CHn
CH3 - NT - CHn

Thank you,
Lin
Re: [gmx-users] the FF parameter sets
Chih-Ying Lin wrote:
> Does it mean the following angles (i.e., the angle between the three
> atoms, as in H - N - CH3)?
> H - N - CH3
> H - N - H
> H - N - (HC-6-ring)
> H - N - CHn
> (HC-6-ring) - NT - CHn
> CH3 - NT - CHn

The angles you have listed are correct.

-Justin

--
Justin A. Lemkul
Graduate Research Assistant
Department of Biochemistry
Virginia Tech
Blacksburg, VA
jalemkul[at]vt.edu | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
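For anyone else puzzled by these lines: the ga_NN tokens are cpp macros defined in the GROMOS-96 bonded-parameter file, and each one expands to an equilibrium angle and a force constant wherever it appears in a topology. A minimal illustration of how the macro is used follows; the atom indices and the exact column layout are placeholders, not taken from any particular file:

```
; The #define supplies the two parameter columns:
;   #define ga_24   120   505    ->  theta0 = 120 deg, k = 505 kJ/mol
;
; In a [ angles ] section the macro stands in for those two numbers
; (GROMOS-96 angles use GROMACS function type 2):
[ angles ]
;  ai    aj    ak   funct   parameters
    1     2     3     2     ga_24
```

The trailing comment on the #define line simply lists the atom-type triplets this parameter set is meant for, and the final number is the parameter's index in the original GROMOS96 tables.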
Re: [gmx-users] to add counter ions based on potential
In message [EMAIL PROTECTED] (on 12 September 2008 15:25:38 +0200), [EMAIL PROTECTED] (Jochen Hub) wrote:
> sarbani chattopadhyay wrote:
>> Hi everybody, I want to add counter ions based on the potential and not
>> randomly. I will be very grateful if anyone can guide me through the
>> command genion for this.
> As far as I know, that has not been implemented yet. It is a good idea, though.

I've written a Perl program to do this for a limited case (addition of +1 ions only, with respect to proteins and NADPH (NDP) only, based on whole charges); see http://cesario.rutgers.edu/easmith/research/ (under the Perl programs) for check.water.for.pos.ions.pl. The program takes a PDB file with waters added - see add.water.for.ions.pl for a program with an example of how to do this; editconf is mentioned in it because it is preferable to do this with a centered protein. It then tells you the best places to replace water oxygens with positive ions; you can then translate the PDB file back into GROMACS format (ideally after re-adding hydrogens with something like the Richardsons' reduce program).

-Allen

--
Allen Smith, Ph.D.  http://cesario.rutgers.edu/easmith/
February 1, 2003  Space Shuttle Columbia
Ad Astra Per Aspera  To The Stars Through Asperity
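Allen's Perl scripts are not reproduced here, but the underlying idea - rank candidate water-oxygen sites by the Coulomb potential that the solute's partial charges create there, and place cations at the most negative sites - can be sketched in a few lines. This is an illustration only: the function name and the simple unscreened-potential ranking are my assumptions, not Allen's actual code:

```python
import numpy as np

def rank_water_sites(solute_xyz, solute_q, water_o_xyz, n_ions=4):
    """Rank water-oxygen positions by the (unscreened) Coulomb potential
    from the solute's partial charges; the most negative sites are the
    best candidates for placing +1 ions.

    solute_xyz  : (N, 3) array of solute coordinates (nm)
    solute_q    : (N,) array of solute partial charges (e)
    water_o_xyz : (M, 3) array of water-oxygen coordinates (nm)
    Returns the indices of the n_ions most favorable water oxygens.
    """
    # Pairwise distances between every water oxygen and every solute atom.
    d = np.linalg.norm(water_o_xyz[:, None, :] - solute_xyz[None, :, :], axis=2)
    # Potential in units of e/nm; the constant prefactor is omitted
    # because only the ranking matters here.
    phi = (solute_q[None, :] / d).sum(axis=1)
    # Most negative potential first.
    return np.argsort(phi)[:n_ions]
```

A real workflow would also enforce a minimum distance between the chosen sites (ions placed this way otherwise tend to cluster at the same pocket) and recompute the potential after each placement.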
[gmx-users] doubt about the oplsaa topology
Hello all, I am trying to build the topology for an oligosaccharide to be used, together with a peptide, in a simulation with OPLS-AA (apparently, from the discussions of the last few days, this should be the best choice...). I had a .top from PRODRG that I previously modified and used, and I started rewriting it as the new .top for OPLS-AA. After renaming all the atoms according to ffoplsaa.atp (I will figure out later how to include the missing non-polar hydrogens), I wanted to check that all the bonds in this molecule are described in ffoplsaabon.itp (I have sulfate groups). But when I opened ffoplsaabon.itp, I noticed that the atom names do not correspond to the types described in the .atp (opls_n), but look more like the names in the other GROMACS force fields. I'm confused... can anybody explain this contradiction to me? Thanks a lot, and a great day to all.

serena

--
Serena Leone, Ph.D.
Brigham and Women's Hospital
Harvard Medical School
Channing Laboratory
EBRC 609, 221 Longwood Avenue
Boston, MA 02115
(tel) 617-732-8586

The information transmitted in this electronic communication is intended only for the person or entity to whom it is addressed and may contain confidential and/or privileged material. Any review, retransmission, dissemination or other use of or taking of any action in reliance upon this information by persons or entities other than the intended recipient is prohibited. If you received this information in error, please contact the Compliance HelpLine at 800-856-1983 and properly dispose of this information.
[gmx-users] g_rotacf -normalize
Dear users, the default for the -normalize option in g_rotacf is "yes". Does this mean that the vector of interest is always treated as a unit vector? If so, I suppose one has to use -nonormalize when interested in the internal dynamics of a system. Please let me know whether I am correct.

Ram.
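To see concretely what the option changes, here is a sketch of a P1-type rotational autocorrelation function: with normalization, each frame's vector is scaled to unit length before correlating, so C(0) = 1 and only orientation is probed; without it, the raw dot products retain the vector magnitudes. The function name and the simple loop over lags are illustrative only, not g_rotacf's actual implementation:

```python
import numpy as np

def rotacf_p1(vecs, normalize=True):
    """P1 rotational ACF  C(dt) = < v(t) . v(t+dt) >, averaged over
    time origins t.  vecs is a (T, 3) array, one vector per frame.
    """
    v = np.asarray(vecs, dtype=float)
    if normalize:
        # Scale every frame's vector to unit length: pure orientation.
        v = v / np.linalg.norm(v, axis=1, keepdims=True)
    n_frames = len(v)
    acf = np.empty(n_frames)
    for dt in range(n_frames):
        # Average the dot product over all origins separated by dt frames.
        acf[dt] = np.einsum('ij,ij->i', v[:n_frames - dt], v[dt:]).mean()
    return acf
```

For a vector of constant length L that never rotates, the normalized ACF is 1 at every lag while the unnormalized one is L^2, which is exactly the magnitude information that -nonormalize preserves.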
Re: [gmx-users] doubt about the oplsaa topology
Check out ffoplsaanb.itp to see which name (opls_XXX) corresponds to which bond type (i.e., what you are seeing in ffoplsaabon.itp).

Tom

--On Friday, September 12, 2008 17:41:48 -0400 Serena Leone [EMAIL PROTECTED] wrote:
> Hello all, I am trying to build the topology for an oligosaccharide to be
> used, together with a peptide, in a simulation with oplsaa [...]

--
TJ Piggot
[EMAIL PROTECTED]
University of Bristol, UK.
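To make Tom's pointer concrete: in ffoplsaanb.itp the second column attaches a bonded type name to each opls_XXX atom type, and ffoplsaabon.itp then lists bond, angle, and dihedral parameters keyed on those bonded names. An illustrative excerpt follows; the numbers are quoted from memory of the OPLS-AA files shipped with GROMACS and may differ slightly in your version:

```
; ffoplsaanb.itp -- the 2nd column is the bonded (bond_type) name:
;  name     bond_type  at.num  mass      charge  ptype  sigma        epsilon
   opls_135    CT        6     12.01100  -0.180    A    3.50000e-01  2.76144e-01

; ffoplsaabon.itp -- parameters are keyed on those bonded names:
[ bondtypes ]
;  i   j   func   b0        kb
   CT  CT   1     0.15290   224262.4
```

So the "unfamiliar" names in ffoplsaabon.itp (CT, OS, ...) are not a contradiction: many opls_XXX types share one bonded type, which keeps the bonded-parameter file compact.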
[tools-issues] [Issue 88888] OOo 3.0 release stoppers
To comment on the following update, log in, then open the issue: http://www.openoffice.org/issues/show_bug.cgi?id=8

This issue depends on issue 93652, which changed state:

  What       | Old value | New value
  Status     | NEW       | RESOLVED
  Resolution |           | INVALID

Please do not reply to this automatically generated notification from Issue Tracker. Please log onto the website and enter your comments. http://qa.openoffice.org/issue_handling/project_issues.html#notification

To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]