Dear GROMACS users,

I would like to calculate the standard deviation (to use as the error bar) for the dV/dlambda data in an .xvg file. I used the g_analyze command as follows:

g_analyze -f free0.9.xvg -av average_0.9

I got:

set     average         standard deviation    std. dev. / sqrt(n-1)   …
SS1     6.053822e+01    3.062230e+01          1.936724e-02
…

Which value is better to use as the error bar: the third column (standard deviation) or the fourth column (std. dev. / sqrt(n-1))?
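
To make sure I understand the two columns, here is a minimal Python sketch (not a GROMACS tool; numpy and the column layout, with time in the first column and dV/dlambda in the second, are my assumptions) of roughly how I think the two printed quantities relate:

import numpy as np

# Read the .xvg, skipping '#' comment lines and '@' xmgrace directives.
data = np.loadtxt("free0.9.xvg", comments=("#", "@"))
dvdl = data[:, 1]                    # assumed: dV/dlambda in the second column
n = len(dvdl)

std_dev = dvdl.std()                 # spread of the individual samples
std_err = std_dev / np.sqrt(n - 1)   # naive error of the mean (fourth column)

print("average             = %e" % dvdl.mean())
print("standard deviation  = %e" % std_dev)
print("std. dev./sqrt(n-1) = %e" % std_err)

My understanding is that std. dev./sqrt(n-1) assumes uncorrelated samples; since consecutive MD frames are usually correlated, it may underestimate the true error of the mean.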

I want to plot dG/dlambda versus lambda and show error bars for the free energy differences.
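
For the plot, I am thinking of something like this minimal matplotlib sketch, with the per-lambda averages and error estimates to be filled in (the empty lists below are placeholders, not real data):

import matplotlib.pyplot as plt

# Placeholders to be filled from the g_analyze output of each window.
lambdas = []   # lambda value of each simulation
dgdl    = []   # average dV/dlambda per window
errors  = []   # chosen error estimate per window

plt.errorbar(lambdas, dgdl, yerr=errors, fmt="o-", capsize=3)
plt.xlabel("lambda")
plt.ylabel("<dV/dlambda>")
plt.savefig("dgdl_vs_lambda.png")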



Thanks in advance

Afsaneh