On Mon, Oct 21, 2013 at 4:29 PM, Tom Lane <t...@sss.pgh.pa.us> wrote:
> Hm. It's been a long time since college statistics, but doesn't the
> entire concept of standard deviation depend on the assumption that the
> underlying distribution is more-or-less normal (Gaussian)?
I don't see how. The standard deviation here would be expressed in
units of milliseconds. Now, that could be misleading, in that, like a
mean average, it might "mischaracterize" the distribution. But it's
still got to be a big improvement.

I like the idea of a decay, but can't think of a principled scheme
offhand.

--
Peter Geoghegan
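
One candidate for a "principled" decay scheme would be an exponentially
weighted mean and variance, where each new execution time gets a fixed
weight and older observations fade out geometrically. The sketch below is
only an illustration of that idea, not pg_stat_statements code: the
DecayedStats struct, decayed_stats_update(), and the smoothing factor
ALPHA are made-up names, and ALPHA = 0.05 is an arbitrary choice.

    #include <stdbool.h>
    #include <stdio.h>
    #include <math.h>

    /* Weight given to each new observation; larger values forget faster. */
    #define ALPHA 0.05

    typedef struct DecayedStats
    {
        double mean;        /* decayed mean execution time, in ms */
        double var;         /* decayed variance, in ms^2 */
        bool   initialized;
    } DecayedStats;

    static void
    decayed_stats_update(DecayedStats *st, double exec_time_ms)
    {
        double delta;

        if (!st->initialized)
        {
            /* first observation: mean is just that value, no spread yet */
            st->mean = exec_time_ms;
            st->var = 0.0;
            st->initialized = true;
            return;
        }

        /* exponentially weighted update of mean and variance */
        delta = exec_time_ms - st->mean;
        st->mean += ALPHA * delta;
        st->var = (1.0 - ALPHA) * (st->var + ALPHA * delta * delta);
    }

    int
    main(void)
    {
        DecayedStats st = {0.0, 0.0, false};
        double times[] = {1.2, 1.1, 1.3, 25.0, 1.2, 1.1};
        int i;

        for (i = 0; i < 6; i++)
            decayed_stats_update(&st, times[i]);

        printf("decayed mean = %.3f ms, decayed stddev = %.3f ms\n",
               st.mean, sqrt(st.var));
        return 0;
    }

The reported "stddev" is then sqrt(var), still in milliseconds, but a
recent outlier (the 25.0 ms execution above) influences it more than one
from long ago, and its influence decays as further executions arrive.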