On 5/20/06, Tom Lane <[EMAIL PROTECTED]> wrote:
"Brendan Jurd" <[EMAIL PROTECTED]> writes:
> I noticed a peculiarity in the default postgres aggregate functions.  min(),
> max() and avg() support interval as an input type, but stddev() and
> variance() do not.

> Is there a rationale behind this, or is it just something that was never
> implemented?

Is it sensible to calculate standard deviation on intervals?  How would
you handle the multiple components?  I mean, you could certainly define
*something*, but how sane/useful would the result be?

Strictly speaking, there is nothing problematic about intervals. Standard
deviation on intervals can certainly be useful in practice, and I can give
many examples. Say you want to know the statistical parameters of a
semi-regular periodic process: the average distance in time between
maxima of some value, and the stddev of this quasi-period (why not?).
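To illustrate the kind of calculation I mean, here is a minimal sketch in Python (with made-up peak timestamps): reduce each interval between successive maxima to a single scalar in seconds, and both the average and the standard deviation of the quasi-period are then perfectly well defined.

```python
from datetime import datetime, timedelta
from statistics import mean, stdev

# Hypothetical timestamps of successive maxima of a quasi-periodic process.
peaks = [
    datetime(2006, 5, 1, 0, 0),
    datetime(2006, 5, 1, 1, 2),
    datetime(2006, 5, 1, 2, 1),
    datetime(2006, 5, 1, 3, 5),
    datetime(2006, 5, 1, 4, 0),
]

# Intervals between consecutive maxima.
gaps = [b - a for a, b in zip(peaks, peaks[1:])]

# Collapse each interval to one scalar (total seconds); avg and stddev
# are then ordinary statistics on numbers, converted back to intervals.
secs = [g.total_seconds() for g in gaps]

avg_gap = timedelta(seconds=mean(secs))
sd_gap = timedelta(seconds=stdev(secs))

print(avg_gap, sd_gap)
```

The same idea applies in SQL by extracting the epoch (seconds) from each interval before aggregating, which is how one would work around the missing stddev(interval) today.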

Regards,
Ivan Zolotukhin
