I tested a simple, first-order histogram-based entropy estimate
on various 8-bit signed waveforms (message = sample, i.e. no
inter-sample correlations analyzed). Only trivial (non-bandlimited)
waveforms were tested.

Method:
1) The signal is turned into a histogram of sample values.
2) Probabilities are estimated from the histogram frequencies.
3) The entropy rate of the resulting probability distribution is
   calculated (a code sketch follows below).
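
For reference, here's a minimal sketch of the estimator in Python (my
own reconstruction, not the original code; the normalization by 8
bits/sample is an assumption, inferred from the square-wave result
below):

import numpy as np

def entropy_estimate(samples):
    # First-order, histogram-based entropy estimate for 8-bit
    # signed samples, normalized to [0, 1] by dividing by 8 bits
    # (the normalization is my assumption, inferred from the
    # square-wave value of 0.125 reported below).
    samples = np.asarray(samples, dtype=np.int16)
    # 1) Histogram over the 256 possible sample values.
    counts = np.bincount(samples + 128, minlength=256)
    # 2) Relative frequencies as probability estimates.
    p = counts[counts > 0] / counts.sum()
    # 3) Shannon entropy in bits per sample, normalized by 8 bits.
    return -np.sum(p * np.log2(p)) / 8.0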

Detailed results:
-----------------
http://morpheus.spectralhead.com/entropy/histogram/

Observations:
-------------
White noise:
- as window length increases, estimate converges to 1 (i.e. the full
  8 bits/sample).
- as amplitude decreases, estimate converges to ~0.2.

Square wave:
- estimate is about 0.125, regardless of frequency or amplitude (a
  square wave takes only two values, so H = 1 bit, or 1/8 of the
  8-bit maximum).

Saw wave:
- as frequency increases, converges to ~0.2.
- as frequency decreases, converges to 1 (a slow saw sweeps the full
  8-bit range nearly uniformly, so its histogram is flat, like white
  noise's).

Sine wave:
- as frequency increases, converges to ~0.17.
- as frequency decreases, converges to ~0.95.
- as amplitude decreases, converges to ~0.2.

Constant signal:
- estimate is always zero.
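
To give a feel for how observations like the above can be produced,
here's a hedged test driver using the sketch from the Method section;
window length, frequencies and amplitudes are illustrative
placeholders, not the exact settings behind the linked results:

# Illustrative test signals; all parameters are placeholders.
N = 65536
rng = np.random.default_rng(0)

white  = rng.integers(-128, 128, size=N)              # full-scale white noise
square = np.where(np.arange(N) % 64 < 32, 100, -100)  # square wave
saw    = (np.arange(N) % 256) - 128                   # slow, full-range saw
sine   = np.round(100 * np.sin(2 * np.pi * np.arange(N) / 100)).astype(np.int16)
const  = np.zeros(N, dtype=np.int16)                  # constant signal

for name, sig in [("white", white), ("square", square),
                  ("saw", saw), ("sine", sine), ("const", const)]:
    print(name, round(entropy_estimate(sig), 3))

Note that the slow saw's histogram comes out as flat as the white
noise's, which is exactly why a first-order estimator can't tell them
apart.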

Conclusions:
------------
As expected, being first-order, it is overall a poor estimator. For
some corner cases (white noise, constant signal) it gives the correct
result; for periodic signals it gives an estimate with varying
amounts of error. Since it doesn't analyze correlations, it cannot
distinguish a low-frequency saw wave from white noise. It gives a
rather poor estimate for most low-frequency periodic waveforms, and
does somewhat better at high frequencies.

-P