Nonono, you don't get it, but I suppose only academics should try to
make a proper attempt at applying a universal theory; I won't respond
to this anymore. I do suggest that if you'd take your own impulses and
encode them with your own algorithms, you would find them less
interesting and far less poetic.
On 15/07/2015, Ethan Duni ethan.d...@gmail.com wrote:
Right, this is an artifact of the approximation you're doing. The model
doesn't explicitly understand periodicity, but instead only looks for
transitions, so the more transitions per second (higher frequency) the more
it has to do.
Yes. So
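A minimal sketch of such a transition-counting measure (my own
illustration, not code from the thread; "transitions" is a
hypothetical helper):

# Count transitions in a bit string. The count grows with input
# frequency even when the signal is equally predictable, which is
# exactly the artifact described above.
proc transitions {bits} {
    set t 0
    for {set i 1} {$i < [string length $bits]} {incr i} {
        if {[string index $bits $i] ne [string index $bits [expr {$i - 1}]]} {
            incr t
        }
    }
    return $t
}

% puts [transitions 1111000011110000]   ;# lower frequency: 3
% puts [transitions 1100110011001100]   ;# double the frequency: 7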
On 16/07/2015, Peter S peter.schoffhau...@gmail.com wrote:
The above is a histogram based entropy estimator. Another method of
estimating entropy is to build a predictor that tries to predict the
signal from the preceding samples. When compressing waveforms, audio
codecs typically do that.
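A hedged sketch of that predictor idea (my illustration, not any
codec's actual algorithm; "residualEntropy" is a hypothetical helper):
predict each sample as the previous one and take the entropy of the
residual histogram; the more predictable the signal, the lower the
result.

# Order-1 predictor: residual = sample minus previous sample;
# estimate entropy from the histogram of residuals.
proc residualEntropy {samples} {
    set prev [lindex $samples 0]
    array set count {}
    set n 0
    foreach x [lrange $samples 1 end] {
        incr count([expr {$x - $prev}])
        set prev $x
        incr n
    }
    set H 0.0
    foreach {r c} [array get count] {
        set p [expr {double($c) / $n}]
        set H [expr {$H - $p * log($p) / log(2)}]
    }
    return $H
}

% puts [residualEntropy {1 2 3 4 5 6 7 8}]   ;# perfectly predicted ramp: 0.0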
On 16/07/2015, Peter S peter.schoffhau...@gmail.com wrote:
Quantization, interpolation and other numerical errors
will add a slight uncertainty to your entropy estimate; in practice,
things are very rarely exact.
Another way of looking at this - for any real-world signal, there is
typically
On Mon, Jul 13, 2015 at 8:39 AM, Charles Z Henry czhe...@gmail.com wrote:
On Mon, Jul 13, 2015 at 3:28 AM, Vadim Zavalishin
vadim.zavalis...@native-instruments.de wrote:
On 10-Jul-15 19:50, Charles Z Henry wrote:
The more general conjecture for the math heads:
If u is the solution of a
On 16/07/2015, Ethan Duni ethan.d...@gmail.com wrote:
if a
signal can be described by a finite set of parameters (amplitude, phase and
frequency, say) then it immediately follows that it has zero entropy rate.
Let's imagine you want to transmit a square wave with amplitude=100,
phase=30,
his model will be baffled as soon as you send something into it that
is not harmonic. So it is only ideal in the very simple case of a
single, periodic, harmonic waveform, which is just a small subset of
arbitrary signals.
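(To spell out the rate arithmetic with assumed figures, since the
example above is cut off: if amplitude, frequency and phase are each
coded with, say, 16 bits, the complete description costs 48 bits
whether the waveform lasts a thousand samples or a billion, so the
entropy per sample is 48/N, which tends to zero as N grows.)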
I'm not suggesting using a parametric signal model as an estimator. I'm
On 16/07/2015, Ethan Duni ethan.d...@gmail.com wrote:
But, it seems that it does *not* approach zero. If you fed an arbitrarily
long periodic waveform into this estimator, you wouldn't see the estimate
approach zero as you increase the length.
False. The better estimators give an estimate that approaches zero.
This algorithm gives an entropy rate estimate approaching zero for any
periodic waveform, regardless of the shape (assuming the analysis
window is large enough).
On 17/07/2015, Ethan Duni ethan.d...@gmail.com wrote:
What are these better estimators? It seems that you have several estimators
in mind but I can't keep track of what they all are,
I urge you to slow down, collect your thoughts, and
spend a bit more time editing your posts for clarity (and
On 17/07/2015, robert bristow-johnson r...@audioimagination.com wrote:
in your model, is one sample (from the DSP semantic) the same as a
message (from the Information Theory semantic)?
A message can be anything - it can be a sample, a bit, a combination
of samples or bits, a set of parameters
On 17/07/2015, Peter S peter.schoffhau...@gmail.com wrote:
Think of it as this - if your receiver can distinguish only two
different sets of parameters, then you need to send at least *one* bit
to distinguish between them - '0' meaning square wave A, and '1'
meaning square wave B. Without
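(To generalize that example: distinguishing among M equally likely
alternatives costs at least log2(M) bits - 1 bit for two square waves,
2 bits for four, and so on - which is just the entropy of a uniform
distribution over M messages.)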
If you assume the entropy _rate_ to be the average entropy per bit
(although there are other possible definitions), then, if the total
entropy of a waveform is finite, the entropy rate approaches zero as
the number of observations approaches infinity, since
finite/infinite = 0.
This does
False. The better estimators give an estimate that approaches zero.
% set pattern [randbits [randnum 20]]; puts pattern=$pattern; for {set i 1} {$i <= 10} {incr i} {put L=$i, ; measure [repeat $pattern 1] $i}
pattern=1000110011011010
L=1, Estimated entropy per bit: 1.00
L=2, Estimated
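None of the helper procs in that session are shown in the thread. Here
is one hypothetical set of definitions (my guesses, not Peter's actual
code) that reproduces the L=1 line above; "measure" is assumed to
histogram all overlapping length-L blocks and divide the Shannon
entropy by L:

proc randnum {max} { expr {int(rand() * $max) + 1} }

proc randbits {n} {
    # Random bit string of length n.
    set s ""
    for {set i 0} {$i < $n} {incr i} {
        append s [expr {int(rand() * 2)}]
    }
    return $s
}

proc repeat {pattern n} { string repeat $pattern $n }

proc put {s} { puts -nonewline $s }

proc measure {bits L} {
    # Histogram all overlapping length-L blocks.
    array set count {}
    set total 0
    for {set i 0} {$i + $L <= [string length $bits]} {incr i} {
        incr count([string range $bits $i [expr {$i + $L - 1}]])
        incr total
    }
    # Shannon entropy of the block distribution, scaled to bits per bit.
    set H 0.0
    foreach {block c} [array get count] {
        set p [expr {double($c) / $total}]
        set H [expr {$H - $p * log($p) / log(2)}]
    }
    puts [format "Estimated entropy per bit: %.2f" [expr {$H / $L}]]
}

For the 16-bit pattern shown (eight ones, eight zeros), L=1 gives
-2*(0.5*log2(0.5)) = 1.00 bit per bit, matching the first output line.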