Re: [music-dsp] about entropy encoding

2015-07-16 Thread Theo Verelst
Nonono, you don't get it, but I suppose only academics should try to do a proper universal theory application attempt; I won't respond to this anymore. I do suggest that if you'd take your own impulses and encode them with your own algorithms, you would find less interesting and far less poetic

Re: [music-dsp] about entropy encoding

2015-07-16 Thread Peter S
On 15/07/2015, Ethan Duni ethan.d...@gmail.com wrote: Right, this is an artifact of the approximation you're doing. The model doesn't explicitly understand periodicity, but instead only looks for transitions, so the more transitions per second (higher frequency) the more it has to do. Yes. So
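A minimal sketch of a transition-counting estimate of this kind (an assumed reconstruction in Tcl, not the code actually used in the thread; the binary entropy of the transition probability stands in for the estimator under discussion):

    # Minimal sketch (assumed, not the poster's code): estimate bits per sample
    # from the fraction of adjacent transitions, via the binary entropy function
    # H(p) = -p*log2(p) - (1-p)*log2(1-p).
    proc transition_entropy {bits} {
        set n [llength $bits]
        if {$n < 2} { return 0.0 }
        set transitions 0
        for {set i 1} {$i < $n} {incr i} {
            if {[lindex $bits $i] != [lindex $bits [expr {$i - 1}]]} {
                incr transitions
            }
        }
        set p [expr {double($transitions) / ($n - 1)}]
        if {$p == 0.0 || $p == 1.0} { return 0.0 }
        return [expr {-$p*log($p)/log(2) - (1.0-$p)*log(1.0-$p)/log(2)}]
    }
    puts [transition_entropy {0 0 0 0 1 1 1 1}]   ;# 1 transition in 7 -> low estimate
    puts [transition_entropy {0 1 1 0 0 0 1 0}]   ;# 4 transitions in 7 -> higher estimate

This shows the artifact being described: more transitions per unit time (up to p = 0.5) push the estimate up, even though the underlying signal may be just as predictable.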

Re: [music-dsp] about entropy encoding

2015-07-16 Thread Peter S
On 16/07/2015, Peter S peter.schoffhau...@gmail.com wrote: The above is a histogram based entropy estimator. Another method of estimating entropy is to build a predictor that tries to predict the signal from the preceding samples. When compressing waveforms, audio codecs typically do that.
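A minimal sketch of the histogram-based approach mentioned here (an assumed illustration, not the thread's actual code): count symbol occurrences, then sum -p*log2(p) over the histogram.

    # Histogram-based entropy estimate in bits per symbol (assumed sketch).
    proc histogram_entropy {symbols} {
        set n [llength $symbols]
        set counts [dict create]
        foreach s $symbols { dict incr counts $s }
        set H 0.0
        dict for {sym c} $counts {
            set p [expr {double($c) / $n}]
            set H [expr {$H - $p * log($p) / log(2)}]
        }
        return $H
    }
    puts [histogram_entropy {0 0 1 1 2 2 3 3}]   ;# 4 equiprobable symbols -> 2.0 bits

A predictor-based estimator, by contrast, would model each sample from the preceding ones and measure the entropy of the prediction error instead of the raw samples.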

Re: [music-dsp] about entropy encoding

2015-07-16 Thread Peter S
On 16/07/2015, Theo Verelst theo...@theover.org wrote: Nonono, you don't get it, but I suppose only academics should try to do a proper universal theory application attempt; I won't respond to this anymore. I do suggest that if you'd take your own impulses and encode them with your own

Re: [music-dsp] about entropy encoding

2015-07-16 Thread Peter S
On 16/07/2015, Peter S peter.schoffhau...@gmail.com wrote: Quantization, interpolation and other numerical errors will add a slight uncertainty to your entropy estimate; in practice, things are very rarely exact. Another way of looking at this - for any real-world signal, there is typically

Re: [music-dsp] Sampling theorem extension

2015-07-16 Thread Charles Z Henry
On Mon, Jul 13, 2015 at 8:39 AM, Charles Z Henry czhe...@gmail.com wrote: On Mon, Jul 13, 2015 at 3:28 AM, Vadim Zavalishin vadim.zavalis...@native-instruments.de wrote: On 10-Jul-15 19:50, Charles Z Henry wrote: The more general conjecture for the math heads: If u is the solution of a

Re: [music-dsp] about entropy encoding

2015-07-16 Thread Peter S
On 16/07/2015, Ethan Duni ethan.d...@gmail.com wrote: if a signal can be described by a finite set of parameters (amplitude, phase and frequency, say) then it immediately follows that it has zero entropy rate. Let's imagine you want to transmit a square wave with amplitude=100, phase=30,
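A toy illustration of why such a description is cheap (the frequency value and the packing format are assumed for the example): the parameter set is a fixed-size message no matter how many samples of the waveform it stands for, so the cost per sample shrinks as the duration grows.

    # Toy illustration (assumed values/format): a square wave described by
    # amplitude, phase and frequency is a fixed-size message, independent of
    # how many samples of the waveform it describes.
    set params {100.0 30.0 440.0}          ;# amplitude, phase (deg), frequency (Hz)
    set msg [binary format f* $params]     ;# pack as three 32-bit floats
    puts "parametric message: [string length $msg] bytes, regardless of duration"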

Re: [music-dsp] about entropy encoding

2015-07-16 Thread Ethan Duni
his model will be baffled as soon as you send something into it that is not harmonic. So it is only ideal in the very simple case of a single, periodic, harmonic waveform, which is just a small subset of arbitrary signals. I'm not suggesting using a parametric signal model as an estimator. I'm

Re: [music-dsp] about entropy encoding

2015-07-16 Thread Peter S
On 16/07/2015, Ethan Duni ethan.d...@gmail.com wrote: But, it seems that it does *not* approach zero. If you fed an arbitrarily long periodic waveform into this estimator, you wouldn't see the estimate approaching zero as you increase the length. False. The better estimators give an estimate that

Re: [music-dsp] about entropy encoding

2015-07-16 Thread Ethan Duni
This algorithm gives an entropy rate estimate approaching zero for any periodic waveform, regardless of the shape (assuming the analysis window is large enough). But, it seems that it does *not* approach zero. If you fed an arbitrarily long periodic waveform into this estimator, you wouldn't see

Re: [music-dsp] about entropy encoding

2015-07-16 Thread Peter S
On 17/07/2015, Ethan Duni ethan.d...@gmail.com wrote: What are these better estimators? It seems that you have several estimators in mind but I can't keep track of what they all are. I urge you to slow down, collect your thoughts, and spend a bit more time editing your posts for clarity (and

Re: [music-dsp] about entropy encoding

2015-07-16 Thread Peter S
On 17/07/2015, robert bristow-johnson r...@audioimagination.com wrote: in your model, is one sample (from the DSP semantic) the same as a message (from the Information Theory semantic)? A message can be anything - it can be a sample, a bit, a combination of samples or bits, a set of parameters

Re: [music-dsp] about entropy encoding

2015-07-16 Thread robert bristow-johnson
On 7/17/15 12:08 AM, Peter S wrote: On 17/07/2015, Peter S peter.schoffhau...@gmail.com wrote: Think of it like this - if your receiver can distinguish only two different sets of parameters, then you need to send at least *one* bit to distinguish between them - '0' meaning square wave A, and '1'

Re: [music-dsp] about entropy encoding

2015-07-16 Thread Peter S
On 17/07/2015, Peter S peter.schoffhau...@gmail.com wrote: Think of it like this - if your receiver can distinguish only two different sets of parameters, then you need to send at least *one* bit to distinguish between them - '0' meaning square wave A, and '1' meaning square wave B. Without
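In symbols, assuming the two square waves are equally likely, the standard Shannon entropy gives exactly that one bit:

    H = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = \log_2 2 = 1 \text{ bit}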

Re: [music-dsp] about entropy encoding

2015-07-16 Thread Peter S
If you assume the entropy _rate_ to be the average entropy per bit (although there are other possible definitions), then, if the total entropy of a waveform is finite, the entropy rate will approach zero as the number of observations approaches infinity, since finite/infinite = 0. This does
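Spelled out as a limit (writing H_total for the finite total entropy and N for the number of observations):

    H_{\text{rate}} = \lim_{N \to \infty} \frac{H_{\text{total}}}{N} = 0 \quad \text{when } H_{\text{total}} < \infty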

Re: [music-dsp] about entropy encoding

2015-07-16 Thread Ethan Duni
False. The better estimators give an estimate that approaches zero. % set pattern [randbits [randnum 20]]; puts pattern=$pattern; for {set i 1} {$i <= 10} {incr i} {put L=$i, ; measure [repeat $pattern 1] $i} pattern=1000110011011010 L=1, Estimated entropy per bit: 1.00 L=2, Estimated
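The randbits/repeat/measure procs themselves aren't shown; a minimal sketch of the kind of block-histogram, entropy-per-bit measurement the loop appears to run (an assumed reconstruction, not the original code):

    # Estimate entropy per bit from a histogram of non-overlapping length-L blocks
    # of a bit string (assumed sketch of what a "measure ... L" step might do).
    proc block_entropy_per_bit {bitstring L} {
        set counts [dict create]
        set nblocks 0
        for {set i 0} {$i + $L <= [string length $bitstring]} {incr i $L} {
            dict incr counts [string range $bitstring $i [expr {$i + $L - 1}]]
            incr nblocks
        }
        set H 0.0
        dict for {block c} $counts {
            set p [expr {double($c) / $nblocks}]
            set H [expr {$H - $p * log($p) / log(2)}]
        }
        return [expr {$H / $L}]   ;# estimated bits of entropy per bit
    }
    # A short pattern repeated many times: the per-bit estimate drops as L grows.
    set data [string repeat 1000110011011010 100]
    foreach L {1 2 4 8 16} { puts "L=$L: [block_entropy_per_bit $data $L]" }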