>Let's assume I have a sinusoidal signal.
>Let's assume I amplify it to 10x.
>Where does new entropy come from?

It comes from the amplification.

>Look carefully - I'm not speaking about creating _another_ sine wave
>with 10x volume. No.
>I'm saying that I amplify the _original_ sine wave by 10x

Those kinds of philosophical distinctions do not have any bearing on
entropy.

>So you have a big, loud, _quantized_ sine wave. Not a smooth one like that
would
>come if you synthesized _another_, new sine wave with 10x volume, but
>rather the original sine wave, amplified to 10x.

So you aren't talking about literal "sine waves" then; you're talking about
finite-word-length approximations to sine waves. For an actual sine wave,
there is no material distinction between amplifying it by 10x and
generating a "new" one at 10x amplitude. They're indistinguishable.
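
A trivial numpy check of that point (the frequency, sample rate and length
are arbitrary choices of mine, just for illustration):

    import numpy as np

    t = np.arange(48000) / 48000.0
    original  = np.sin(2 * np.pi * 100 * t)         # unit-amplitude sine
    amplified = 10.0 * original                     # the original, gained 10x
    generated = 10.0 * np.sin(2 * np.pi * 100 * t)  # a "new" sine at 10x

    print(np.allclose(amplified, generated))        # True -- same waveform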

In the case of finite-word-length approximations of sine waves, it is true
that amplification doesn't increase entropy. This is because the
amplification also scales the quantization noise floor by the same
factor as the signal, so the final SNR (which is what determines the
entropy) stays the same.

But that's a consequence of finite-word-length effects (an instance of the
data processing inequality: deterministic processing of the quantized signal
can't add information), and not of the amplification of sinusoids as such.
If you'd simply sampled a sinusoid with higher amplitude instead, you'd
have ended up with a higher SNR and so a higher entropy.
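
Here's a rough numpy sketch of that comparison. The step size, frequency
and the plain rounding quantizer are arbitrary choices on my part, just to
illustrate the mechanism:

    import numpy as np

    fs, f = 48000, 997.0
    t = np.arange(fs) / fs
    x = np.sin(2 * np.pi * f * t)    # "ideal" reference sine
    q = 2.0 ** -7                    # quantizer step, sized for a unit signal

    def quantize(sig, step):
        return step * np.round(sig / step)

    def snr_db(ref, approx):
        err = approx - ref
        return 10 * np.log10(np.mean(ref ** 2) / np.mean(err ** 2))

    # Path A: quantize first, then apply the 10x gain in the digital domain.
    a = 10.0 * quantize(x, q)
    print("digital 10x gain:    %.1f dB" % snr_db(10.0 * x, a))

    # Path B: quantize a sinusoid that is already 10x larger, same step.
    b = quantize(10.0 * x, q)
    print("requantized at 10x:  %.1f dB" % snr_db(10.0 * x, b))

Path A lands at the same SNR as the unamplified quantization (both signal
and error got scaled by 10), while path B should come out roughly 20 dB
higher.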

We need to be careful to distinguish between entropy of continuous-valued
signals and manipulations thereof (differential entropy is relevant here),
and entropy of quantized signals (and manipulations thereof). It is indeed
true that you can't increase the entropy of a digital signal by performing
deterministic digital processing on it. But that is not true of
continuous-valued signals and operations on them: amplifying a
continuous-valued signal by a factor a shifts its differential entropy by
log2|a|, so a 10x gain adds about 3.3 bits. So we need to be careful to
specify which kind of entropy we are talking about at each point, and how
the two relate to one another when sampling/reconstructing.
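
A quick numeric check of that log2|a| shift, using a Gaussian purely
because it has a closed form (my choice, nothing special about Gaussians
here):

    import numpy as np

    def gaussian_diff_entropy_bits(sigma):
        # h(X) = 0.5 * log2(2*pi*e*sigma^2) bits, for X ~ N(0, sigma^2)
        return 0.5 * np.log2(2 * np.pi * np.e * sigma ** 2)

    h1  = gaussian_diff_entropy_bits(1.0)
    h10 = gaussian_diff_entropy_bits(10.0)   # the same signal, gained by 10x
    print(h1, h10, h10 - h1, np.log2(10))    # difference is log2(10) ~ 3.32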

>Correction: 1 bit PER SAMPLE (either 1 or 0, hi or low - a naive
>square only has twose two states...)

The quantity of (asymptotic) entropy per sample is called the entropy rate.
The entropy rate of a deterministic periodic signal, like a square wave or
sine wave, is actually *zero*. Intuitively, the entropy rate corresponds to
the amount of "surprise" in a given sample, after having observed many
previous samples. A deterministic, periodic signal quickly stops being
"surprising" after observing a few periods, so the entropy rate goes to
zero. Conversely, the entropy rate of iid noise signals equals the entropy
of any individual sample - since there is no dependence on previous
samples, there is a constant amount of "surprise" in each sample,
corresponding to the entropy of the distribution the noise is drawn from.
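
A crude way to see this empirically: estimate H(x[n] | previous k samples)
by counting, for a 1-bit square wave versus iid coin flips. The period,
signal lengths and context depths below are arbitrary choices for
illustration:

    import numpy as np
    from collections import Counter

    def cond_entropy_bits(x, k):
        # empirical H(x[n] | x[n-k..n-1]) from (context, next-sample) counts
        n = len(x) - k
        joint = Counter((tuple(x[i - k:i]), x[i]) for i in range(k, len(x)))
        ctx   = Counter(tuple(x[i - k:i]) for i in range(k, len(x)))
        return -sum((m / n) * np.log2(m / ctx[c])
                    for (c, _), m in joint.items())

    rng    = np.random.default_rng(0)
    square = np.tile([1] * 4 + [0] * 4, 1000)   # 1-bit square wave, period 8
    noise  = rng.integers(0, 2, size=8000)      # iid coin flips

    for k in (1, 2, 4, 8):
        print(k, cond_entropy_bits(square, k), cond_entropy_bits(noise, k))

For the square wave the estimate falls to zero once the context is long
enough to pin down the phase (k >= 4 here, with a period of 8), while for
the coin flips it stays at about 1 bit per sample no matter how much
context you condition on.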

E



On Thu, Oct 9, 2014 at 10:00 AM, Peter S <peter.schoffhau...@gmail.com>
wrote:

> On 09/10/2014, Ethan Duni <ethan.d...@gmail.com> wrote:
> > You need way more than 1 bit to represent any square wave
>
> Correction: 1 bit PER SAMPLE (either 1 or 0, hi or low - a naive
> square only has twose two states...)
> (I thought that was trivial that I meant that)
--
dupswapdrop -- the music-dsp mailing list and website:
subscription info, FAQ, source code archive, list archive, book reviews, dsp links
http://music.columbia.edu/cmc/music-dsp
http://music.columbia.edu/mailman/listinfo/music-dsp
