chill;194599 Wrote: 
> 
> In case I'm giving the wrong impression here, I'm really not being
> sarcastic.  I've often wondered how improvements to the transmission of
> the bits changes things, since it's my impression that the receiving end
> simply has to decide whether it's received a 'one' or a 'nought'.  If
> the waveform is so bad that bits are misinterpreted then I can see how
> the sound would be affected, but how bad does it have to be, and in
> what way, before the 'ones' and 'noughts' are not properly interpreted?
> Does the jitter in the bit stream mean that bit boundaries are
> misaligned by one whole bit for instance?  Does a mis-shapen leading
> edge cause the receiving end to interpret a 'one' as a 'nought'?
> 

You have to remember that in S/PDIF, the DAC's clock is reconstructed
from the arrival times of the bits.  So if there are variations
(jitter) in when the rising edges of the signal arrive, the analogue
output will be distorted even if there are zero actual bit errors -
the right values get converted at slightly the wrong times.
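You can see this with a toy simulation.  Here's a minimal sketch: a DAC
converts perfectly correct sample values of a sine wave, but each
conversion instant is nudged by random clock jitter.  The 1 kHz tone and
2 ns RMS jitter are illustrative numbers I picked, not measurements of
any real device.

```python
# Sketch: jitter distorts the analogue output even with zero bit errors.
# Assumed parameters (illustrative only): 44.1 kHz sample rate, 1 kHz
# test tone, 2 ns RMS Gaussian timing jitter on the recovered clock.
import math
import random

random.seed(0)
fs = 44_100.0          # sample rate, Hz
f = 1_000.0            # test tone frequency, Hz
jitter_rms = 2e-9      # 2 ns RMS clock jitter

n = 4096
# Ideal DAC: converts each sample exactly on the nominal clock tick.
ideal = [math.sin(2 * math.pi * f * k / fs) for k in range(n)]
# Jittered DAC: every bit arrives correctly, so the sample *values* are
# right, but each conversion instant is displaced by a random amount.
jittered = [math.sin(2 * math.pi * f * (k / fs + random.gauss(0.0, jitter_rms)))
            for k in range(n)]

# The difference is an error signal that exists despite zero bit errors.
err = [a - b for a, b in zip(ideal, jittered)]
rms_err = math.sqrt(sum(e * e for e in err) / n)
print(f"RMS output error with zero bit errors: {rms_err:.2e}")
```

For a sine wave the error is roughly 2*pi*f*jitter times the signal
slope, so it grows with both signal frequency and jitter amplitude -
tiny here, but the point is that it's nonzero with a perfect bitstream.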

That said, wouldn't even a very sharp spectral feature at 2.4 GHz alias
down to an extremely broad spectrum at audio frequencies?  So it's hard
to see how such high-frequency noise could do anything other than
contribute slightly to the noise floor.  Am I missing something there?


-- 
opaqueice
------------------------------------------------------------------------
opaqueice's Profile: http://forums.slimdevices.com/member.php?userid=4234
View this thread: http://forums.slimdevices.com/showthread.php?t=34406
