What a lovely thread - it has passion, enthusiasm and complete BS all
rolled into one :) Let me try to clear some of it up; apologies to
those of you for whom this is all basic stuff, but I've yet to see a
good explanation of signal integrity in an audiophile forum, and
there's an awful lot of vagueness and pedantry around with remarkably
little science to back it up.

With any digital data link, there are always two aspects to consider:
the data itself, and the associated clock. Let's consider them
separately for a moment.

The data is the easy bit. It's just ones and zeroes, and as long as
there isn't so much noise on the wire that they actually get
misinterpreted, it's easy to reliably recover -exactly- what was
transmitted. It doesn't matter whether they originally came from a CD,
or a file on a hard disc, or over an Ethernet or wireless connection.
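
If you want to see just how robust this is, here's a tiny Python
sketch (purely illustrative - the voltage levels and noise figures are
made up) showing that noisy samples still decode to exactly the bits
that were sent, as long as the noise stays below the decision
threshold:

    import random

    bits = [1, 0, 1, 1, 0, 0, 1, 0]      # bits the source transmits
    levels = [3.3 * b for b in bits]     # idealised voltage levels

    # Add noise - well below the 1.65V decision threshold
    noisy = [v + random.uniform(-0.5, 0.5) for v in levels]

    # The receiver just compares each sample against the threshold
    received = [1 if v > 1.65 else 0 for v in noisy]

    assert received == bits   # exact recovery, despite the noise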

Then there's the clock, which is more complicated. In some
communications systems, the clock is carried on a separate wire, and
the receiver samples the data whenever the clock changes from low to
high or from high to low. If all you're doing is storing the data in
memory or forwarding it on to another device, that's all there is to
it. As long as the clock transitions line up with the data bits, the
link works. Zero degradation.
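
In pseudo-Python, a separate-clock receiver is about this simple
(made-up waveforms, one entry per time step):

    clock = [0, 1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0]
    data  = [1, 1, 1, 0, 0, 0, 0, 1, 1, 1, 1, 0]

    received = []
    for t in range(1, len(clock)):
        if clock[t-1] == 0 and clock[t] == 1:   # rising edge of the clock
            received.append(data[t])

    print(received)   # [1, 0, 1] - one bit captured per clock edge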

In many modern systems (Ethernet, USB, S/PDIF), no separate clock wire
is used. Instead the clock is recovered by looking at the timing of
transitions in the data. In the case of something like Ethernet, where
all you care about is getting the data from A to B reliably, this also
works fine.
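
Here's a crude illustration of the idea - real receivers do this with
analogue circuitry, but the principle of timing the transitions looks
like this (the edge timestamps are invented):

    # Times (microseconds) at which the receiver saw the signal change
    edges = [0.0, 10.1, 19.9, 40.2, 50.0, 69.8, 80.1]

    # Gaps between transitions are whole multiples of one bit period
    gaps = [b - a for a, b in zip(edges, edges[1:])]
    unit = min(gaps)                              # shortest gap ~ 1 bit
    period = sum(g / round(g / unit) for g in gaps) / len(gaps)

    print(round(period, 2))   # ~10.01us - the recovered bit clock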

The problem comes when you have to start caring about not just getting
the bits from A to B, but also about exactly when they arrive at their
destination. This is the problem with S/PDIF - you need to play the
music at the same rate it comes in.

A CD is sampled at 44.1kHz. But no oscillator in the world (and
certainly no oscillator in your hi-fi) ticks at precisely that speed.
Standard tolerance on a quartz crystal is +/-50 parts per million,
which is no problem in itself - you can't hear the difference if your
CDs are played back at 44.102205kHz instead.
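
The arithmetic, for the curious:

    nominal = 44100.0                  # Hz
    tolerance = 50e-6                  # +/-50 parts per million
    print(nominal * (1 + tolerance))   # 44102.205 Hz - 2.2Hz fast, worst case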

However, say your DAC were running 50ppm fast and your CD transport (or
whatever) were 50ppm slow. The DAC would run out of samples to play
about 4.4 times per second (that's 100ppm of 44,100), and you might
hear each underrun as a click. Not very hi-fi, I'm sure you'd agree.
So, there has to be a mechanism to ensure that the clock in the DAC
runs at precisely the same speed as the one in the source component.
And because S/PDIF is unidirectional - it only provides a path from
source to DAC and not the other way round - the DAC has no choice but
to slave its clock to the source.
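
Again, the sums, assuming the worst-case 50ppm each way:

    nominal = 44100.0
    dac    = nominal * (1 + 50e-6)   # DAC consumes samples slightly fast
    source = nominal * (1 - 50e-6)   # source delivers them slightly slow
    print(dac - source)              # ~4.41 missing samples per second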

When music is digitised, samples are taken at precisely determined
intervals by very expensive studio equipment, and so to reproduce the
original signal as accurately as possible, the output from the DAC has
to be updated equally precisely, so that the time interval between
successive samples is the same as it was originally. Variation in this
period between samples is what we all know and love as jitter.
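
If you had a list of the actual times at which a DAC updated,
measuring that jitter would look something like this (the timestamps
are hypothetical, not measured from anything):

    import statistics

    # Times (microseconds) at which the DAC actually updated
    times = [0.000, 22.679, 45.356, 68.040, 90.706, 113.383]

    periods = [b - a for a, b in zip(times, times[1:])]
    # Ideal spacing at 44.1kHz is 1e6/44100 = ~22.676us

    print(statistics.pstdev(periods))    # RMS jitter, microseconds
    print(max(periods) - min(periods))   # peak-to-peak jitter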

-The only place this jitter matters is at the DAC chip itself.-
In a device like a Squeezebox, big bursts of data are sent over the
network into a buffer memory, then it's broken into smaller packets and
stored in a FIFO (first in, first out) buffer by the CPU, and finally
clocked into the DAC a bit at a time at regular intervals. It's only at
the point where the last bit is clocked in and the DAC updates that
jitter makes any difference whatsoever. If an external DAC is in use,
it's only the clock pulse that causes that DAC chip to update that
matters. Jitter elsewhere is basically a non-issue.
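
Schematically, it works like this (a toy model, not Squeezebox
firmware):

    from collections import deque

    fifo = deque()

    def network_burst(samples):
        # Data arrives in big, irregularly-timed bursts - timing here
        # is irrelevant as long as the FIFO never runs dry
        fifo.extend(samples)

    def dac_tick():
        # Called once per sample period by the DAC's own clock.
        # Only the timing of THIS call contributes to audible jitter.
        if fifo:
            return fifo.popleft()
        return 0   # underrun - this is what you'd hear as a click

    network_burst([10, 20, 30, 40])
    print([dac_tick() for _ in range(4)])   # [10, 20, 30, 40]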

What does this have to do with S/PDIF?

This is all down to implementation. For the reasons explained above, a
circuit in the DAC has to recover the clock from the S/PDIF signal and,
from this, generate a clock to the DAC which is synchronised and yet has
the least amount of jitter possible. Typically this is done with a
circuit called a Phase-Locked Loop or PLL, and although they're very
good at rejecting jitter, they're not perfect. The more that's fed in,
the more comes out.
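
You can model that "more in, more out" behaviour with a first-order
loop, which acts as a low-pass filter on the incoming timing error.
A toy simulation (the loop gain and jitter figures are invented):

    import random

    # Timing error of each incoming S/PDIF edge, in nanoseconds
    jitter_in = [random.gauss(0, 5.0) for _ in range(10000)]

    # The PLL's output phase only creeps a small fraction of the way
    # toward each incoming edge - that's the low-pass action
    gain = 0.01
    phase, jitter_out = 0.0, []
    for e in jitter_in:
        phase += gain * (e - phase)
        jitter_out.append(phase)

    def rms(xs):
        return (sum(x * x for x in xs) / len(xs)) ** 0.5

    print(rms(jitter_in), rms(jitter_out))   # out is much smaller - but not zero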

So, jitter on the S/PDIF link can lead to jitter at the DAC input,
which in turn can affect the sound. That's why not all S/PDIF links
are equal :)

The ideal is to have the master clock located in the DAC, not the
source. Then you can have a high quality, stable oscillator right by
the DAC chip itself, where it matters. But S/PDIF doesn't allow for
this, because there's no way for the DAC to control the rate at which
data is transferred. Bidirectional links like Ethernet, USB and
Firewire get around this problem. (I have a USB-connected headphone amp
at work with its own built-in DAC. It sounds wonderful!)
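
With a bidirectional link the data flow turns inside out: the DAC end
pulls data on its own schedule, so its local oscillator sets the pace.
A sketch of the idea (not how any particular USB device is
implemented):

    from collections import deque

    buffer = deque()
    source = iter(range(1_000_000))   # stands in for the file or server

    def dac_tick():
        # Runs off the DAC's own stable local oscillator
        if len(buffer) < 100:                                # running low?
            buffer.extend(next(source) for _ in range(400))  # ask for more
        return buffer.popleft()

    print([dac_tick() for _ in range(5)])   # [0, 1, 2, 3, 4]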

Optical vs coax? Both can give rise to unwanted jitter. With an optical
cable, the signal from the phototransistor in the receiver (which is
what matters) is fairly small and its rise/fall time isn't
instantaneous - so there's uncertainty as to exactly when the
transitions between 1 and 0 have occurred. On the other hand, coax can
carry sharper edges, which are easier to time precisely. But it can also
pick up RF noise, which adds uncertainty back in, and there are a whole
host of transmission line effects which I won't go into now.
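
As a rough rule of thumb, timing uncertainty is amplitude noise
divided by the slew rate of the edge. Plugging in some invented but
plausible numbers:

    swing     = 0.5      # volts - received signal swing
    rise_time = 15e-9    # seconds - 10-90% rise time of a slowish edge
    noise     = 0.01     # volts RMS riding on the signal

    slew = 0.8 * swing / rise_time   # volts/second through the threshold
    print(noise / slew)              # ~0.4ns of RMS timing jitter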

Hope that helps a bit :)

Andy

