Phil Leigh;549300 Wrote:
> It is simply NOT possible to emulate jitter in DSP, since jitter is a
> phenomenon that occurs in time intervals that are less than one sample
> duration, and there is no way in DSP software to manipulate such
> things, since the basic "unit of work" for DSP is a "sample".

While this is true, I think I can see what bhaagensen is getting at.
When jitter disturbs the timing of the samples, you have the correct samples at the wrong time. After post-DAC anti-alias filtering you end up with an analogue signal, and that could represent either the right samples at the wrong time (which is what actually happened) or the wrong samples at the right time.

Now suppose we have a hypothetically perfect system with no jitter. In principle, it might be possible to amend the samples with DSP so that you have the wrong samples at the right time which, after post-DAC filtering, yield an analogue signal that is the same as it would have been had there been jitter and you had the right samples at the wrong time.

Any disturbance of the final waveform is a form of distortion and/or noise, so a given level of jitter will correspond to a certain level of THD+noise, and I think it should be possible to use DSP to add such THD+noise. The question remains: what level of THD+noise is the equivalent of X ps of jitter? I don't know the answer, but my gut feeling is that a reasonably low jitter figure (say, 100ps) is probably equivalent to an extremely small amount of THD+noise - the sort of figure that any state-of-the-art analogue device would be proud of.

-- 
cliveb
Transporter -> ATC SCM100A
------------------------------------------------------------------------
cliveb's Profile: http://forums.slimdevices.com/member.php?userid=348
View this thread: http://forums.slimdevices.com/showthread.php?t=78790

_______________________________________________
audiophiles mailing list
audiophiles@lists.slimdevices.com
http://lists.slimdevices.com/mailman/listinfo/audiophiles
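[Editor's note: to put a rough number on the "100 ps" question above, here is a back-of-the-envelope simulation. It is my own sketch, not from the thread; the function name and all parameter values are made up for illustration. It compares a full-scale sine sampled at jittered clock instants against the same sine sampled perfectly, and checks the result against the standard small-error approximation SNR ≈ -20·log10(2π·f·σ_t).]

```python
# Sketch: estimate the noise floor produced by sampling-clock jitter by
# comparing a sine sampled at Gaussian-jittered instants with the same
# sine sampled at the ideal instants. Pure stdlib; names are illustrative.
import math
import random

def jitter_noise_db(f_signal=20_000.0, fs=44_100.0, jitter_rms=100e-12,
                    n=100_000, seed=1):
    """Simulated signal-to-error ratio (dB) for a full-scale sine of
    frequency f_signal sampled with Gaussian clock jitter of the given
    RMS value (in seconds)."""
    rng = random.Random(seed)
    sig_sq = err_sq = 0.0
    for k in range(n):
        t = k / fs
        ideal = math.sin(2 * math.pi * f_signal * t)           # right sample, right time
        jittered = math.sin(2 * math.pi * f_signal *
                            (t + rng.gauss(0.0, jitter_rms)))  # right sample, wrong time
        sig_sq += ideal ** 2
        err_sq += (jittered - ideal) ** 2
    return 10 * math.log10(sig_sq / err_sq)

# Small-error theory: error RMS ~= 2*pi*f*sigma_t relative to the signal,
# so SNR ~= -20*log10(2*pi*f*sigma_t). For 100 ps at 20 kHz:
theory_db = -20 * math.log10(2 * math.pi * 20_000 * 100e-12)
print(round(theory_db, 1))          # ~98 dB
print(round(jitter_noise_db(), 1))  # simulation lands very close to theory
```

On these assumptions, 100 ps of RMS jitter on a worst-case 20 kHz full-scale sine puts the error floor near -98 dB, and lower frequencies fare proportionally better, which is consistent with the "extremely small amount of THD+noise" intuition above.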