darrenyeats;427085 Wrote:
> If we take the Benchmark as an example, they publish specs showing
> that output distortion doesn't change as input jitter is increased.
> The take-home point is that the measurements (whatever you think of
> them) don't change when the input jitter changes.
>
> http://www.benchmarkmedia.com/system1/files/documents/DAC1.pdf
>
> Either they are printing drivel or you don't need a re-clocker. There
> doesn't appear to be a middle ground. I don't have a Benchmark and
> I've no plans to buy one so I've no point to prove. I'm just saying. :)
> Darren
This is what I want to believe: that the jitter issue is resolved by using a good DAC with low measured jitter. If that were simply the case, however, then why would the minimum length of the digital cable matter at all? If the DAC corrected any jitter issues, why worry about what's upstream of it? Discussions about minimizing jitter in the digital cable (e.g. by using lengths of 1.5 meters or more), or about re-clocking the signal before the DAC, lead me to believe there is more to reducing jitter than simply incorporating a low-jitter DAC. I would like to have a better understanding.

--
timequest
------------------------------------------------------------------------
timequest's Profile: http://forums.slimdevices.com/member.php?userid=25640
View this thread: http://forums.slimdevices.com/showthread.php?t=63796
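For what it's worth, the usual argument behind the "1.5 meters or more" rule is about reflections: an impedance mismatch at the DAC input reflects part of the signal edge back to the transport, where it reflects again; if that returning energy arrives while the edge is still rising, it shifts the point where the receiver detects the transition, i.e. it adds jitter. A longer cable pushes the reflection's arrival past the edge. The little sketch below just does the timing arithmetic for that argument. Note the velocity factor and the transmitter rise time are illustrative assumptions on my part (real values vary by cable and transport), so the exact crossover length it prints shouldn't be taken as gospel.

```python
# Timing sketch of the cable-length/reflection argument for S/PDIF.
# VELOCITY_FACTOR and RISE_TIME_S are assumed illustrative values,
# not measurements of any particular cable or transport.

C = 299_792_458            # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.66     # assumed for typical 75-ohm coax
RISE_TIME_S = 10e-9        # assumed transmitter edge rise time (10 ns)

def round_trip_delay(length_m: float) -> float:
    """Time for a reflection to travel to the far end of the cable and back."""
    return 2 * length_m / (C * VELOCITY_FACTOR)

for length in (0.5, 1.0, 1.5, 2.0):
    delay = round_trip_delay(length)
    when = "after" if delay > RISE_TIME_S else "during"
    print(f"{length} m cable: reflection returns after {delay * 1e9:.1f} ns "
          f"({when} the assumed {RISE_TIME_S * 1e9:.0f} ns edge)")
```

With these assumed numbers a 1.5 m run comfortably clears the edge while a 0.5 m run does not, which matches the shape of the argument; a faster or slower edge moves the crossover length accordingly. None of this settles whether a given DAC's input stage rejects the resulting jitter, which is the original question.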
_______________________________________________
audiophiles mailing list
audiophiles@lists.slimdevices.com
http://lists.slimdevices.com/mailman/listinfo/audiophiles