Hi, I'm the original poster on this thread.  Thanks for the thoughtful
comments, everyone.

First off, I appreciate Phil Leigh's insights; he pointed out two
mechanisms that could plausibly generate an audible difference,
reminding me that both paths start with the optical disc.  That is:

path A: optical disc --> Nakamichi CD player optical pickup subsystem
--> decoding and error correction (single pass) --> synchronous
bitstreams --> S/PDIF waveform encoding --> TacT 2.1S processor -->
Mark Levinson 360S DAC

path B: optical disc --> computer CD-ROM drive optical pickup subsystem
--> decoding and error correction (with optical drive retry, if needed)
--> harddrive --> S/PDIF waveform encoding --> TacT 2.1S processor -->
Mark Levinson 360S DAC

Path B will likely yield the more accurate data, as it can retry
reading bad sectors, whereas a CD player typically "presses on
regardless" and uses tricks such as interpolation to bridge over known
bad data.  Different data means different audio, and it's reasonable to
think that more accurate data will correspond to a more appealing aural
impression.
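For the curious, that kind of interpolation concealment can be sketched
roughly as follows.  This is a toy illustration in Python, not any
particular player's actual algorithm; the function name and the idea of
passing a set of "flagged bad" sample indices are my own inventions for
the example:

```python
def conceal(samples, bad):
    """Replace samples flagged as uncorrectable by linearly
    interpolating between the nearest good neighbours.
    `bad` is a set of indices the error corrector gave up on."""
    out = list(samples)
    n = len(out)
    i = 0
    while i < n:
        if i in bad:
            # find the end of this run of bad samples
            j = i
            while j < n and j in bad:
                j += 1
            left = out[i - 1] if i > 0 else (out[j] if j < n else 0)
            right = out[j] if j < n else left
            span = j - i + 1
            for k in range(i, j):
                t = (k - i + 1) / span
                out[k] = round(left + (right - left) * t)
            i = j
        else:
            i += 1
    return out
```

The point is simply that the player outputs *plausible* data, not the
data on the disc, so two devices can legitimately deliver different bits
from the same pressing.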

As a previous poster suggested, recording and comparing S/PDIF data
should be able to confirm or reject the hypothesis that path B's data
are a more reliable representation of the music.  What I would expect to
see is rock-solid repeatability in the S/PDIF stream originating from
the server, and run-to-run variability in the datastream originating
from the CD player.  Now, if only I had an S/PDIF recorder...

Another point Phil makes is that path A has the potential for a noisier
S/PDIF signal, due to EMI from the CD player's electromechanical
components.  A noisier S/PDIF signal can result in increased jitter in
the recovered data clock, and increased DAC clock jitter is known to
make for a less pleasing aural experience.
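As a back-of-the-envelope check on why jitter is audible at all: the
worst-case amplitude error from a sampling-time error of dt on a
full-scale sine of frequency f scales as 2*pi*f*dt (the standard
slope-times-timing-error approximation, not a measurement of my gear):

```python
import math

def jitter_error_db(freq_hz, jitter_s):
    """Worst-case error, in dB relative to full scale, introduced by
    a given clock jitter when sampling a full-scale sine at freq_hz.
    Uses the slope approximation: error amplitude ~ 2*pi*f*jitter."""
    err = 2 * math.pi * freq_hz * jitter_s
    return 20 * math.log10(err)

# e.g. 1 ns of jitter on a 10 kHz full-scale tone:
# 2*pi*1e4*1e-9 ~ 6.3e-5, i.e. roughly -84 dBFS
```

So nanosecond-scale jitter puts artifacts in the -80 to -90 dBFS range
on high-frequency content, which is at least plausibly in the region
where golden ears claim to hear differences.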

JezA strongly suggests that I try taking my TacT processor out of the
path and seeing if that makes a difference.  That's a reasonable
troubleshooting experiment, and I will try that as opportunity permits.
I don't expect a difference, though, because I'm using the TacT
processor in digital-in / digital-out mode, so no D/A conversion ever
occurs inside the box.  True, there is clock recovery
happening as part of the conversion from S/PDIF waveform back to
digital data stream for processing, but I do not expect that jitter is
affecting the integrity of the bit stream content (i.e., I do not
expect misclocking of data and changed data words -- this would be
HIGHLY audible as impulsive pops in the audio when it is converted to
analog downstream).  Also, the Mark Levinson 360S employs a FIFO buffer
and locally generated clock that obviates the S/PDIF waveform (generated
at the output of the TacT) as a source of clock jitter. 

So, there's no smoking gun yet that confirms or refutes my impression,
but at least there's a plausible (to me) hypothesis on what's going on.

--Steve


-- 
bhr1439
------------------------------------------------------------------------
bhr1439's Profile: http://forums.slimdevices.com/member.php?userid=22189
View this thread: http://forums.slimdevices.com/showthread.php?t=57173

_______________________________________________
audiophiles mailing list
audiophiles@lists.slimdevices.com
http://lists.slimdevices.com/lists/listinfo/audiophiles
