> Respectfully, I think you are fooling yourself when you say you 'hear'
> a difference when the audio is delivered from your disk to your sound
> card faster. It's the same audio delayed by 5.6 mS. As long as the
> system is set up correctly the data is completely unchanged except for
> the latency of getting it delivered. Since you ___cannot___ hear
> 'delay' except by relative measure - and you have ___no___ relative
> measure when doing only playback, it's impossible to hear whether I'm
> running at 1.2mS, 5.6mS or 25mS.
I love listening to music and I've been interested in stereo performance for many years. Lately the computer has gained acceptance in the "audiophile" world as a legitimate source component, and people with access to the very best stereo equipment have reported that a properly configured computer beats any CD player with regard to sound quality, even ludicrously priced ones. Because of this, audiophiles are switching over to computers, and websites like www.computeraudiophile.com are surging in popularity.

If you browse the forums there, you'll find people discussing the relative sound quality of hard disks, USB cables, and other computer components. I always ignored those threads. After all, 1 is 1 and 0 is 0.

Then I read a post by Gordon Rankin in which he claims to have measured higher jitter from "slow" computers. Gordon is the owner of Wavelength Audio and is generally considered the best designer of USB DACs. Then I read a post by Barry Diamente in which he claims that WAV sounds better than FLAC. Barry is an extremely well-regarded mastering engineer and a very level-headed guy. He didn't have an explanation for it, but he was sure it was true. I decoded some of my FLAC files to WAV and compared. Sure enough, I could detect a small difference (the decoded data should be bit-identical - see the P.S. below for a quick way to check). Then I read a post claiming that the real-time Linux kernel makes a small difference in the sound quality. After implementing real-time myself, I've also found that to be true.

According to my old concept of computer audio playback, none of these things should be true, but various people on the internet (some of them respected professionals) have found them to be true, and I have heard them for myself.

I found this: http://www.linuxfordevices.com/c/a/Linux-For-Devices-Articles/The-Linux-real-time-interrupt-patch/

"With a measured worst case latency of five microseconds and with a typical jitter below one microsecond at an interrupt period of up to 100 kHz an rtirq-enhanced linux kernel may be usable for a broad range of hard real time control loop applications."

It sounds like real-time lowers both latency and jitter. Jitter absolutely affects sound quality. I wonder if jitter could be the cause of all this.

I'm having some trouble with rtirq and I'm going to start a separate thread about it.

Thanks for reading,
Grant
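
P.S. For anyone who wants to check the "bits are bits" part for themselves, here is a rough Python sketch of one way to verify that a FLAC file decodes to exactly the same samples as the WAV it was made from. It's only a sketch: it assumes the third-party soundfile and numpy packages are installed (soundfile reads FLAC through libsndfile), and the file names are just placeholders.

# compare_flac_wav.py - rough sketch, not a polished tool.
# Reads both files with the third-party 'soundfile' package (libsndfile)
# and checks whether the decoded sample data is bit-identical.
import sys

import numpy as np
import soundfile as sf

def same_samples(wav_path, flac_path):
    # dtype="int32" keeps the comparison in integers, so there is no
    # floating-point rounding to worry about (covers 16- and 24-bit files).
    wav_data, wav_rate = sf.read(wav_path, dtype="int32")
    flac_data, flac_rate = sf.read(flac_path, dtype="int32")
    return wav_rate == flac_rate and np.array_equal(wav_data, flac_data)

if __name__ == "__main__":
    wav_file, flac_file = sys.argv[1], sys.argv[2]
    if same_samples(wav_file, flac_file):
        print("Identical samples - any audible difference is not in the data.")
    else:
        print("Samples differ - the files do not hold the same audio data.")

Run it as "python compare_flac_wav.py track01.wav track01.flac". If it reports identical samples, then whatever difference is audible has to come from how the data is delivered (timing, jitter), not from the data itself - which is exactly why the jitter idea interests me.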
