Why do digital cables sound different?


I have been talking to a few e-mail buddies and have a question that isn't being satisfactorily answered thus far. So...I'm asking the experts on the forum to pitch in. This has probably been asked before, but I can't find any references for it. Can someone explain why one DIGITAL cable (coaxial, BNC, etc.) can sound different from another? There are also similar claims for Toslink. In my mind, we're just trying to move bits from one place to another. Doesn't the digital stream get reconstituted and re-clocked on the receiving end anyway? Please enlighten me, and maybe send along some URLs for my edification. Thanks, Dan
danielho
I have read your post and cannot dispute your claims, with the exception of one. Jitter is not very difficult to measure accurately if you understand the measurement concept. A jitter measurement can be made simply with a spectrum analyzer and a calculation: all that is needed to convert phase noise to jitter is a straightforward computation. A spectrum analyzer can be used to measure phase noise provided the device under test has little or no AM, because the analyzer cannot tell the difference between AM and phase noise.
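The phase-noise-to-jitter conversion mentioned above can be sketched numerically. A minimal example, assuming everything below is invented for illustration (the clock frequency and the noise profile are not measurements of any real device): integrate the single-sideband phase noise L(f) over the offset band to get phase variance, then divide the resulting RMS phase by the carrier's angular frequency.

```python
import math

def rms_jitter_seconds(carrier_hz, offsets_hz, phase_noise_dbc_hz):
    """Convert a single-sideband phase-noise profile L(f), given in
    dBc/Hz at the listed offset frequencies, into RMS jitter:
        sigma_t = sqrt(2 * integral of 10^(L/10) df) / (2*pi*f0)
    The integral is approximated by trapezoidal summation."""
    lin = [10.0 ** (l / 10.0) for l in phase_noise_dbc_hz]
    area = sum((offsets_hz[i + 1] - offsets_hz[i]) * (lin[i] + lin[i + 1]) / 2.0
               for i in range(len(offsets_hz) - 1))
    return math.sqrt(2.0 * area) / (2.0 * math.pi * carrier_hz)

# Hypothetical profile for an 11.2896 MHz audio master clock (256 x 44.1 kHz):
offsets = [100.0, 1e3, 10e3, 100e3, 1e6]      # offset from carrier, Hz
l_dbc = [-90.0, -110.0, -125.0, -135.0, -140.0]  # L(f), dBc/Hz
print(rms_jitter_seconds(11.2896e6, offsets, l_dbc))
```

This is exactly the "simple calculation" the telecom world uses: the analyzer supplies L(f), and the integration turns it into a single RMS jitter number.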
The Levinson DAC provides SPDIF, AES/EBU, and Toslink input (and ATT, I believe) interfaces. It does not support other "custom" interfaces such as I2ES, which attempt to deliver accurate, jitter-free clock signals to the DAC. Problems with the transport, such as dropped data bits, are a separate issue from the effect of digital cables. A back-of-the-envelope calculation reveals that clock jitter needs to be less than 100 picoseconds to ensure that all jitter artifacts are at least 96 dB down from a signal at 20 kHz (the most stressing case). Bob Harley notes in his book "Guide to High End Audio" that several popular clock recovery chips, based on PLLs, produce about 3-4 nanoseconds of jitter (about 16 dB worse than the 100 ps reference point). Delivering a 44.1 kHz clock signal to a DAC while assuring jitter less than 100 ps implies a stability of about 4 parts per million, which is achievable with competent design. Devices such as a Digital Time Lens, which provide an SPDIF output, remain at the mercy of clock-recovery jitter at the DAC. The best they can hope to do is to stabilize the transmit-end clocking as seen by the clock recovery circuit.
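The back-of-the-envelope numbers above can be checked with the standard worst-case relation for sampling a full-scale sine in the presence of jitter: the error-to-signal ratio is 2*pi*f*t_j. A quick sketch:

```python
import math

def jitter_artifact_db(signal_hz, jitter_s):
    """Worst-case artifact level when sampling a full-scale sine at
    signal_hz with RMS jitter jitter_s: error/signal = 2*pi*f*t_j,
    expressed here in dB relative to the signal."""
    return 20.0 * math.log10(2.0 * math.pi * signal_hz * jitter_s)

# 100 ps of jitter on a 20 kHz full-scale tone (the most stressing case):
print(jitter_artifact_db(20e3, 100e-12))   # about -98 dB, i.e. below -96 dB

# 100 ps relative to one 44.1 kHz clock period, in parts per million:
print(100e-12 * 44.1e3 * 1e6)              # about 4.4 ppm
```

Both figures agree with the post: 100 ps keeps artifacts roughly 98 dB down at 20 kHz, and corresponds to about 4 ppm of a 44.1 kHz clock period.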
Gmkowal, two points here. It ought to be obvious that your test is inadequate; simply use a CD source. It should take about 4 seconds, if memory serves, to fill a 1 meg buffer with SPDIF data. That's real time. When I say jitter cannot be measured accurately, I'm simply pointing out that any device that measures jitter has to have a clock of its own; that clock has jitter, so your measurement has to be off by some amount. That amount may be small, but it always exists. My point was that many claims made for jitter-reduction devices are total BS. 1439bhr, I thought the Levinson also had a proprietary data-transfer interface. My point was that some component combinations do, and they seem to do the best job. My points were simply that jitter-reduction devices are a poor substitute for a good implementation between a DAC and transport. They may even increase jitter. My second point is that the jitter on most "good" systems is low enough not to be a major factor in the sonic degradation. I believe jitter ranks third, behind errors in the transport and signal loss of high-frequency content.
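The "about 4 seconds for a meg" recollection can be sanity-checked with some quick arithmetic (assuming 16-bit, 44.1 kHz stereo; the answer depends on whether you count raw audio payload or full 64-bit SPDIF frames):

```python
# Two plausible readings of "1 meg of SPDIF data":
payload_rate = 44_100 * 2 * 2        # 16-bit stereo payload: 176,400 B/s
frame_rate   = 44_100 * 64 // 8      # full 64-bit SPDIF frames: 352,800 B/s
buffer_bytes = 1_000_000             # "1 meg" taken as 10^6 bytes

print(buffer_bytes / payload_rate)   # about 5.7 seconds of real time
print(buffer_bytes / frame_rate)     # about 2.8 seconds of real time
```

The two readings bracket the remembered 4-second figure, so the recollection is in the right ballpark either way.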
Blues, you are absolutely correct! The purpose of my test was not to simulate the real-time data transmission of a digital audio playback system, but merely to prove to a poster that the digital cable has little to do with making bits fall out. I chose to do an experiment so I would have physical evidence for what I was claiming, and it was fun to do. I do not have the time nor the equipment to simulate a real-life situation. I also agree that jitter is not a major contributor. The only thing I do take exception with is your claim that a clock is needed for jitter measurement. A spectrum analyzer can be used to measure phase noise, and a calculation can be made to get jitter from the phase-noise measurement. The calculation is well documented, and the Ma Bells of the world have been using this method for years. Thanks for your post and happy listening.
Thank you Bluesman, 1439, Gmkowal, Redkiwi and all of the above. The technical data strained my patience, but was well worth following. I am sure that some mysteries will always remain, but with things being this complicated, I will have to continue using my ears as my guide. I wonder if the sound that I prefer would show a smaller amount of the problems pointed out here? Very interesting how small pitch shifts affect the soundstage, Bluesman.