Why do digital cables sound different?


I have been talking to a few e-mail buddies and have a question that isn't being satisfactorily answered thus far. So... I'm asking the experts on the forum to pitch in. This has probably been asked before, but I can't find any references for it. Can someone explain why one DIGITAL cable (coaxial, BNC, etc.) can sound different from another? There are similar claims for Toslink as well. In my mind, we're just trying to move bits from one place to another. Doesn't the digital stream get reconstituted and re-clocked on the receiving end anyway? Please enlighten me, and maybe send along some URLs for my edification. Thanks, Dan
danielho
Geez... sorry for the brain fart in my posting above: 1/4 of a cycle is 90 degrees of phase, not 45. Try visualizing a sine wave that's sampled 8 times per cycle and rebuilt using 8 stair steps. Now suppose that some of the stair-step risers come a teeny bit early or a teeny bit late, and you'll get the picture.
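To make the stair-step picture concrete, here is a small sketch (the numbers are illustrative, not from any real DAC) of how a timing slip on the risers turns into an amplitude error. Because the slip is measured as a fraction of one cycle, the same absolute timing error eats a bigger slice of a shorter cycle, so the error grows with signal frequency:

```python
import math

# Hypothetical illustration: a sine wave sampled 8 times per cycle and
# rebuilt as 8 stair steps. Jitter shifts each riser slightly in time,
# which is equivalent to an amplitude error at that instant.
SAMPLES_PER_CYCLE = 8

def stairstep_error(jitter_fraction):
    """Worst-case amplitude error when a riser lands early or late
    by jitter_fraction of one signal cycle (full scale = 1.0)."""
    worst = 0.0
    for n in range(SAMPLES_PER_CYCLE):
        t_ideal = n / SAMPLES_PER_CYCLE
        t_jittered = t_ideal + jitter_fraction
        ideal = math.sin(2 * math.pi * t_ideal)
        actual = math.sin(2 * math.pi * t_jittered)
        worst = max(worst, abs(actual - ideal))
    return worst

# The same timing slip is a larger fraction of a cycle at higher
# frequencies, so the amplitude error gets worse.
print(stairstep_error(0.02))   # small slip -> small amplitude error
print(stairstep_error(0.10))   # bigger slip -> much bigger error
```

The worst-case error lands near the zero crossings, where the sine wave changes fastest, which is one intuition for why jitter is an amplitude-domain problem and not just a pitch problem.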
Nice explanation, 1439bhr! You seem to be the first poster who really knows completely what they're talking about (I include myself among the uninformed). Very well put and easily understandable. I did catch the 1/4-cycle slip, but I knew what you meant. Great job!
1439bhr, I've got a question I am sure you can answer. Does the low-pass filter on the output of the DAC also handle any aliasing that may be present? Also, how much phase noise is acceptable in the system before there is any degradation, and does phase noise closer to the carrier or farther from the carrier have the greater effect? Thanks in advance for your response.
Using a FIFO and reclocking is not the end-all, and it is not so simple. When the Genesis Digital Lens came out, Bob Harley assumed that all transports would sound exactly the same; he found that he was wrong. Rather than reclock, the best solution, given the current CD methodology, is to have a custom data-transfer mechanism like Levinson and Spectral use. I believe the SPDIF interface is inadequate for high-quality playback.

I did some testing about 7-8 years ago to prove that there was no difference in transports or digital cables and published that data on the net. At the time I had only worked with proprietary digital record and playback protocols and was not that familiar with the CD playback mechanism. My tests proved just the opposite of what I expected. Not only that, but subjectively the tests seemed to show that jitter was not the most important of the digital flaws; both read errors from discs and loss of high frequencies seemed to be more of a problem.

While buffering is universal in Discman-type players, and basically required for a device that bounces around, every one I've ever owned recommended turning the buffering off when it wasn't needed because the sound was not as good as the direct signal. I still believe that the fewer components between the transport and the DAC, the better. Buffering of CD data has all sorts of other issues that need to be handled, like skipping forward and back, that make it impractical, since you always have to handle the case where the buffer is empty. All the systems I have designed, and most commercial systems with proprietary protocols in general, send raw data to the DAC; the DAC interface has the only clock, and jitter is at an absolute minimum.

Jitter is a fact of life in any digital medium; clocks have been with us since the digital technologies of the '60s. I always get a kick out of manufacturers who claim jitter-reduction devices that produce 0 (zero) jitter. It's physically impossible to measure jitter with perfect accuracy anyway.
We can improve the accuracy, but there is always some small error in the measurement, and that error will decrease as technology improves. By the way, jitter errors have a much more detrimental effect than just causing pitch errors: any loss of integrity in the audio signal affects soundstage and imaging, and if jitter is so bad that the DAC syncs incorrectly on the signal, severe static can be produced. See my previous postings.
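On the earlier buffering point, a minimal sketch (hypothetical class and names, not any commercial design) of why a FIFO complicates playback: the DAC side drains at its own clean clock, a seek or skip can empty the buffer, and the empty case has to be handled explicitly:

```python
from collections import deque

class ReclockFIFO:
    """Toy FIFO between a jittery transport clock and a clean DAC clock."""
    def __init__(self, capacity):
        self.buf = deque(maxlen=capacity)

    def push(self, sample):
        # Transport side: samples arrive on the (jittery) incoming clock.
        self.buf.append(sample)

    def pop(self):
        # DAC side: drained on the clean local clock. After a skip the
        # buffer may be empty; emit silence rather than stale garbage.
        if not self.buf:
            return 0
        return self.buf.popleft()

fifo = ReclockFIFO(capacity=4)
for s in (10, 20, 30):
    fifo.push(s)
print([fifo.pop() for _ in range(5)])   # -> [10, 20, 30, 0, 0]
```

The underrun branch is exactly the "buffer is empty" case the poster objects to: every skip, pause, or track change forces the design to decide what the DAC hears while the buffer refills.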
I have read your post and cannot dispute your claims, with the exception of one: jitter is not very difficult to measure accurately if you understand the measurement concept. Jitter measurements can be made simply with a spectrum analyzer; all that is needed to convert phase noise to jitter is a simple calculation. A spectrum analyzer can be used to measure phase noise if the device under test has little or no AM, because the analyzer cannot tell the difference between AM and phase noise.
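For readers who want to see the "simple calculation", here is a hedged sketch of the standard conversion: read single-sideband phase noise L(f) in dBc/Hz off the analyzer at a few offsets from the carrier, integrate its linear form over the offset band, and scale by the carrier frequency to get RMS jitter. The example values are illustrative, not measured:

```python
import math

def rms_jitter(offsets_hz, l_dbc_per_hz, carrier_hz):
    """RMS jitter in seconds from SSB phase noise samples.

    offsets_hz    : offset frequencies from the carrier (ascending)
    l_dbc_per_hz  : phase noise L(f) in dBc/Hz at those offsets
    carrier_hz    : carrier (clock) frequency

    Trapezoidal integration of 10^(L/10) over the offset band, then
    sigma_t = sqrt(2 * area) / (2 * pi * f_carrier).
    """
    area = 0.0
    for i in range(len(offsets_hz) - 1):
        df = offsets_hz[i + 1] - offsets_hz[i]
        y0 = 10 ** (l_dbc_per_hz[i] / 10)
        y1 = 10 ** (l_dbc_per_hz[i + 1] / 10)
        area += 0.5 * (y0 + y1) * df
    return math.sqrt(2 * area) / (2 * math.pi * carrier_hz)

# Illustrative example: flat -120 dBc/Hz from 1 kHz to 100 kHz offset
# on an 11.2896 MHz audio master clock (256 x 44.1 kHz).
jitter_s = rms_jitter([1e3, 100e3], [-120.0, -120.0], 11.2896e6)
print(jitter_s)   # a few picoseconds RMS
```

This also shows why the poster's AM caveat matters: the integration treats everything in the sidebands as phase noise, so any AM the analyzer picks up inflates the jitter number.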