Why do digital cables sound different?


I have been talking to a few e-mail buddies and have a question that isn't being satisfactorily answered thus far. So...I'm asking the experts on the forum to pitch in. This has probably been asked before, but I can't find any references for it. Can someone explain why one DIGITAL cable (coaxial, BNC, etc.) can sound different from another? There are also similar claims for Toslink. In my mind, we're just trying to move bits from one place to another. Doesn't the digital stream get reconstituted and re-clocked on the receiving end anyway? Please enlighten me and maybe send along some URLs for my edification. Thanks, Dan
danielho
I think I am repeating myself, but I continue to get replies implying that the only reason digital cables could sound different is bit errors. That is not the reason at all - the reason is jitter: noise-based and time-based distortion. The problem is not that distortion causes a DAC to read a 0 as a 1, or a 1 as a 0. It is that we are talking about real-time transmission, and a DAC produces harmonic distortion at its output when the arrival times of the 0s and 1s are not perfectly, regularly spaced. I am running out of different ways to say this.

It is not about redundancy so that the data can be resent when an error occurs - we are not talking about data packet transmission here. Bandwidth capability is, however, an issue. Even though the bandwidth required is low by most standards, if a cable were only just able to transfer the data accurately, the square waves would be very rounded indeed and jitter errors at the DAC would be enormous. Higher-bandwidth cables preserve sharper corners on the square wave, with less undershoot or overshoot. Optical cables are also free from earth noise adding to the signal. It is not about bit errors; it is about timing-based distortion.

I work with loads of PhD telecommunications engineers, but their grasp of these concepts is slight at best, because it is irrelevant to the audio fidelity needs of telephony and irrelevant to data packet transmission. Even the best of them acknowledge that their training is insufficient for high-quality audio.
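A rough way to see why clock timing matters, separate from bit errors: if every bit is read correctly but the DAC's conversion instants wobble, the output waveform is still wrong. The little sketch below (my own illustration, not anything from this thread - the function name, jitter levels, and 10 kHz test tone are all assumptions) perturbs the sample instants of a sine wave with Gaussian jitter and measures the resulting error power relative to the signal.

```python
# Illustrative sketch: distortion from sampling-clock jitter alone,
# with every bit value assumed to be received correctly.
import numpy as np

def jitter_distortion_db(jitter_rms_s, fs=44100.0, f0=10000.0, n=8192, seed=0):
    """Error power (dB, relative to the signal) when a sine is converted
    at jittered instants instead of the ideal, evenly spaced ones."""
    rng = np.random.default_rng(seed)
    t_ideal = np.arange(n) / fs
    t_jittered = t_ideal + rng.normal(0.0, jitter_rms_s, n)
    ideal = np.sin(2 * np.pi * f0 * t_ideal)
    # A jittered clock effectively outputs the signal value at the wrong
    # instants, so the error is the difference between the two waveforms.
    err = np.sin(2 * np.pi * f0 * t_jittered) - ideal
    return 10 * np.log10(np.mean(err**2) / np.mean(ideal**2))

d_1ns = jitter_distortion_db(1e-9)      # 1 ns RMS jitter
d_100ps = jitter_distortion_db(100e-12) # 100 ps RMS jitter
print(d_1ns, d_100ps)
```

Tenfold less jitter buys roughly 20 dB less distortion here, which is the expected first-order behaviour (error amplitude scales with jitter times signal frequency), so higher audio frequencies suffer first.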
Redkiwi: I get it. It is not only that you say yes and no, but "how" you say it as well. As far as I know nothing is perfect and there are always variances to be had. The atomic clock in Denver is pretty accurate, but I would assume that DACs and digital transmission lines are not even in the same ballpark. Perhaps when we get organic-based DACs all cables will sound the same?
Redkiwi, the overshoot and undershoot you speak of are caused by capacitance in the cable. While overshoot and undershoot themselves may not necessarily affect the DAC output signal, the capacitance in the cable may affect the data pulses' rise time. This effect, I would assume, may be audible. While jitter is a degrading factor in a system of this type, the transmission cable is not likely to increase or add jitter to the bitstream. I feel capacitance is the real culprit here. The less capacitance in the cable the better.
Ehider, I am still the skeptic, but I appreciate your viewpoint and candor. BTW, is your employer Burr-Brown? Thanks for the reply.
I am not sure we are disagreeing - saying over- and undershoot are a capacitance issue is the same as saying it is a bandwidth issue (though not necessarily the reverse). So far as I know there is no capacitance issue wrt glass cables, yet they sound different from one another.