Why do digital cables sound different?


I have been talking to a few e-mail buddies and have a question that hasn't been satisfactorily answered thus far. So... I'm asking the experts on the forum to pitch in. This has probably been asked before, but I can't find any references for it. Can someone explain why one DIGITAL cable (coaxial, BNC, etc.) can sound different from another? There are also similar claims for Toslink. In my mind, we're just trying to move bits from one place to another. Doesn't the digital stream get reconstituted and re-clocked on the receiving end anyway? Please enlighten me and maybe send along some URLs for my edification. Thanks, Dan
danielho
1439 - I agree with your point 100% and have made it myself many times - if differences are heard using different cables, the bits must be being altered (a bad thing), and if they are, then we as consumers should be demanding a better technology for the interface, not spending a bunch of money on cables and transports. As you say, you can transfer bits perfectly in the computing world with good-quality but relatively cheap cabling and absurdly inexpensive hardware. If a DAC with an ethernet input interface were available, it would be easy to set up a whole-house music distribution system that performs as well as (or better than, if transport technology is as spotty as it appears to be) the best transports. Hopefully such an interface isn't too far off in the future.
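To see how trivially the computing world verifies bit-perfect transfer, here's a quick Python sketch (the file names are hypothetical; any file copied across a LAN will do):

import hashlib

def sha256_of(path):
    # Hash the file in chunks so large audio files don't need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# If the cheap cable or hardware had altered even one bit in transit,
# the digests would not match.
print(sha256_of("original.wav") == sha256_of("copy_over_lan.wav"))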
You make a mistake if you think that just transmitting the bits accurately is all that is required. Jitter (or time-based distortion) is irrelevant when there is no need for time-synchronous transmission, i.e. computer communications. But in audio or video you must deal with (or live with) time-based distortion. Don't fall for the marketing BS that says a Levinson DAC or a Discman eliminates time-based issues through buffering. And by the way, don't believe that doing away with cables eliminates the problem either - otherwise we would have stuck with the three-in-one music centres of the '60s. Bmpnyc, your dream came true forty years ago.
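To make "time-based distortion" concrete, here is a rough Python simulation (the 1 ns RMS jitter, the 1 kHz test tone, and the random seed are made-up illustrative numbers, not measurements of any real DAC). Every bit is correct; only the conversion instants wobble, which is equivalent to converting slightly wrong values at the right instants:

import math
import random

FS = 44_100          # sample rate, Hz
F = 1_000            # test tone, Hz
JITTER_RMS = 1e-9    # RMS timing error per sample, seconds (assumed)

random.seed(0)
err_sq = 0.0
N = 10_000
for n in range(N):
    t_ideal = n / FS
    t_jittered = t_ideal + random.gauss(0.0, JITTER_RMS)
    # The DAC outputs the right sample value at the wrong instant,
    # which the listener hears as the wrong value at the right instant.
    ideal = math.sin(2 * math.pi * F * t_ideal)
    actual = math.sin(2 * math.pi * F * t_jittered)
    err_sq += (actual - ideal) ** 2

rms_error = math.sqrt(err_sq / N)
print(f"RMS error from jitter alone: {rms_error:.2e}")

The error grows with both the jitter magnitude and the signal frequency: all bits arrive intact, yet the analog output deviates from the ideal waveform.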
bmpnyc... no offense taken. If there's a physical effect, then there's a physical cause. If there's no physical cause, there can be no physical effect. Those of us with the scientific mindset seek to comprehend the linkage. Understanding the linkage is every bit as enjoyable to me as other intangibles such as "beautiful design" and "great build quality".
Redkiwi - you're right that having time synchronization requirements makes the environment more demanding. However, as long as you have 1) a redundancy scheme and 2) sufficient resources above and beyond the demands of the basic application to support the redundancy scheme, then you can effectively eliminate the time-synchronous demands. The Levinson DAC / Discman buffering doesn't eliminate it because there's still no redundancy - if they send the data and it's not received correctly, there's no recovering the lost data. But if I have a 100Mbit ethernet connection and have to keep up with only the bandwidth necessary for CD playback, I can send / resend the data dozens of times if need be and still keep up. If I can transfer files across a LAN perfectly accurately at 10Mbit/sec, I should be able to transfer music "files" perfectly at a rate of 1.5Mbit/sec. If current transport/DAC interconnect technology can't perform this same feat, we should demand better.
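For the record, the ~1.5Mbit/sec figure checks out: 44,100 samples/s x 2 channels x 16 bits = 1,411,200 bits/s, about 1.41 Mbit/s. And here is a back-of-the-envelope Python sketch of the headroom argument (the 1% loss rate is an arbitrary, pessimistic assumption, and the "packets" are abstract, not any real protocol):

import random

random.seed(1)
LINK_RATE = 100e6        # 100 Mbit/s ethernet
AUDIO_RATE = 1.4112e6    # 16-bit / 44.1 kHz stereo PCM
LOSS = 0.01              # assumed: 1% of packets garbled per attempt

attempts = 0
packets = 10_000
for _ in range(packets):
    tries = 1
    while random.random() < LOSS:   # checksum fails -> resend
        tries += 1
    attempts += tries

overhead = attempts / packets
print(f"average sends per packet: {overhead:.3f}")
print(f"bandwidth needed: {overhead * AUDIO_RATE / 1e6:.2f} Mbit/s "
      f"of {LINK_RATE / 1e6:.0f} Mbit/s available")

Even at that loss rate the link needs barely more than the raw audio rate, and with roughly 70x headroom a modest receive buffer hides the retransmissions entirely - the "real-time" problem becomes an ordinary data-transfer problem.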
First of all, the different digital interfaces have different bandwidths. Within a single interface type, even slight imperfections can cause signal loss. Only the AT&T optical interface has enough bandwidth to handle all the data correctly. Yes, this is lame by today's standards, but it was leading edge 20 years ago.