Why do digital cables sound different?


I have been talking to a few e-mail buddies and have a question that isn't being satisfactorily answered thus far. So...I'm asking the experts on the forum to pitch in. This has probably been asked before, but I can't find any references for it. Can someone explain why one DIGITAL cable (coaxial, BNC, etc.) can sound different than another? There are also similar claims for Toslink. In my mind, we're just trying to move bits from one place to another. Doesn't the digital stream get reconstituted and re-clocked on the receiving end anyway? Please enlighten me, and maybe send along some URLs for my edification. Thanks, Dan
danielho
Blues_man - Given the inadequacies of the CD protocol and your experience with digital audio data transfer interfaces, what benefits does a high-end digital interconnect bring over a basic, well-built digital interconnect? That was really the question at the beginning of this post, and I think we're all still curious. -Kirk
The problem is really in the transmitter and receiver. Even though there is a "standard", there are always differences in the hardware implementation. I started out believing that there was no difference in digital cables; that was before I was familiar with the CD standard, which isn't very robust. After a while I started measuring lots of cables and interfaces just out of curiosity. It's not easy to determine which cable sounds best with a particular transport / DAC combo, so you have to rely on what other people have tried. First, definitely go with the AES/EBU interface over an RCA cable; it's not that much more. The one I liked best was the MIT reference. It's very expensive, $800, and I don't believe it's worth the money unless cost is no object. I'll be putting the one I tested up for sale here soon (at least half off). As I posted in another thread, the best thing is to get a transport / DAC with a custom interface; I got the Spectral because I thought it sounded best. If you have access to a scope, use the method I suggested above with a high-frequency tone. Whichever cable gives the most accurate signal is probably the best.
I understand that the transmission is real time. I just do not believe a cable that is in good working order will cause a bit or two to be dropped. I agree with Blues_Man that the problem is probably in the transmitter or receiver if bits are being lost.

I ran a quick test on my 3 digital cables at work today using a logic analyzer (a scope is not the right tool for a real-time test) with a pattern generator and deep memory (16 meg). I simply output a 16-bit burst every 0.5 seconds, with the rep rate within the burst set to 10 kHz. I tied the pattern generator's clock to the logic analyzer's clock so that every time the pat gen's clock went high I captured the data, until I filled up the memory and saved the data. I tried this with alternating 0's and 1's, all 0's, all 1's, a pattern of 0110111011001111, and its complement. Once I had captured the data I saved it as ASCII files, and I wrote a small Visual Basic program to look for missing bits in the patterns. It found none.

I also fed a repeating pattern of 0's and 1's into the cables and terminated each cable with what I approximated was the impedance of my D/A. I looked at the waveforms with a scope, checking for any droop, overshoot, and undershoot. The risetime of the pulses appeared to be close on all 3 cables, but I did notice some other differences. One cable in particular caused more overshoot than the rest, but when I varied the impedance used to terminate the cable I could minimize the overshoot (probably more capacitance in this cable causing an impedance mismatch).

I marked this cable and gave all 3 cables to another engineer, who has a separate DAC and transport, to take home and see if the cables sound any different from one another. I am sorry, but I did not hear very much of a difference between the cables to begin with; I thought this would be a more subjective test. As for real time and losing bits, the logic analyzer does not lie. I will let you know what he thinks.
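For anyone curious, the missing-bit check described above (the original was a small Visual Basic program run against the logic analyzer's ASCII dumps) could be sketched in Python roughly like this. The function name, the capture format (a string of '0'/'1' characters), and the repeat counts are my own illustrative assumptions, not the poster's actual code:

```python
# Sketch of a missing-bit check on a logic analyzer capture.
# Assumes the dump is a string of '0'/'1' characters that should be
# the given pattern repeated end to end. A flipped bit shows up as a
# single mismatch; a dropped bit shifts everything after it, so it
# shows up as a run of mismatches starting at the drop point.

def find_missing_bits(captured: str, pattern: str) -> list[int]:
    """Return positions where the capture deviates from the repeating pattern."""
    errors = []
    for i, bit in enumerate(captured):
        expected = pattern[i % len(pattern)]
        if bit != expected:
            errors.append(i)
    return errors

# The test pattern from the post, repeated a few times as a clean capture:
pattern = "0110111011001111"
clean = pattern * 4
print(find_missing_bits(clean, pattern))   # a clean capture reports no errors

# A capture with one flipped bit is flagged at that position:
corrupted = clean[:20] + ("0" if clean[20] == "1" else "1") + clean[21:]
print(find_missing_bits(corrupted, pattern))
```

A real version would also compare capture length against the expected length, since a dropped bit shortens the record as well as shifting it.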
I cannot think of a better way to test the real-time data transmission characteristics of a cable. I burned up today's lunchtime trying this; tomorrow I think I will have something to eat :-) Thanks for the post.
One more thing: the logic analyzer I used actually had 1 meg of memory, not 16 meg. Someone borrowed my 16 meg acquisition module.
I no longer have an analyzer, but when I did, it was hooked up to CD players and transport / DACs. You can't just send bit patterns with your equipment, because it's probably far more sophisticated than what's in a CD playback system. Remember, you also have to send 44,100 samples per second. A scope works fine at the output of the DAC to determine differences in the analog output of test signals: if you use the same transport, same DAC, and same source, any differences must be from the cable. Also, disconnect the destination end of the cable and put a volt meter on it, and notice the large differences.
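To put a number on that 44,100 samples per second: per the standard S/PDIF framing (IEC 60958), each stereo sample is one frame of two 32-bit subframes (audio bits plus preamble, status, and parity), and biphase-mark coding means the line can toggle at up to twice the bit rate. A quick back-of-the-envelope:

```python
# Back-of-the-envelope S/PDIF data rate at the CD sample rate.
# Framing per IEC 60958: one frame per stereo sample, made of two
# 32-bit subframes (left + right channel).
sample_rate = 44_100        # samples per second (CD standard)
subframes_per_frame = 2     # left + right channel
bits_per_subframe = 32

bit_rate = sample_rate * subframes_per_frame * bits_per_subframe
print(f"S/PDIF bit rate at 44.1 kHz: {bit_rate:,} bits/s")   # 2,822,400

# Biphase-mark coding guarantees at least one transition per bit cell,
# so the signal on the wire can toggle at up to twice the bit rate:
max_transition_rate = 2 * bit_rate
print(f"Max transition rate on the wire: {max_transition_rate:,} Hz")   # 5,644,800
```

So the interface is pushing transitions in the several-MHz range continuously, which is why a simple low-rate burst test doesn't fully exercise what a cable sees in a real playback system.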