Digital XLR vs. Analog XLR - Balanced Cables


What is the difference between a digital XLR/balanced cable and an analog XLR/balanced cable?

What if I used an analog XLR/Balanced cable to carry a digital signal from the digital output of one device to the digital input of another device?

Any risks, damage, etc.?
ckoffend
As to characteristic impedance and bandwidth: the reason to specify a characteristic impedance for a cable is to reduce transmission line effects. T-line effects amount to standing waves, which only become important when the wavelength of the signal approaches the length of the cable – the distance the signal has to travel. So whether the specified characteristic impedance of a cable matters depends on the cable length. So does the bandwidth of the cable, for that matter, because the total capacitance is determined by the length of the cable. Between the two, as I explain below, bandwidth is going to be more important at the lengths we are talking about.

As to the reference to the Stereophile article – not exactly a reference that will add validity to a technical position when presenting it to an engineer. Next time try something more accepted in the scientific/engineering community, such as an IEEE journal, or even something published by the AES or the ARRL.

As to the transmission line effect, we are talking about interconnects here. I made the assumption that the lengths are somewhere in the neighborhood of less than 10 feet. T-line effects only kick in when the wavelength of the highest signal component approaches the length of the cable. Standing waves, if present, will tend to round off the edges of the square pulse; this is what causes jitter due to T-line effects. The purpose of selecting the characteristic impedance to match the source and load impedances is to get rid of T-line effects.

The highest significant component in the case of digital audio will be about 10 times the fundamental frequency of the signal, because at that frequency you have a nicely shaped square wave.

The wavelength of a 100 MHz signal is just under 10 feet, so you really aren't getting T-line effects until you approach that cable length, if we are talking about a signal with 100 MHz components. A safe rule of thumb is a 1 to 10 ratio, so one could argue that to completely eliminate the possibility of T-line effects the cable should be less than 1 ft long. However, the transmission rate of digital audio at a 96 kHz sample rate isn't 100 MHz. If you go out two decades, you are still at only 10 MHz, which is a wavelength of just under 100 ft. Hence, a 6 foot interconnect will not be a source of jitter due to T-line effects.
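The wavelength arithmetic above reduces to a few lines. This is a sketch of the post's own reasoning; free-space propagation is assumed (as in the post's figures), even though the velocity factor of real cable (~0.7) would make the wavelengths somewhat shorter:

```python
# Sketch of the wavelength rule of thumb from the post above.
# Free-space propagation assumed, matching the post's figures.

C_FT_PER_S = 983_571_056.0  # speed of light, in feet per second

def wavelength_ft(freq_hz: float) -> float:
    """Free-space wavelength in feet."""
    return C_FT_PER_S / freq_hz

def safe_length_ft(freq_hz: float) -> float:
    """1:10 rule of thumb: keep the cable under a tenth of a wavelength."""
    return wavelength_ft(freq_hz) / 10.0

# 100 MHz component: wavelength just under 10 ft, "safe" length under 1 ft.
# 10 MHz component: wavelength just under 100 ft, so by this argument a
# 6 ft interconnect is nowhere near transmission-line territory.
```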

More likely (but still not very likely) is that rounding of the pulse will be due to bandwidth limitations. If the cable has too high a capacitance, it can form a low-pass filter that starts rounding the square wave and creates jitter. The chances of that happening are also slight at the lengths we are talking about, but it is more likely than standing waves and does not depend on the creation of a standing wave. For that reason the bandwidth of the cable is the more important spec. A subtle difference, but there is a difference.
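As a rough sketch of that capacitance argument, the cable can be modeled as a first-order RC low-pass. The 30 pF/ft figure is an illustrative ballpark for audio cable, not a measured value, and the 110 ohm source impedance is borrowed from the AES/EBU spec discussed later in the thread:

```python
import math

CAP_PER_FT = 30e-12  # ~30 pF per foot: a typical ballpark, assumed here
R_SOURCE = 110.0     # ohms; AES/EBU source impedance used as the driving R

def cutoff_hz(length_ft: float) -> float:
    """-3 dB corner of the RC low-pass formed by source R and total cable C."""
    c_total = CAP_PER_FT * length_ft
    return 1.0 / (2.0 * math.pi * R_SOURCE * c_total)

# A 6 ft run gives C_total = 180 pF and a corner around 8 MHz, comfortably
# above the ~5.6 MHz biphase bandwidth of a 44.1 kHz stream, so the rounding
# is mild. A 50 ft run drops the corner to roughly 1 MHz, where it matters.
```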

But on the practical side – it just doesn't matter. A cable made for analog transmission will work fine up to about 50 feet, and most interconnects for home audio are not that long.

Remember also that jitter only becomes a problem at the conversion. Circuitry at the converter should reconstitute the clock and reject any jitter that is not extreme.

Is it a big deal? No – not if you are purchasing the ICs new. I haven't priced digital vs. analog ICs, but there is no reason one should be significantly more expensive than the other. Furthermore, since 110 ohm low-capacitance cable is not going to cost significantly more in 6 foot lengths, and it will work just as well for analog, my guess is that reputable sellers simply make up all their cables from the same cable and connectors and just charge a small amount more to sell you one cable rather than two; i.e. $60/pair vs. $35 each. Not unfair.
Whenever an electromagnetic wave encounters a change in impedance (an impedance boundary), some of the signal is transmitted and some is reflected. The reflected signal creates all sorts of shape distortions – overshoots, oscillations and staircases (see Bergeron diagrams). A rule of thumb says you can treat a line (cable) as being in the low-frequency domain when t_rise > 6t, where t is the line delay. A signal travels through the conductor at about 70% of the speed of light, covering 1 m in about 4.8 ns. Multiplying this by 6 gives us 29 ns; for a 2 m interconnect it is 58 ns, and for 3 m it's 87 ns (50 ft would be a disaster – 438 ns). Most output drivers switch in under 29 ns (much less 438 ns), therefore we have transmission line effects. Having the designer select a slower driver wouldn't do any good, because it creates noise-induced jitter on the receiving end. The receiving end has either asynchronous reclocking (in upsampling DACs) or a dual PLL (in the rest of them). A PLL, even a dual one, works poorly for fast jitter.
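The t_rise > 6t test above reduces to a one-line check. The 4.8 ns/m one-way delay is the post's own figure (derived from ~70% of the speed of light):

```python
NS_PER_M = 4.8  # one-way cable delay at ~70% of c, as quoted in the post

def lumped_ok(length_m: float, t_rise_ns: float) -> bool:
    """True if the cable may be treated as lumped (no transmission-line effects)."""
    return t_rise_ns > 6.0 * length_m * NS_PER_M

# 1 m needs t_rise > 28.8 ns, 3 m needs > 86.4 ns, and 50 ft (~15.2 m)
# needs > ~439 ns -- far slower than typical 10-25 ns output drivers.
```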

I still recommend the Stereophile article – it might not be up to your standards (as an engineer and/or scientist), but at least it is not as boring as IEEE stuff, and one can even understand it for a change. And it is audio related – have I mentioned that?
Funny, I have found IEEE journals to be understandable. As to the attempted explanation in your last post – it makes no sense.
I do not know what doesn't make sense to you. The frequency of the signal transmitted in the cable has nothing to do with transmission line effects, no matter what multiplier you put on top of it. What is important is the fastest rise time appearing. You can transfer a 10 Hz square wave and still have transmission line effects. A rise time of about 25 ns is very common in output drivers (most of them), and some are even on the order of 10 ns. Using t_rise > 6t is a very common test of whether a line is a transmission line, and you will find it in many publications. Your 50 ft cable (analog or digital) is a transmission line (a very bad one) for a typical output driver. And no – reflections do not round the edges; they create complete havoc by creating overshoots, ringing, staircases, etc. If you believe that cables make no difference, just say so, and do not bring pseudo engineering/scientific arguments here, because somebody will always call you on it. As for the IEEE – I don't read their journal but was at their meetings – not eager to go back.
Ckoffend – maybe this whole thing is too technical, but it is important to understand that a 192 kHz signal is really transmitted at about 25 MHz. Maybe the part of the article below will explain better why digital cables exist. I don't want to engage further in this discussion, since it's becoming counterproductive, and I'm signing off.

"This article is from the Audio Professional FAQ, with numerous contributions by Gabe M. Wiener.

5.8 - What kind of AES/EBU or S/P-DIF cables should I use? How long can I run them?

The best quick answer is which cables you should NOT use!

Even though AES/EBU cables look like ordinary microphone cables, and S/P-DIF cables look like ordinary RCA interconnects, they are very different.

Unlike microphone and audio-frequency interconnect cables, which are
designed to handle signals in the normal audio bandwidth (let's say that
goes as high as 50 kHz or more to be safe), the cables used for digital
interconnects must handle a much wider bandwidth. At 44.1 kHz, the digital
protocols are sending data at the rate of 2.8 million bits per second,
resulting in a bandwidth (because of the biphase encoding method)
of 5.6 MHz.

This is no longer audio, but falls in the realm of bandwidths used by
video. Now, considerations such as cable impedance and termination become
very important, factors that have little or no effect below 50 kHz.

The interface requirements call for the use of 110 ohm balanced cables for
AES/EBU interconnects, and 75 ohm coaxial unbalanced interconnects for
S/P-DIF interconnects. The use of the proper cable and the proper
terminating connectors cannot be overemphasised. I can personally testify
(having, in fact, looked at the interconnections between many different
kinds of pro and consumer digital equipment) that ordinary microphone or
RCA audio interconnects DO NOT WORK. It's not that the results sound
subtly different; it's that much of the time the receiving equipment
is simply unable to decode the resulting output, and shuts down."
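Both bandwidth figures quoted in this thread – the ~25 MHz at 192 kHz mentioned above, and the 5.6 MHz at 44.1 kHz in the FAQ excerpt – fall out of the same arithmetic: an AES/EBU (or S/P-DIF) frame carries 64 bits per sample period (2 subframes of 32 bits), and biphase-mark coding can produce up to two transitions per bit. A sketch:

```python
BITS_PER_FRAME = 64  # 2 subframes x 32 bits per sample period in AES/EBU

def biphase_bandwidth_mhz(sample_rate_hz: float) -> float:
    """Approximate line bandwidth: raw bit rate doubled by biphase-mark coding."""
    bit_rate = sample_rate_hz * BITS_PER_FRAME  # bits per second on the wire
    return 2.0 * bit_rate / 1e6

# 44.1 kHz -> 2.8224 Mbit/s, ~5.64 MHz after biphase coding (the FAQ's figure).
# 192 kHz  -> 12.288 Mbit/s, ~24.6 MHz (the "~25 MHz" figure quoted above).
```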