Digital XLR vs. Analog XLR - Balanced Cables


What is the difference between a digital XLR/balanced cable and an analog XLR/balanced cable?

What if I used an analog XLR/balanced cable to carry a digital signal from the digital output of one device to the digital input of another device?

Any risks, damage, etc.?
ckoffend
I do not know what doesn't make sense to you. The frequency of the transmitted signal has nothing to do with transmission-line effects, no matter what multiplier you put on top of it. What matters is the fastest edge present. You can transmit a 10 Hz square wave and still have transmission-line effects. A rise time of about 25 ns is very common for output drivers (most of them), and some are on the order of 10 ns.

A common test of whether a line is NOT a transmission line, found in many publications, is tr > 6t: the rise time tr must exceed six times the one-way propagation delay t. By that test, your 50 ft cable (analog or digital) is a transmission line - a very bad one - for a typical output driver. And no, reflections do not round the edges; they create complete havoc in the form of overshoot, ringing, staircases, etc. If you believe that cables make no difference, just say so, and don't bring pseudo-engineering/scientific arguments here, because somebody will always call you on it. As for the IEEE - I don't read their journal, but I have been to their meetings and am not eager to go back.
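To put numbers on that rule of thumb, here is a minimal Python sketch of the tr > 6t test for the 50 ft case; the 66% velocity factor is an assumed typical value for cable dielectric, not a measured figure for any particular cable.

```python
# Rule-of-thumb check: a line can be treated as lumped (NOT a
# transmission line) when the driver's rise time tr exceeds roughly
# six times the cable's one-way propagation delay t.

C = 3.0e8               # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.66  # assumed typical value for audio cable dielectric

def is_lumped(cable_ft: float, rise_time_ns: float) -> bool:
    """Return True if the cable can be treated as a lumped element."""
    length_m = cable_ft * 0.3048
    delay_ns = length_m / (C * VELOCITY_FACTOR) * 1e9   # one-way delay
    print(f"{cable_ft} ft: one-way delay = {delay_ns:.0f} ns, "
          f"6t = {6 * delay_ns:.0f} ns, tr = {rise_time_ns} ns")
    return rise_time_ns > 6 * delay_ns

# A 50 ft run driven by a typical 25 ns edge: delay ~ 77 ns, so
# 6t ~ 462 ns >> 25 ns, and the cable IS a transmission line.
print("lumped?", is_lumped(50, 25))
```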
Ckoffend - Maybe this whole thing is too technical, but it is important to understand that a 192 kHz signal is really transmitted at roughly 25 MHz (the arithmetic is sketched after the quoted article below). Maybe part of the article below will explain better why digital cables exist. I don't want to engage further in the discussion here since it's becoming counterproductive, and I'm signing off.

"This article is from the Audio Professional FAQ, by with numerous
contributions by Gabe M. Wiener.

5.8 - What kind of AES/EBU or S/P-DIF cables should I use? How long
can I run them?

The best, quick answer is what cables you should NOT use!

Even though AES/EBU cables look like ordinary microphone cables, and
S/P-DIF cables look like ordinary RCA interconnects, they are very
different.

Unlike microphone and audio-frequency interconnect cables, which are
designed to handle signals in the normal audio bandwidth (let's say that
goes as high as 50 kHz or more to be safe), the cables used for digital
interconnects must handle a much wider bandwidth. At 44.1 kHz, the digital
protocols are sending data at the rate of 2.8 million bits per second,
resulting in a bandwidth (because of the biphase encoding method)
of 5.6 MHz.

This is no longer audio, but falls in the realm of bandwidths used by
video. Now, considerations such as cable impedance and termination become
very important, factors that have little or no effect below 50 kHz.

The interface requirements call for the use of 110 ohm balanced cables for
AES/EBU interconnects, and 75 ohm coaxial unbalanced interconnects for
S/P-DIF interconnects. The use of the proper cable and the proper
terminating connectors cannot be overemphasised. I can personally testify
(having, in fact, looked at the interconnections between many different
kinds of pro and consumer digital equipment) that ordinary microphone or
RCA audio interconnects DO NOT WORK. It's not that the results sound
subtly different; it's that much of the time the receiving equipment
is simply unable to decode the resulting output, and simply shuts
down."
The last explanation talks about slew rate and bandwidth but ignores the relationship between the two. A little background in Fourier theory may clear up the lack of understanding of the relationship between signal shape, frequency, and slew rate.
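To illustrate that relationship numerically: a square wave is a sum of odd harmonics, and the slew rate of the reconstructed edge is set by how many of those harmonics the path passes, so bandwidth and slew rate are two views of the same thing. A rough NumPy sketch (the 1 kHz fundamental is an arbitrary example value):

```python
import numpy as np

# A unit square wave is (4/pi) * sum over odd k of sin(2*pi*k*f0*t)/k.
# Truncating the sum at the channel's bandwidth limits the maximum
# slope (slew rate) of the edges.

f0 = 1_000.0                                  # fundamental, Hz (arbitrary)
t = np.linspace(0.0, 1.0 / f0, 20_000, endpoint=False)
dt = t[1] - t[0]

def bandlimited_square(n_odd: int) -> np.ndarray:
    """Partial Fourier sum using the first n_odd odd harmonics."""
    k = np.arange(1, 2 * n_odd, 2)            # 1, 3, 5, ...
    terms = np.sin(2.0 * np.pi * np.outer(k, f0 * t)) / k[:, None]
    return (4.0 / np.pi) * terms.sum(axis=0)

for n in (5, 50, 500):
    x = bandlimited_square(n)
    slew = np.abs(np.diff(x)).max() / dt      # max slope of a 1 V wave, V/s
    print(f"{n:3d} odd harmonics (BW ~ {(2 * n - 1) * f0 / 1e3:.0f} kHz): "
          f"max slew ~ {slew:.1e} V/s")
```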

Your home audio equipment is not going to miss pulses because of the changes introduced by sending a pulse with fast rise and fall times through a path with the bandwidth typically available in analog interconnects.

As to T-line effects, the explanation misses the forest for the trees. The ultimate problem caused by standing waves is the rounding of the pulses.
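The size of any reflection follows directly from the impedance mismatch, which is why termination matters at these bandwidths. A quick sketch of the standard reflection-coefficient formula; the 45 ohm figure for a generic microphone cable is an assumed illustrative value, since analog cables don't specify a controlled impedance at RF.

```python
# Reflection at a termination: gamma = (ZL - Z0) / (ZL + Z0).
# gamma = 0 means a matched line: no reflection, no standing waves.

def reflection_coefficient(z_load_ohm: float, z_cable_ohm: float) -> float:
    return (z_load_ohm - z_cable_ohm) / (z_load_ohm + z_cable_ohm)

# 110-ohm AES/EBU receiver fed by a proper 110-ohm digital cable:
print(reflection_coefficient(110, 110))            # 0.0 -> matched

# Same receiver fed by a generic mic cable (assumed ~45 ohms at RF):
print(f"{reflection_coefficient(110, 45):.2f}")    # ~0.42 of the wave reflects
```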

The text proffered in the last few posts was simply lifted from elsewhere and offered as an explanation. But it is out of context and inapplicable to the discussion at hand, which is whether digital vs. analog ICs make any difference as home audio interconnects. If the person who initially posted the question is using the cables to transfer audio data in a typical fashion, e.g., from a CD transport to an outboard DAC, he doesn't need a 110 ohm interconnect to do so - which was my original statement.

What is amazing is that these posts from Kijanki started out with a statement that bandwidth makes no difference when it comes to jitter, yet now offer quotes that stress the importance of bandwidth. The reason for this contradiction appears to be the lack of a firm grounding in the meaning of the terms and effects discussed, i.e., slew rate, frequency, bandwidth, and T-line effects.
Ok... So, in a pistachio shell: say you have the option to use either an AES/EBU 110 ohm digital cable out from a USB-to-SPDIF converter into a WYRED4SOUND DAC2 AES/EBU input, or a 75 ohm SPDIF digital interconnect from the same USB-to-SPDIF converter into the same W4S DAC2. I keep hearing two tales: 1) that there's no difference, since both are digital; and 2) that there is a difference, the claim being that the AES/EBU connection is superior.
Any opinion or truth is truly appreciated.

Thanks.
I've used an analog XLR in place of a digital XLR temporarily several times with no problems.