Digital Coax Cable - Does the Length Make a Difference?


Someone (I don't remember who) recently posted something here stating:
"Always get 2 meter length digital cables because I have experienced reflection
problems on short cables, resulting in smeared sound and loss of resolution."

With all due respect to the member who posted this, I'm not trying to start a
controversy; I'm just wondering if others have experienced this.  

I will be looking for a Digital Coax cable soon to run from my Node2i to a DAC.
I really only need 1/2 meter. Not sure if a short cable like this is a problem or 
just a case of Audio Nervosa.  

ericsch
@ericsch  We want to avoid reflections from the end of the cable.  These reflections appear at characteristic impedance boundaries (discontinuities).  Characteristic impedance is set by the geometry and dielectric of the cable and is roughly equal to SQRT(L/C).
We want smooth, fast transitions of the digital signal.  A reflection can alter them by coming back and adding to the signal.  A typical transition in an average CDP is about 25-30ns, while the threshold (the point of level recognition) lies at about half of that (the mid-voltage point).  So we don't want the reflection back within 30ns/2 = 15ns.  The start of the transition launches the reflection, which has to travel to the end of the cable, reflect, and come back.  15ns is equivalent to about 3m at 5ns/m signal speed, or 1.5m each way.
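
If it helps, here's that same arithmetic as a quick Python sketch (the 5ns/m speed and 30ns transition are the round figures above, not measurements of any particular gear):

```python
# Minimum "long" cable length so the echo returns after the
# mid-voltage threshold has already been crossed.
# Figures are the round numbers from this post; real gear varies.

PROP_DELAY_NS_PER_M = 5.0   # typical coax propagation delay, ~5 ns/m
TRANSITION_NS = 30.0        # typical CDP transition (rise/fall) time

# The receiver decides the logic level around the middle of the
# transition, i.e. ~15 ns after the edge begins.
threshold_ns = TRANSITION_NS / 2

# The echo travels down the cable and back (2x the length), so it
# stays out of the way only if the cable is longer than:
min_long_m = threshold_ns / (2 * PROP_DELAY_NS_PER_M)

print(f"Echo returns later than {threshold_ns:.0f} ns "
      f"only for cables longer than {min_long_m:.2f} m")  # -> 1.50 m
```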

Now, the case for the short cable.  Reflections start to matter when the travel time is longer than 1/8 of the transition time (the cable becomes a transmission line).  In our case that would be 30ns/8 = 3.75ns, equivalent to about 0.75m.  For a 25ns transition it would be closer to 0.6m.  This distance includes the internal wiring of the transport and the receiver (DAC), so I would not go longer than a 1ft cable.  The 1/8 is a rule of thumb; strictly, you would find the fastest transition, calculate the equivalent frequency, then the wavelength, and take 1/10 of it. 
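And the short-cable side of the rule of thumb, in the same style (the 1/8 factor and 5ns/m figure are carried over from above as assumptions):

```python
# Maximum electrical length (cable plus internal wiring) for which
# the run still behaves as a lumped connection rather than a
# transmission line: one-way travel time < transition time / 8.

PROP_DELAY_NS_PER_M = 5.0

for transition_ns in (30.0, 25.0):
    max_travel_ns = transition_ns / 8
    max_len_m = max_travel_ns / PROP_DELAY_NS_PER_M
    print(f"{transition_ns:.0f} ns transition -> "
          f"keep total length under ~{max_len_m:.2f} m")
# 30 ns -> ~0.75 m; 25 ns -> ~0.63 m, hence the ~1 ft recommendation
```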
Digital cables carry signals with very fast rise and fall times. To keep reflections from interfering with the signal transmission, they are spec'd with a fixed characteristic impedance (75 ohms in the case of S/PDIF) driving a load with the same impedance. Any variation in the impedance along the way will cause a signal reflection. 

If everything were ideal, the cable length wouldn't matter. The problem is that there are minor impedance variations through the connectors, PC-board traces, etc. at the ends of the connection. These cause some reflection, although in a well-implemented system it is fairly minimal. 
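For a sense of scale, the fraction of an edge that reflects at an impedance step is the standard transmission-line reflection coefficient, (ZL - Z0)/(ZL + Z0). A small sketch - the mismatch values are made up purely for illustration:

```python
# Reflection coefficient at an impedance discontinuity:
# gamma = (Z_load - Z_line) / (Z_load + Z_line)
# 0 means no reflection; +/-1 means total reflection.

def reflection_coefficient(z_line_ohms: float, z_load_ohms: float) -> float:
    return (z_load_ohms - z_line_ohms) / (z_load_ohms + z_line_ohms)

Z0 = 75.0  # S/PDIF characteristic impedance
for z_load in (75.0, 80.0, 90.0, 110.0):  # hypothetical mismatches
    gamma = reflection_coefficient(Z0, z_load)
    print(f"75 ohm line into {z_load:5.1f} ohm load: "
          f"{abs(gamma) * 100:4.1f}% of the edge reflects")
```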

It takes about 5ns for the signal to travel through a 1m cable, so a reflection off the destination propagates back to the source end and then returns to the destination 15ns after the edge was launched, since it travels through three lengths of cable. 

As I understand it, S/PDIF rise/fall times are typically in the 20ns range, which means that with a 1m cable, the reflection from the start of the square wave arrives around the same time the primary signal is passing through the threshold where the signal is sampled. This can perturb the signal and interfere with the timing - essentially adding jitter. 

If you use a really short cable, the reflection reaches the destination early enough in the rise (or fall) of the signal to not interfere with the timing. Likewise, if the cable is long enough, the reflection will arrive after the signal has passed through the threshold voltage so it won't cause a problem. 
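
Putting those numbers together for a few cable lengths (the 20ns rise time and 5ns/m figure are from above; the verdict bands are rough, just to illustrate the zones):

```python
# How far behind the original edge does the first re-reflection land,
# and where does that fall relative to the threshold crossing?
# Round numbers from this post; the verdict bands are illustrative.

PROP_DELAY_NS_PER_M = 5.0
RISE_NS = 20.0                 # typical S/PDIF rise/fall time
THRESHOLD_NS = RISE_NS / 2     # threshold crossed ~mid-rise (10 ns in)

for length_m in (0.3, 0.5, 1.0, 1.5, 2.0):
    # destination -> source -> destination adds two cable lengths,
    # so the echo lags the directly arriving edge by:
    echo_lag_ns = 2 * length_m * PROP_DELAY_NS_PER_M
    if echo_lag_ns > THRESHOLD_NS:
        verdict = "after the threshold crossing - timing unaffected"
    elif echo_lag_ns < 0.4 * THRESHOLD_NS:
        verdict = "early in the rise - little effect on timing"
    else:
        verdict = "near the threshold crossing - worst case (jitter)"
    print(f"{length_m:.1f} m: echo lags edge by {echo_lag_ns:4.1f} ns "
          f"-> {verdict}")
```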

This is why it's recommended to use either an extremely short cable (such as you'd get inside a CD player) or a cable long enough that the reflections don't interfere. The general guidance I've read is that a cable of 1.5m to 2m will avoid this issue almost all the time.