Why do digital coax cables matter?


Could somebody please explain this to me? Why does a $100 cable sound better than a $50 cable? Why is silver coax better than copper coax? Why does the quality of connectors matter in the digital realm?

I currently need a cable for a Stello U3. Some people claim there are no discernible differences between different levels of coax cables. They say the only things that matter are impedance, cable length, and adequate shielding.
robertsong
Many of us simply think, "Hey, digital is nothing more than ones and zeros, so as long as those ones and zeros get to their destination without changing state or being corrupted, a perfect transmission will occur and no sound difference can possibly be heard, end of story."

As explained by a couple of responses above, there is one aspect of digital transmission that many audiophiles don't get. It is called [drum roll please]...timing.

Those ones and zeros must enter the DAC chip at exactly the right time to be converted into the proper analog waveform. If the timing is off (by mere picoseconds), the reconstructed waveform will still be there, but not in exactly the shape it should be. And that is where much of the sound difference between cables comes into play (and between CD transports, etc., for that matter).
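
To put a rough number on how much timing matters, here is a quick back-of-the-envelope Python sketch using the standard formula for the jitter-limited SNR of a sampled sine wave, SNR(dB) = -20 * log10(2 * pi * f * tj). The jitter figures are assumed examples, not measurements of any cable:

import math

# Jitter-limited SNR of a sampled sine wave: SNR(dB) = -20*log10(2*pi*f*tj)
# f = signal frequency in Hz, tj = rms sampling-clock jitter in seconds.
# The jitter values below are assumed examples, not measurements.
f = 20_000.0  # 20 kHz tone, roughly the worst case for audio
for tj in (100e-12, 1e-9, 10e-9):  # 100 ps, 1 ns, 10 ns rms
    snr_db = -20.0 * math.log10(2.0 * math.pi * f * tj)
    print(f"rms jitter {tj * 1e12:7.0f} ps -> SNR ceiling ~ {snr_db:5.1f} dB")

At 100 ps the jitter ceiling (~98 dB) sits right around the theoretical floor of 16-bit audio (roughly 96-98 dB), while nanoseconds of jitter start to eat into it noticeably.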

I suspect that, for various reasons, all digital cables have slightly different timing characteristics. Whether your DAC chip and associated circuitry are compatible with a particular timing characteristic determines the sound outcome. The cable's timing, and whether the DAC "likes" it, is not dependent on the cost of the cable.

Stop thinking so simplistically: digital audio is not just about the ones and zeros. That is only part of the story.
Jitter is measurable, correct?

Will a cable of some determinate length not add some measurable, repeatable, non-arbitrary amount of jitter within a particular range of measurement, regardless of any jitter coming from the source component?

Are there any cable manufacturers that measure and publish jitter specifications for each of their different cable products and cable lengths?
Seasoned got it right, but with the current state of digital technology I still wonder if, in practice, it is really that much of a problem with most modern gear. Digital technology in this regard has come a long way since the CD's debut around 30 years ago. That makes a big difference.

Of course, to whatever extent it may still be a problem in practice, audiophiles will care more about it than most normal people.

In my case, to date, I would have to say that digital cable tweaks have made the least difference of any tweak I have tried. With most others (digital and analog related) I hear a difference. With digital cables, I am still waiting. I have mostly compared optical versus coax so far. These are significantly different technologies, so I expected to hear something, but have not yet. I have also yet to hear a practical difference through the same DAC from various digital sources. I have compared several, including a Marantz DVD player, a Denon CD player, a Roku Soundbridge, and a Logitech Squeezebox. They all tend to sound similar and essentially equally good, to the point where I determined it did not matter to me.
"Will a cable of some determinate length not add some measurable, repeatable, non-arbitrary amount of jitter within a particular range of measurement, regardless of any jitter coming from the source component?"
No, absolutely not. As implied in some of the preceding posts, the amount of jitter that will result with a given cable in a given system, at the point where D/A conversion is performed within the DAC (which is where it matters), depends on a complex set of interactions between the parameters of the cable (length, impedance accuracy, shielding effectiveness, shield resistance, propagation velocity, bandwidth, etc.) and the technical characteristics of the components it is connecting (signal risetimes and falltimes, impedance accuracy, jitter rejection capability, ground-loop susceptibility, etc.).

Many of the relevant component parameters are usually unspecified, and even if they were specified in great detail, the predictability of the net result of those interactions would still be limited at best.
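
To make just one of those interactions concrete, here is a minimal Python sketch of how a slower signal edge (e.g., from a bandwidth-limited cable) converts the same receiver noise into more timing error at the threshold crossing. All numbers are assumed for illustration:

# One interaction in isolation: noise at the logic threshold becomes
# timing error in proportion to the edge's risetime (error ~ v_noise / slew).
# All values are assumed for illustration, not measurements.
V_SWING = 0.5    # nominal S/PDIF-scale signal swing, volts
V_NOISE = 0.01   # rms noise at the threshold, volts (assumed)

for risetime_ns in (5.0, 25.0, 100.0):
    slew = V_SWING / (risetime_ns * 1e-9)  # volts/second through the threshold
    jitter_ps = (V_NOISE / slew) * 1e12    # induced rms timing error, picoseconds
    print(f"risetime {risetime_ns:6.1f} ns -> ~{jitter_ps:7.1f} ps rms jitter")

A real link stacks many such mechanisms (reflections, ground noise, the receiver's own clock-recovery circuit), which is why the net result is so hard to predict from cable specifications alone.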

Regards,
-- Al
We need to look at the whole system. Cable impedance is designed to be 75 ohms to match everything else in the signal path. However, nothing is ever perfect, so it comes down to how close the actual impedance is to 75 ohms. The same goes for the connectors, the PC-board traces, etc. What matters is the point where the clock is recovered from the incoming signal: if it is not exactly 44.1 kHz, there is jitter, and it distorts the audio signal. So cable quality matters when everything else in the system closely matches 75 ohms. If by accident everything matched 70 ohms instead, that would work too, but the chance of that happening is very small.
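
To get a feel for how much a mismatch matters, here is a small Python sketch computing the voltage reflection coefficient at a junction, gamma = (Zload - Z0) / (Zload + Z0). The impedance values are illustrative only:

# Fraction of an incident edge that reflects at an impedance discontinuity.
# gamma = (Z_load - Z0) / (Z_load + Z0); values are illustrative only.
def reflection_coefficient(z_load: float, z0: float = 75.0) -> float:
    return (z_load - z0) / (z_load + z0)

for z in (75.0, 70.0, 50.0):  # nominal, slightly off, badly mismatched
    gamma = reflection_coefficient(z)
    print(f"{z:5.1f} ohm into 75 ohm: {abs(gamma) * 100:4.1f}% of the edge reflects")

A 70-ohm cable into a 75-ohm input reflects only about 3.4% of each edge, but those reflections arrive back at the receiver later and can shift the zero crossings it uses to recover the clock, which is one way mismatch turns into jitter.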

That is why professional studios use a master clock, so that only the ones and zeros are decoded from the incoming signal and not the clock information. In that case, jitter is confined to the path between the DAC and the master clock generator, and most of the cable's effect can be eliminated. A few high-end DAC manufacturers, such as Esoteric and dCS, adopt this setup for home use. It is a much more expensive solution, but it works.
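
A toy Python sketch of that reclocking idea (purely conceptual, not any manufacturer's actual design): the decoded words are buffered, and the output timing comes only from the local master clock, so the incoming timing errors never reach the DAC:

import random

# Toy model of reclocking: decode words from a jittered incoming stream,
# buffer them, and clock them out on a clean local master clock.
# Purely conceptual; numbers are assumed for illustration.
PERIOD = 1.0 / 44_100.0   # ideal sample period, seconds
words = list(range(8))    # stand-in for decoded audio words

# Arrival times smeared by source/cable jitter (5 ns rms, assumed).
arrivals = [i * PERIOD + random.gauss(0.0, 5e-9) for i in range(len(words))]

# The buffer absorbs the wander; playback instants are exact clock multiples.
playback = [i * PERIOD for i in range(len(words))]

for w, t_in, t_out in zip(words, arrivals, playback):
    print(f"word {w}: arrived {t_in * 1e6:9.4f} us, played {t_out * 1e6:9.4f} us")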

Cheers and enjoy music