Why do digital coax cables matter?


Could somebody please explain this to me? Why does a $100 cable sound better than a $50 cable? Why is silver coax better than copper coax? Why does the quality of connectors matter in the digital realm?

I'm currently in need of a cable for a Stello U3. Some people claim there are no discernible differences between different levels of coax cables. They say the only things that matter are impedance, cable length, and adequate shielding.
robertsong
let me add the obvious:

take 2 audiophiles, 1 stereo system and 2 digital cables. place them in a room and have them compare the two cables.

there is a chance that they will agree on what they are hearing but they may disagree on which they prefer.

there is a chance that they will disagree on what they hear but agree on their preferences.

there are two other obvious possibilities.

so what's the point?

there is no definitive answer to the question posed, because perception and preference differ among audiophiles.

in the empirical world, just listen and decide for yourself. it's not an original thought.
Many of us simply think that, "hey, digital is nothing more than ones and zeros, so just as long as those ones and zeros get to their destination without changing state or being corrupted, a perfect transmission will occur and no sound difference can possibly be heard, end of story."

As explained by a couple of responses above, there is one aspect of digital transmission that many audiophiles don't get. It is called [drum roll please]...timing.

Those ones and zeros must enter the DAC chip at exactly the right time to be converted into the proper analog waveform shape. If the timing is off (by mere picoseconds), the reconstructed waveform will be there, but not exactly the shape it should be. And that is where much of the sound difference between cables comes into play (and CD transports, etc., for that matter).

I suspect that, for various reasons, all digital cables have slightly different timing characteristics. Whether your DAC chip and its associated circuitry are compatible with a particular timing characteristic determines the sonic outcome. The cable's timing, and whether the DAC "likes" it, is not dependent on the cost of the cable.

Stop thinking so simplistically: digital audio is not just about the ones and zeros. That is only part of the story.
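For anyone who likes numbers, here is a minimal back-of-the-envelope sketch in Python of the timing argument above. It is not a model of any real DAC or cable; the 10 kHz tone, 192 kHz sample rate, and 1 ns RMS jitter figure are purely illustrative assumptions. It samples a sine wave once with an ideal clock and once with a jittered clock and measures how far the sample values end up from where they should be.

```python
import math
import random

# Illustrative assumptions, not measurements of any actual hardware:
FREQ = 10_000.0      # signal frequency, Hz
FS = 192_000.0       # sample rate, Hz
JITTER_RMS = 1e-9    # assumed RMS clock jitter, seconds (1 ns)
N = 20_000           # number of samples

random.seed(1)
err_sq = 0.0
for n in range(N):
    t_ideal = n / FS
    # each sample instant is displaced by Gaussian clock jitter
    t_actual = t_ideal + random.gauss(0.0, JITTER_RMS)
    err_sq += (math.sin(2 * math.pi * FREQ * t_actual)
               - math.sin(2 * math.pi * FREQ * t_ideal)) ** 2

rms_error = math.sqrt(err_sq / N)
# For a full-scale sine, theory predicts an error floor of roughly
# -20*log10(2*pi*f*sigma_t) dB relative to the signal (~84 dB here).
snr_db = 20 * math.log10((1 / math.sqrt(2)) / rms_error)
print(f"RMS sample error: {rms_error:.2e} ({snr_db:.0f} dB below the signal)")
```

With these assumed numbers the jitter-induced error sits roughly 84 dB below the signal; whether that is audible, and how much of it any given cable contributes, is exactly what the thread is arguing about.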
Jitter is measurable, correct?

Will a cable of some determinate length not add some measurable, repeatable, non-arbitrary amount of jitter within a particular range of measurement, regardless of any jitter coming from the source component?

Are there any cable manufacturers that measure and publish jitter specifications for each of their different cable products and cable lengths?
Seasoned got it right, but given the current state of digital technology, I still wonder whether in practice it is really much of a problem with most modern gear. Digital technology in this regard has come a long way since the CD's debut around 30 years ago. That makes a big difference.

Of course to whatever extent it may still be a problem in practice, audiophiles will care more about it than most normal people.

In my case, to date, I would have to say that digital cable tweaks have made the least difference of almost any tweak I have tried. With most others (digital- and analog-related) I hear a difference; with digital cables, I am still waiting. So far I have mostly compared optical versus coax specifically. These are significantly different technologies, so I expected to hear something, but have not so far. I have also yet to hear a practical difference through the same DAC from various digital sources. I have compared several, including a Marantz DVD player, a Denon CD player, a Roku SoundBridge, and a Logitech Squeezebox. They all tend to sound similar, and essentially equally good, to the point where I determined it did not matter to me.
Will a cable of some determinate length not add some measurable, repeatable, non-arbitrary amount of jitter within a particular range of measurement, regardless of any jitter coming from the source component?
No, absolutely not. As implied in some of the preceding posts, the amount of jitter that will result with a given cable in a given system, at the point where D/A conversion is performed within the DAC (which is where it matters), depends on a complex set of relationships and interactions between the parameters of the cable (length, impedance accuracy, shielding effectiveness, shield resistance, propagation velocity, bandwidth, etc.) and the technical characteristics of the components it is connecting (signal risetimes and falltimes, impedance accuracy, jitter-rejection capability, ground-loop susceptibility, etc.).

Many of the relevant component parameters are usually unspecified, and even if they were specified in great detail predictability of the net result of those interactions would still be limited at best.
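To make one of those parameters concrete: S/PDIF coax is specified as a 75 ohm system, and any impedance discontinuity along the path (cable, connector, or receiver input) reflects part of each signal edge back down the line, which is one mechanism by which impedance accuracy can affect timing at the receiver. Here is a quick sketch of the standard transmission-line reflection formula; the non-75-ohm values below are illustrative round numbers, not measurements of any particular connector.

```python
def reflection_coefficient(z_load: float, z0: float = 75.0) -> float:
    """Fraction of the incident wave amplitude reflected where a line of
    characteristic impedance z0 meets a termination of impedance z_load."""
    return (z_load - z0) / (z_load + z0)

# S/PDIF coax is nominally 75 ohms end to end; ordinary RCA connectors
# often fall well below that (the figures here are illustrative only):
for z in (75.0, 60.0, 50.0, 30.0):
    gamma = reflection_coefficient(z)
    print(f"{z:5.1f} ohm termination -> {abs(gamma):5.1%} of the edge reflected")
```

A perfect 75 ohm termination reflects nothing; a 50 ohm discontinuity reflects 20% of the edge amplitude. Whether those reflections translate into audible jitter still depends on all the component-side factors listed above.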

Regards,
-- Al