USB cable hype


Can someone explain the need for expensive USB cables for short runs? The only parameter of concern is impedance. I have personally verified error-free transmission in the Gbps range regardless of cable make or model, as long as the run is short. There is no magic; it is just impedance control to minimize loss and jitter, and that is inexpensive at MHz frequencies. I will pay more for a cable that is well built. I will not pay more for hocus pocus.
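To put a number on the impedance point: for a simple coax, the characteristic impedance is set purely by geometry and dielectric, via the textbook formula Z0 = (138/√εr)·log10(D/d). A quick sketch in Python (the dimensions below are a made-up, roughly 50-ohm geometry, not any particular cable; USB 2.0 itself specifies a 90-ohm differential pair, governed by an analogous geometry-driven relation):

```python
# Characteristic impedance of a coaxial line depends only on geometry
# and dielectric: Z0 = (138 / sqrt(er)) * log10(D / d).
import math

def coax_impedance(D, d, er):
    """D: dielectric outer diameter, d: center conductor diameter
    (same units), er: relative dielectric constant."""
    return (138.0 / math.sqrt(er)) * math.log10(D / d)

# Hypothetical geometry, roughly a 50-ohm line:
z0 = coax_impedance(D=4.9, d=1.42, er=2.25)
print(round(z0, 1))  # ~49.5 ohms
```

Nothing in that formula depends on price; hold the geometry and dielectric to spec and the impedance follows.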
axle
Axle, I remember from a Stereophile test of the AE that jitter artifacts on the analog output were much worse (about 5x) than on the optical out. A USB cable that does not use the +5V supply for anything should not carry it. Also, shielding should be as good as possible.
Optical is a good choice. Locate the source far away from the DAC, use an optical cable, and you have no noise. Use an asynchronous DAC with error correction and you have no jitter and no errors.

No noise, no jitter, no errors! What more could you ask for? Too bad optical means S/PDIF. I guess there is one more thing to ask for: optical USB.
"No noise, no jitter, no errors! What more could you ask for?" While those are worthy goals in theory, I'm sorry to say that in the real world you will never completely eliminate noise, jitter, or errors. You can only minimize them through good engineering design. Sorry to be the bearer of bad news, but don't shoot the messenger!
Bill, I hear you. But I am optimistic that we are getting to that point where they won't matter.

Optical has dispersion and the O/E converter generates noise. Therefore, you are technically correct that you can't eliminate noise altogether. But optical completely isolates noise from the source, as well as EMI and RFI that are otherwise introduced through an electrical cable.

All digital devices generate their own internal jitter, including asynchronous DACs. Therefore, you are technically correct that you can't eliminate jitter altogether. But you can completely isolate source jitter.

Bit errors will always exist. But they can be corrected.
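As an illustration of how bit errors get corrected, here is a toy Hamming(7,4) code in Python: three parity bits protect four data bits, so any single flipped bit can be located and flipped back. (USB itself actually relies on CRC checks plus retransmission rather than forward error correction; this sketch just shows the principle.)

```python
# Toy Hamming(7,4) forward error correction: encode 4 data bits with
# 3 parity bits; any single-bit error in the 7-bit codeword is correctable.

def hamming74_encode(d):
    """Encode data bits [d1, d2, d3, d4] into 7 bits: p1 p2 d1 p3 d2 d3 d4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the error; 0 = clean
    if syndrome:
        c[syndrome - 1] ^= 1  # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
corrupted = hamming74_encode(data)
corrupted[3] ^= 1                          # flip one bit in transit
assert hamming74_decode(corrupted) == data  # fully recovered
```

A real link pairs detection (CRC) with retries instead, but either way the delivered bits end up exactly as sent.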

In summary, we have ways to correct bit errors 100%, to completely isolate source noise, and to render jitter irrelevant. What we don't have is all three in one design ... yet.

Of course, this is limited to the DAC. Even if the DAC signal is pristine it can be contaminated by EMI/RFI upon leaving the DAC.

But that's a different world we are talking about.