USB cable hype


Can someone explain the need for expensive USB cables for short runs? The only parameter of concern is impedance. I have personally verified error-free transmission in the Gbps range regardless of cable make/model as long as the cable length is short. There is no magic. It is just impedance control to minimize loss and jitter, and that is inexpensive in the MHz range. I will pay more for a cable that is well built. I will not pay more for hocus pocus.
axle
My OP asked about need, not preference, for expensive USB cables. I could have phrased that more clearly. Need is about functionality. Preference is about whatever floats your boat, including SQ, looks, durability, and reliability. But I am glad that we are talking about both, because if it comes down to bit errors (and I think it does), what I first viewed as a preference may actually be a need. I can elaborate on this by posing a question and then answering it.

If bits are either 1s or 0s, how can USB cables provide a gray scale in SQ?

Short answer:

Bit errors due to insufficient margins.

Long answer:

I believe digital audio has the misfortune of being a real-time application that uses designs intended for non-real-time applications. In a real-time application there is one chance to get it right, so large margins are specified for all component designs in order to prevent bit errors. Typical USB applications (data transfer) are not real-time. They do not require large margins because bit errors are acceptable: you have the luxury of either retransmitting until the received packet is error-free or applying forward error correction to fix the errors. We do not have that luxury in audio. The music will still flow with bit errors, but those bit errors distort the analog waveform (lower SQ).
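To make the "no second chance" point concrete, here is a toy Python sketch of the two transfer styles. This is my own illustration, not actual USB stack code; the channel model and function names are made up.

    import random

    def noisy_channel(packet, bit_error_rate=1e-6):
        """Toy link that flips each bit with a small probability."""
        out = bytearray(packet)
        clean = True
        for i in range(len(out)):
            for b in range(8):
                if random.random() < bit_error_rate:
                    out[i] ^= (1 << b)
                    clean = False
        return bytes(out), clean          # 'clean' stands in for a CRC check

    def bulk_transfer(packet, channel, max_retries=16):
        """Non-real-time transfer (file copy): retransmit until the packet
        arrives intact, so bit errors never reach the application."""
        for _ in range(max_retries):
            received, ok = channel(packet)
            if ok:
                return received           # bit-perfect, however long it took
        raise IOError("link too poor even with retries")

    def isochronous_frame(packet, channel):
        """Real-time audio frame: one shot per service interval.
        Whatever arrives, errors and all, is what gets converted."""
        received, _ = channel(packet)
        return received                   # no retry, no second chance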

Of course, isolation is also an issue. Al has an excellent post above that discusses isolation. But in that case the cable is a bandage for the problem, not the problem itself. So, while I fully agree with you, Al, I will leave that for another time.

My guess is that one USB cable may provide better SQ than another if it has better analog properties that keep the eye open and so transmits fewer errors. Having said that, what is good enough? Is it the cable that costs another $50, or $500? I think the cable industry intentionally does us a disservice by keeping us in the dark in order to sell into that ignorance. The test would be simple: pick a nominal setup and measure data rate vs. bit error rate. Then pick an acceptable error rate (say your typical song is 4 minutes, so you require fewer than one bit error per 5 minutes of playback) and select the least expensive cable that stays under it.
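For what it's worth, here is that criterion turned into numbers. The assumptions are mine: a plain 16-bit / 44.1 kHz stereo PCM payload and no allowance for USB framing overhead, so treat it as a back-of-the-envelope sketch.

    SAMPLE_RATE_HZ = 44_100
    BITS_PER_SAMPLE = 16
    CHANNELS = 2
    WINDOW_S = 5 * 60                    # "fewer than one error per 5 minutes"

    payload_bps = SAMPLE_RATE_HZ * BITS_PER_SAMPLE * CHANNELS   # 1,411,200 bit/s
    bits_per_window = payload_bps * WINDOW_S                    # ~4.2e8 bits
    max_ber = 1 / bits_per_window                               # ~2.4e-9

    print(f"Payload rate   : {payload_bps / 1e6:.3f} Mbit/s")
    print(f"Bits per 5 min : {bits_per_window:.2e}")
    print(f"Required BER   : better than {max_ber:.1e}")

In other words, on this hypothetical setup a cable only has to deliver a BER somewhere below roughly 1e-9 to meet the "one error per song" bar; whether a $50 or a $500 cable clears it is exactly what the test would reveal.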
The only topic that's more pointless and inflammatory than "do cables make a difference" is "do digital cables make a difference." You will get the same answers and rationale you always get from each camp. This is one you simply must find out for yourself.

That being said, your question came across as more of a statement.
At least my post is seeking a discussion, which has value even if you don't think so. Your post, on the other hand, has no value. Troll.
Axle, an asynchronous DAC controls the timing. Data coming from the computer is placed in a buffer. Every frame, the computer adjusts the number of samples in the frame based on a buffer under/overflow feedback signal from the DAC. The DAC takes data from the buffer and writes it into the D/A converter using its own stable internal clock. Because of that, jitter on the incoming data does not even apply here. It is possible, though, that ambient or computer electrical noise can enter the DAC through the cable. USB cables also carry power that may not be needed and can be a source of such contamination. Ethernet is pretty much the same story: data comes in packets without timing, so the cable should not matter, yet people have reported improvements when moving to better-shielded cables. I suspect the same thing takes place: the cable picks up ambient electrical noise and injects it into the DAC, affecting the internal clock and thus jitter. Jitter converts to noise in the frequency domain.
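A rough sketch of that feedback loop, as I understand it (my own toy model in Python, not the actual USB Audio Class implementation):

    from collections import deque

    class AsyncDacModel:
        """Toy asynchronous USB audio endpoint: the DAC drains its buffer on
        its own clock and asks the host for more or fewer samples per frame."""

        def __init__(self, nominal_frame=44, low_mark=2000, high_mark=6000):
            self.buffer = deque()
            self.nominal_frame = nominal_frame    # ~44 samples per 1 ms frame at 44.1 kHz
            self.low_mark, self.high_mark = low_mark, high_mark

        def feedback(self):
            """Tell the host how many samples to send in the next frame,
            nudging the rate to keep the buffer near the middle."""
            fill = len(self.buffer)
            if fill < self.low_mark:
                return self.nominal_frame + 1     # buffer draining -> ask for more
            if fill > self.high_mark:
                return self.nominal_frame - 1     # buffer filling -> ask for less
            return self.nominal_frame

        def receive_frame(self, samples):
            """Data arrives on the host's (jittery) schedule; only the buffer cares."""
            self.buffer.extend(samples)

        def clock_tick(self):
            """Called once per sample by the DAC's own low-jitter clock;
            conversion timing never comes from the USB link."""
            return self.buffer.popleft() if self.buffer else 0

The only point of the model is that the conversion clock never sees USB timing; the cable can still matter, but only through the noise it lets in.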

I don't have a USB DAC, so my observations are only theoretical. I assumed the DAC is asynchronous. Synchronous DACs, where the computer controls the timing, are supposed to be pretty bad, since the computer clock is very jittery.
The main problem is jitter. Clock jitter is a form of signal modulation (similar to FSK) that produces sidebands. These sidebands are at a very low level but still audible, since they are not harmonically related to the root frequency. With many frequencies (music), jitter produces a lot of sidebands, resulting in noise that is proportional to signal level and hence undetectable without a signal.
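That modulation picture is easy to reproduce numerically. Below is a small numpy sketch of my own (the jitter amplitude and frequency are made-up numbers, just for illustration): a 1 kHz tone sampled on a clock whose timing wobbles sinusoidally at 300 Hz shows sidebands at 1 kHz +/- 300 Hz, unrelated to any harmonic of the tone.

    import numpy as np

    fs = 96_000                          # nominal sample rate
    n = np.arange(1 << 16)
    t_ideal = n / fs
    jitter_pk = 2e-9                     # 2 ns peak clock jitter (illustrative)
    f_jitter = 300                       # Hz, the jitter "tone"
    t_actual = t_ideal + jitter_pk * np.sin(2 * np.pi * f_jitter * t_ideal)

    tone = np.sin(2 * np.pi * 1000 * t_actual)     # 1 kHz tone, jittered sampling
    win = np.hanning(len(tone))
    spectrum = np.abs(np.fft.rfft(tone * win))
    spectrum_db = 20 * np.log10(spectrum / spectrum.max() + 1e-20)
    freqs = np.fft.rfftfreq(len(tone), 1 / fs)

    for f in (700, 1000, 1300):          # expect sidebands at 1000 +/- 300 Hz
        i = int(np.argmin(np.abs(freqs - f)))
        print(f"{f:5d} Hz : {spectrum_db[i]:7.1f} dB re carrier")

For these made-up numbers the sidebands land roughly 105 dB below the carrier, i.e. very low, yet at 700 Hz and 1300 Hz they bear no harmonic relation to the 1 kHz tone, which is why jitter artifacts can be objectionable out of proportion to their level.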
"My OP asked about need not preference for expensive USB cables. This could have been posted more clearly. Need is about functionality."

"Having said that, what is good enough? Is it the cable that costs another $50 or $500? I think the cable industry intentionally does us a disservice of keeping us in the dark in order to sell into that ignorance."

If you're just concerned about the function of USB, and things like maximum cable lengths before the signal starts to degrade, there really isn't any industry conspiracy. The USB specs are published like any other format's. Now, if a private company wants to go out and make high-end, expensive USB cables, who's to stop them? And why would anyone stop them? (I'm assuming the people making these high-end cables are not breaking any laws, such as copyright infringement or a similar offense.) So, as long as you stick to whatever the standard requires, you shouldn't have any problems from a functionality standpoint, regardless of cost. Even a cheap cable should be fine. I found this website for you to look at. It's a business that sells cables, but they go over the requirements for the different versions of USB cables (USB 1, USB 2, USB 3, etc.):

http://www.yourcablestore.com/USB-Cable-Length-Limitations-And-How-To-Break-Them_ep_42-1.html