USB sucks


USB really isn't the right connection between DAC and server: depending on the cables used, you get very different sound quality, if the server manages to recognise the DAC at all. Some time ago I replaced my highly tuned Mac Mini (by the now-defunct Mach2mini, running Puremusic via USB) with an Innuos Zenith Mk3. For starters I couldn't get the DAC (Antelope Zodiac Gold) and server to recognise each other; transmission from the server under USB 2.0 wasn't possible because the server is Linux based (mind, both allegedly support the USB 2.0 standard), and when I finally got them to talk to each other (by using Artisansilvercables, pure silver), the sound quality was ho-hum. While I understand the conceptual attraction of having the master clock near the converter under asynchronous USB, the connection's vagaries (need for exact 90 ohm impedance, proneness to RF interference, need to properly shield the 5V power line, short cable runs) make one wonder whether one wouldn't do better to update I2S or S/PDIF, or at the higher end use AES/EBU. After more than 20 years of digital playback, the wide variety of outcomes from minor changes seems unacceptable.

Since then, and after a lot of playing around, I have replaced the silver cables with Uptone USPCB rigid connectors and inserted an Intona Isolator 2.0 and a Schiit EITR converting USB to S/PDIF. Connection to the DAC is via an Acoustic Revive DSIX powered by a Kingrex LPS.

The amount of back and forth needed to make all this work is mind-boggling. The dependence on the choice of USB cable (with and without a separate 5V connection, short, thick and God knows what else) is hard to believe for something called a standard interface, and the differences in sound quality make any review of USB products arbitrary, verging on meaningless.

Obviously S/PDIF gives you no native DSD or high-rate PCM but, hey, most recordings are still Redbook anyway.
Conversely, it is plug and play (although the quality of the cable still matters), and it finally got me the sound quality I was looking for. It may not be the future, but neither should USB be, given all its shortcomings. Why is the industry promoting a standard that clearly isn't fit for purpose?

Finally, I invite the bits-are-bits naysayers to go on a similar journey; it just might prove educational.
antigrunge2
I never thought USB cables could make so much difference, in a good way, until I switched from a 1 m Pangea Premiere SE to a 1 ft Wireworld Silver Starlight 7. Aurender N100 - PSA Direct Stream.
A lot of misinformation is being spun here. I think the title of the thread pretty much sums up the irrationality, insofar as it is a sweeping generalization. If I said McIntosh sucks, I suppose it would be due to my personal experience, which is certainly valid, as is the OP's, umm, OP.

Certainly, if I were struggling with a certain technology with my gear it would be frustrating; I totally get it. However, USB has been around long enough to mature, which is very evident, and in many cases it has surpassed most other interfaces. That's not to say it doesn't have issues. I think one has to approach such statements with some maturity and understanding of the technology. There isn't a single connection type that someone, at some point, hasn't had an issue with on their gear.

If the designer and the implicit design intent are focused entirely on USB, or optical, or S/PDIF, or Ethernet, etc., chances are THAT interface will excel above any other in that particular product. It's actually very apparent in numerous products, and nowadays USB pretty much dominates most product interfaces.

Creating a really GOOD Ethernet interface in a DAC is very expensive. The same goes for pretty much all other types (USB, I2S, optical, etc.). Sure, you can load up a component with everything under the sun, but it's THE IMPLEMENTATION of said interface that matters. This has been stated ad nauseam throughout most forums. My .02



@steakster - I think what @rixthetrick was getting at is that any signal on a cable is inherently analog. While it represents digital data, the signal does not instantaneously transition from one voltage to another to represent 0s and 1s. The interface and cable must maintain good signal integrity to properly convey the digital data.
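To make that concrete, here is a rough sketch in Python (purely illustrative, with made-up voltage and rise-time numbers): the wire carries a continuously varying voltage, and the receiver only turns it back into a 0 or 1 by comparing that voltage against a threshold.

```python
# Illustrative only: a "digital" 0->1 edge on a cable is an analog voltage with a
# finite rise time; the receiver recovers the bit by comparing it to a threshold.
# Voltage levels and rise time below are made up for the example.
import math

def edge_voltage(t, v_high=3.3, rise_tau=2e-9):
    """Voltage of a 0->1 transition modelled as a simple RC-style exponential."""
    return v_high * (1.0 - math.exp(-t / rise_tau))

threshold = 1.65  # receiver decision threshold (half of v_high here)

for t_ns in (0.5, 1, 2, 4, 8):
    v = edge_voltage(t_ns * 1e-9)
    bit = 1 if v > threshold else 0
    print(f"t = {t_ns:4.1f} ns  v = {v:4.2f} V  received bit = {bit}")
```

Early in the transition the receiver still reads a 0; only once the (analog) voltage has climbed past the threshold does the 1 appear. Degrade the edge enough (long cable, poor shielding, reflections) and the receiver starts making wrong decisions.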

However, USB is a fairly robust interface. It's designed to transfer digital data reliably in very low-cost implementations. It can start to have problems with long cable lengths, but within reasonable limits it does a great job of reliably transferring data and can easily handle the requirements of high-resolution audio.
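A quick back-of-the-envelope check of that last point (the 480 Mbit/s figure is the raw USB 2.0 high-speed signalling rate; usable throughput is lower, but the margin is still enormous):

```python
# Even high-resolution stereo PCM needs only a small fraction of USB 2.0's bandwidth.
sample_rate_hz = 192_000      # 192 kHz
bits_per_sample = 24
channels = 2

audio_bits_per_s = sample_rate_hz * bits_per_sample * channels
usb2_raw_bits_per_s = 480_000_000   # USB 2.0 high-speed raw signalling rate

print(f"24/192 stereo PCM: {audio_bits_per_s / 1e6:.1f} Mbit/s")
print(f"USB 2.0 raw rate:  {usb2_raw_bits_per_s / 1e6:.0f} Mbit/s")
print(f"Audio uses ~{100 * audio_bits_per_s / usb2_raw_bits_per_s:.1f}% of the raw rate")
```

That works out to roughly 9 Mbit/s of audio against hundreds of Mbit/s of link capacity.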

As has been pointed out, it is not optimized for minimum noise transfer between devices. It’s designed to be a reliable, inexpensive interface between digital devices. So care must be taken in the design and implementation of the server/streamer, interface cable, and DAC to minimize the effects of any electrical noise generated by the source or picked up along the way.

That doesn’t make it a bad interface. In most regards, all other digital audio interfaces have the exact same issues. It’s true that USB carries a power connection, but this can easily be ignored/dealt with by the DAC. The big advantage of USB (and Ethernet) over older digital audio interfaces (spdif/optical/AES3) is that they are asynchronous and therefore won’t introduce audio sample jitter into the mix.

Sure, you can put a lot of engineering effort and cost into the digital source to reduce jitter on these older interfaces (spdif, etc.), but the interfaces themselves make it impossible to achieve results as good as the same effort/cost applied in the DAC itself, where clocks and transmission-line impedances are much easier to control.
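To give a feel for why the clock at the point of conversion matters, here is a rough, idealized sketch (made-up jitter values, simple Gaussian timing error; real DAC behaviour is more complicated) showing the error added when a full-scale 10 kHz sine is reconstructed with a jittery sample clock instead of an ideal one:

```python
# Rough, idealized model: how much error does sample-clock jitter add to a
# full-scale 10 kHz sine sampled at 192 kHz? Jitter values are made up to show the trend.
import math
import random

def rms_error_for_jitter(jitter_rms_s, freq_hz=10_000, fs_hz=192_000, n=20_000, seed=0):
    rng = random.Random(seed)
    err_sq = 0.0
    for k in range(n):
        t_ideal = k / fs_hz
        t_actual = t_ideal + rng.gauss(0.0, jitter_rms_s)   # jittered sampling instant
        err = math.sin(2 * math.pi * freq_hz * t_actual) - math.sin(2 * math.pi * freq_hz * t_ideal)
        err_sq += err * err
    return math.sqrt(err_sq / n)

for jitter_ps in (10, 100, 1_000, 10_000):
    e = rms_error_for_jitter(jitter_ps * 1e-12)
    # error relative to a full-scale (amplitude 1) sine, expressed in dB
    print(f"{jitter_ps:6d} ps RMS jitter -> error ≈ {20 * math.log10(e):6.1f} dBFS")
```

The error scales with the jitter, which is why keeping the master clock inside the DAC (as asynchronous USB allows) is easier than trying to recover a clean clock from an S/PDIF stream.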

There is certainly value in reducing the noise that is conveyed on the USB interface, since this just makes the DAC's job easier. Using a USB source device that has a good low-noise power supply and using a good cable can often help. This doesn't (or certainly shouldn't) have any effect on the actual data that arrives at the DAC, but it can reduce the amount of electrical noise that the DAC has to deal with.