USB sucks


USB really isn't the right connection between DAC and server: depending on the cables used, you get very different sound quality, if the server manages to recognise the DAC at all. Some time ago I replaced my highly tuned Mac Mini (by the now-defunct Mach2mini, running Puremusic via USB) with an Innuos Zenith Mk3. For starters I couldn't get the DAC (an Antelope Zodiac Gold) and the server to recognise each other: transmission from the server under USB 2.0 wasn't possible because the server is Linux-based (mind, both allegedly support the USB 2.0 standard), and when I finally got them talking to each other (by using Artisansilvercables, pure silver) the sound quality was ho-hum. While I understand the conceptual attraction of having the master clock near the converter under asynchronous USB, the connection's vagaries (the need for exactly 90 ohms impedance, proneness to RF interference, the need to properly shield the 5V power line, short cable runs) make one wonder whether one wouldn't do better to update I2S or S/PDIF, or at the higher end use AES/EBU. After more than 20 years of digital playback, such a wide variety of outcomes from minor changes seems unacceptable.
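To put a rough number on the 90-ohm point, the standard transmission-line formula tells you what fraction of the signal reflects back when a cable misses the spec. The 75-ohm and 105-ohm figures below are just assumed examples of off-spec cables, not measurements of anything in my system; this is only a sketch.

# Quick back-of-envelope: how much signal reflects when a USB cable's
# differential impedance misses the 90-ohm spec. Standard transmission-line
# formula; the off-spec impedances are assumed examples, not measured cables.

def reflection_coefficient(z_cable, z_spec=90.0):
    """Fraction of the incident wave reflected at an impedance step."""
    return (z_cable - z_spec) / (z_cable + z_spec)

for z in (75, 90, 105):
    gamma = reflection_coefficient(z)
    print(f"{z:>3} ohm cable: {abs(gamma):.1%} of the signal reflected")
# 75 ohm -> ~9.1%, 90 ohm -> 0%, 105 ohm -> ~7.7%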

Since then, and after a lot of playing around, I have replaced the silver cables with Uptone USPCB rigid connectors and inserted an Intona Isolator 2.0 and a Schiit EITR converting USB to S/PDIF. The connection to the DAC is via an Acoustic Revive DSIX powered by a Kingrex LPS.

The amount of back and forth needed to make all this work is mind-boggling. The dependence on the choice of USB cable (with or without a separate 5V conductor, short, thick and God knows what else) is hard to believe for something called a standard interface, and the differences in sound quality make any review of USB products arbitrary, verging on meaningless.

Obviously S/PDIF gives you no native DSD and caps the PCM rates but, hey, most recordings are still Redbook anyway.
Conversely, it is plug and play (although the quality of the cable still matters), and it finally got me the sound quality I was looking for. It may not be the future, but neither should USB be, given all its shortcomings. Why is the industry promoting a standard that clearly isn't fit for purpose?

Finally, I invite the bits-are-bits naysayers to go on a similar journey; it just might prove educational.
antigrunge2
I was writing about the infinite resolution statement. While voltage levels from, say, 0 to 2 volts may have infinitely many possible values (I'm not even sure of this, given the quantum nature of the universe), reading those infinitely small changes (resolution) can't be done because of external noise entering the wire. Even background cosmic radiation will limit the measurable differences.
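A back-of-the-envelope sketch of that point, assuming a 100-ohm source impedance, room temperature and a 20 kHz audio bandwidth (illustrative assumptions, not measurements of any particular gear): thermal noise alone already caps the resolvable levels at roughly 23 bits' worth over a 2 V span.

# Johnson (thermal) noise limit on resolvable voltage steps.
# All component values below are assumed for illustration only.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T   = 300.0          # kelvin, roughly room temperature
R   = 100.0          # ohms, assumed source impedance
B   = 20_000.0       # Hz, audio bandwidth

v_noise = math.sqrt(4 * k_B * T * R * B)   # RMS thermal noise voltage
full_scale = 2.0                           # volts, per the 0-to-2 V example
levels = full_scale / v_noise
print(f"noise floor ~{v_noise*1e6:.2f} uV, ~{levels:.2e} distinguishable levels")
print(f"equivalent to ~{math.log2(levels):.1f} bits")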
There’s at least one clown on this thread who has no business saying anything about audio. You figure out who. He goes by theory and not his ears, if he's ever really listened at all.

My experience...
Cables make a huge difference for sure.
External clockers make a huge difference.

We’re not talking about hooking up a USB printer.

Jitter is the plague of digital audio. It can come from transports and servers. There are devices that go in the audio path to clean it up.
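For anyone who wants a number on that: the textbook limit on SNR from sampling-clock jitter on a full-scale sine is SNR = -20*log10(2*pi*f*t_jitter). A quick sketch with assumed jitter figures, not measurements of any specific transport or server:

# Best-case SNR imposed by sampling-clock jitter on a full-scale sine.
# The jitter values are assumed examples to show the scale of the effect.
import math

def jitter_snr_db(freq_hz, jitter_s):
    """Jitter-limited SNR (dB) for a full-scale sine at freq_hz, RMS jitter in seconds."""
    return -20 * math.log10(2 * math.pi * freq_hz * jitter_s)

for jitter in (1e-9, 100e-12, 10e-12):   # 1 ns, 100 ps, 10 ps RMS
    print(f"{jitter*1e12:>6.0f} ps jitter -> {jitter_snr_db(20_000, jitter):.1f} dB max SNR at 20 kHz")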

How do I know all this? I use my ears.




USB for a DAC??? Never.
I guess you realize that dogmatic statements like this, with no context and nothing to back them up, are basically worthless and add nothing to the conversation?
For the most part, but people like what they like and are free to do so; nothing wrong with that.

This thread has basically dissolved itself. To say "X" interface is __ means nothing without context and shows a typical follower knee-jerk reaction to what others say. This hobby has sooo much of this pile-on mentality.

Although it isn't (and shouldn't be) a requirement to have an engineering degree to enjoy or understand every piece of gear, if one chooses to get a bit more into the technical side of things (short of being a plug-and-play person, which is totally fine), it behooves the individual to research AND experiment to get a better understanding of what's going on and to determine what best fits THEIR environment/setup.

Every product is different and usually has one or two interfaces that work best given the design intent (many have significant differences); it's not really that hard to understand or accept.
Far from having 'dissolved itself', I sincerely hope that this thread might lead serious designers to reconsider whether, rather than using a low-end, convenience consumer interface with all its known foibles to transmit high-quality audio, one might usefully revisit more appropriate formats (optical, I2S, AES/EBU) to improve on what is, at best, an unacceptably wide range of outcomes with USB. I also note with a degree of puzzlement that members of the 'bits are bits' school of sitting on your ears are alive and well.
There are two reliable interfaces capable of transmitting DSD: Ethernet and USB. If you're not interested in DSD, then use optical, coax, or AES3. I2S was never intended for transfer between devices, only between chips on a board, no more than an inch or two; there are a few DACs and streamers that have I2S, you just need to make sure that what you're connecting is compatible. As far as jitter goes, AES3 and coax will have more jitter than USB in well-designed DACs. It's just the nature of the design: having the clock decided by the host, as with coax and AES3, causes more jitter than asynchronous transfer like USB. There's no puzzlement or spooky things at a distance. Bits are bits: whether those bits are transferred by Ethernet, USB, optical, coax or AES3, it's the same bits.
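For what it's worth, the usual way DSD rides over those PCM-shaped links is DoP (DSD over PCM): 16 DSD bits per channel go into each 24-bit frame, with an alternating 0x05/0xFA marker byte so the DAC knows it isn't ordinary PCM. A minimal sketch of the framing for one channel, with made-up data just to show the mechanics:

# Minimal sketch of DoP (DSD-over-PCM) framing, the trick that lets
# PCM-shaped links carry DSD. Assumes a raw DSD bitstream for one channel,
# MSB-first; the input bytes here are illustrative only.

DOP_MARKERS = (0x05, 0xFA)  # alternate per frame so the DAC can detect DoP

def pack_dop(dsd_bytes):
    """Pack pairs of DSD bytes (16 DSD bits) into 24-bit PCM-sized words."""
    frames = []
    for i in range(0, len(dsd_bytes) - 1, 2):
        marker = DOP_MARKERS[(i // 2) % 2]
        word = (marker << 16) | (dsd_bytes[i] << 8) | dsd_bytes[i + 1]
        frames.append(word)
    return frames

example = pack_dop(bytes(range(8)))
print([hex(w) for w in example])   # e.g. ['0x50001', '0xfa0203', ...]

At DSD64's 2.8224 MHz bit rate, 16 bits per frame works out to a 176.4 kHz PCM frame rate, which is why DoP needs roughly the bandwidth of high-rate PCM and why USB and Ethernet handle it so comfortably.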