Why does USB feature so much in discussions about DACs when the newer HDMI seems better?


I am a bit confused about the frequent mention of USB in the context of stand-alone Digital to Analog Converters (DACs).  Why is HDMI left out?  Is this a US versus Europe/Asia thing?

The Universal Serial Bus (USB) was introduced in 1996 by a group of computer manufacturers primarily to support plug-and-play for peripherals like keyboards and printers.  It has only two signal wires, plus two wires that can supply DC power.

The High-Definition Multimedia Interface (HDMI) was specifically designed by a group of consumer-electronics manufacturers for transmitting digital audio and video in many formats.  It hit the shops around 2004.  There are 19 pins supporting four shielded twisted pairs, plus seven other wires, some of which can be repurposed as a differential pair for the HDMI Ethernet Channel.

I have three universal disc players, from Sony, Panasonic and Reavon, all of which have two HDMI outputs: one can be dedicated to audio only, while the other carries video or video plus audio.  (Only the Panasonic does not support SACD.)  My Marantz AV 8802 pre-processor has 11 HDMI connections and only two USBs.

Of course, both USB and HDMI continue to evolve.  Then there is the Media Oriented Systems Transport (MOST) bus designed by the automotive industry, which looks even better.

Why is it so?

richardbrand

Here I am quoting from a respected DAC designer:

OK... see if this makes any sense to you: if clocking gets corrupted with a single channel traveling on one wire, then how would it make any sense to attempt to coordinate three separate clocks on three separate wires?

It makes no sense.

If I2S was actually better they would be using it in recording studios and they most certainly do not.

If I2S was actually better then nearly every company in the audiophile industry would be promoting it and they most certainly do not.

There is a small group of Chi-Fi manufacturers who started promoting I2S, and the audio-fools bought into it hook, line, and sinker.

If I2S sounds better in a specific DAC it is only because the other digital inputs on that DAC are lacking, not because I2S is inherently better.

The above excerpt is from page 7 of this excellent thread.
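
For anyone who has never looked at I2S itself, here is a minimal sketch (Python, with made-up sample values, not any vendor's implementation) of how one stereo frame is spread across the bit-clock, word-select and data lines.  The receiver recovers each bit purely by counting clock edges relative to the word-select transition, which is why keeping separate wires aligned over a long cable is exactly the coordination problem the designer describes.

```python
# Minimal sketch of I2S framing: three lines, bit clock (SCK), word select (WS)
# and serial data (SD).  Sample values below are made up for illustration.

def i2s_frame(left: int, right: int, bits: int = 16):
    """Yield (ws, sd) for each bit-clock period of one stereo frame.

    Standard I2S: WS = 0 for the left channel, WS = 1 for the right,
    data sent MSB first.  (The real spec also delays SD one clock after
    the WS transition; that detail is omitted here.)
    """
    for ws, sample in ((0, left), (1, right)):
        for bit in range(bits - 1, -1, -1):      # MSB first
            yield ws, (sample >> bit) & 1

# One frame of a 16-bit stereo pair
for ws, sd in i2s_frame(left=0x1234, right=0xABCD):
    print(f"WS={ws} SD={sd}")
```

Every bit on SD only means something in relation to the clock and word-select edges around it, so any skew between those wires shifts the whole frame.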

 

Really, it is puzzling to see audiophiles bicker over various types of digital connections that are either deprecated (S/PDIF/TOSLINK/coax), never meant to operate over a cable (I2S), or that can be made to work well only by jumping through a thousand hoops (USB).

By the way, why are audiophiles still stuck on USB 2.x? The 3.x revision has nine wires (versus four), enough to carry more clocks than a train station, if that's what you think will help sound quality.

Meanwhile, AoIP (Audio over IP) is a thing, and has been for years. It runs flawlessly over standard RJ-45 or, better yet, SFP. Dante and Ravenna are well-known AoIP implementations popular in pro audio. But to my limited knowledge there has been a grand total of one Ravenna-enabled audiophile DAC. Why?

 

@devinplombier - very, very few people understand how digital interfaces work. It is easier to understand S/PDIF or TOSLINK with all their jitter and whatnot, but explain USB or Ethernet... good luck. I often feel people believe there is an actual "stream" of bits between the Tidal server and their device...

@sns HDMI natively supports pure audio - have a look at my earlier post where I list what it can do.  It is a very impressive list, certainly way beyond I2S.

Now, HDMI connectors are attractive for other uses.  I have a camera system in my motorhome which supports four analogue TV cameras feeding a single display. Dometic, who makes this system, chose to use HDMI connectors to carry the analogue TV signals.  Despite there being a standard HDMI connector for in-vehicle use, Dometic uses the totally unsuitable consumer connector!  Hopeless.

I had never heard of I2S when I started this thread, but I can understand how HDMI with its 19 connection pins would provide an easily available way to transmit four signals.  But it certainly is not native HDMI.  There are tens of billions of HDMI connectors in the world today.

@devinplombier I am equally staggered! 

You did not mention the latest incarnation of USB, whose Type-C connector has 24 pins!  The connector is shared with several other technologies, including Thunderbolt!

The naming and description of this latest USB evolution (revolution?) is confusing in the extreme.

My understanding is that to use the latest features, the cable itself needs to be active; that is, it has to have embedded logic chips.  That is how it can carry HDMI, which itself can carry Ethernet.

@mikhailark Absolutely spot on!  On another forum, an audiophile claimed there could be no audible difference between CAT5 and CAT6 Ethernet cabling, because audio frequencies are much lower than Ethernet transmission rates.  Anyone who does not get the point does not understand how audio is carried over packet-switched networks.

By the way, Ethernet on its own does not guarantee that a packet will actually be delivered, nor does it guarantee how long delivery may take.  To see why, take a trip to the Hawaiian Islands and look into the ALOHA radio data system, which was the genesis of Ethernet.  The secret is in the CSMA/CD acronym: Carrier Sense Multiple Access with Collision Detection.
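
To make that concrete, here is a toy sketch (Python, using the classic 10 Mb/s numbers; purely illustrative, not how any modern switch behaves) of the truncated exponential backoff CSMA/CD uses.  After every collision the station waits a random number of slot times, and after sixteen failed attempts the frame is simply thrown away, so neither delivery nor delivery time is guaranteed.

```python
import random

# Toy sketch of half-duplex Ethernet's CSMA/CD backoff (classic 10 Mb/s numbers).
# It only illustrates the point above: the waiting time is random and, in a busy
# network, effectively unbounded, and after 16 attempts the frame is discarded.

SLOT_TIME_US = 51.2          # one slot time (512 bit times) at 10 Mb/s
MAX_ATTEMPTS = 16

def send_with_backoff(collides):
    """Return total backoff delay in microseconds, or None if the frame is dropped.

    `collides(attempt)` stands in for real channel contention and decides
    whether a given transmission attempt collides.
    """
    delay = 0.0
    for attempt in range(1, MAX_ATTEMPTS + 1):
        if not collides(attempt):
            return delay                              # frame finally got through
        k = min(attempt, 10)                          # backoff window caps at 2**10 slots
        delay += random.randint(0, 2**k - 1) * SLOT_TIME_US
    return None                                       # excessive collisions: frame lost

# Example: a congested segment where the first three attempts collide
print(send_with_backoff(lambda attempt: attempt <= 3))
```

Modern switched, full-duplex Ethernet no longer collides, but it still drops frames when buffers overflow, so the layers above it must cope with loss either way.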

The internet is also a packet-switched technology, one which evolves mainly through Requests for Comments (RFCs)!  Audio over IP (Internet Protocol) uses packetisation, whatever the physical wire arrangement.
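
As a rough illustration of what packetisation means in practice, here is a bare-bones sketch (Python; the address, port and packet size are invented, and real AoIP stacks such as Ravenna add RTP framing, PTP clock synchronisation, QoS and redundancy on top of something like this) of PCM audio being chopped into small UDP datagrams, each tagged with a sequence number and a sample timestamp:

```python
import socket
import struct

DEST = ("192.0.2.10", 5004)          # hypothetical receiver on the LAN
SAMPLES_PER_PACKET = 48              # 1 ms of 48 kHz mono, 16-bit PCM

def send_stream(pcm: bytes):
    """Chop a mono 16-bit PCM buffer into small, individually addressed datagrams."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)    # UDP: no delivery guarantee
    frame_bytes = SAMPLES_PER_PACKET * 2
    seq = 0
    for off in range(0, len(pcm), frame_bytes):
        header = struct.pack("!HI", seq & 0xFFFF, off // 2)    # sequence number + sample timestamp
        sock.sendto(header + pcm[off:off + frame_bytes], DEST)
        seq += 1
```

The receiver reorders by sequence number, schedules playback by timestamp against a synchronised clock, and mutes or conceals anything that never arrives.  None of that has anything to do with the analogue bandwidth of the cable, which is why the CAT5-versus-CAT6 "audio frequency" argument misses the point.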