Ethernet Cables, do they make a difference?


I stream music via TIDAL, and the only cable in my system that is not an "Audiophile" cable is the one going from my Gateway to my PC; it is a CAT6 cable. Question is, do "Audiophile" Ethernet cables make any difference/improvement in sound quality?

Any and all feedback is most appreciated, especially if you noted improvements in your streaming audio SQ with a High-End Ethernet cable.

Thanks!
grm
That’s interesting. Can you please tell us more about your comparison? Was it based solely on measurements? If you conducted listening tests, can you please tell us how they were conducted? Which specific cables did you evaluate?

Nordost Heimdall 2, one meter, $699. BerkTek Hyper 5e, 98 meters, $90.

Cary Audio DMS-500

RME UFX A/D duties fed to my laptop

ARTA for measurements. Nothing in the spectral components, nothing in the 11.25 kHz jitter test, nothing in the linearity testing, nothing in the noise floor. I also did a cascade plot with my measurement mic. Nothing changed other than ambient room noise from one run to the next, which varied slightly even using the same cable on multiple runs.

I also captured the tracks and posted them elsewhere for people to download and evaluate. During playback I swapped out the cabling and simply asked people to tell me when the cabling was swapped. Only a handful tried, and all failed.
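As a side note, results like that can be sanity-checked for statistical significance with basic binomial math. A short sketch (the trial counts below are hypothetical examples, not the actual numbers from my test):

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """Probability of getting at least `correct` hits out of `trials`
    by pure guessing (binomial distribution, p = 0.5 per trial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# Hypothetical example: 12 correct out of 16 trials is unlikely to be chance
print(f"{abx_p_value(12, 16):.4f}")  # well under the usual 0.05 threshold

# Whereas 9/16 (near coin-flip performance) is consistent with guessing
print(f"{abx_p_value(9, 16):.4f}")
```

The usual convention is that a listener needs to score well above chance (e.g. 12+ correct out of 16) before a claimed audible difference is taken as demonstrated.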


Where in the playback system does this jitter you're referring to occur?

The entire point, and I will stick with Tidal as the example, is that once the local buffer is filled (and buffers are indeed static storage), any timing variance ceases to exist. It's why I can watch 4K Netflix streams with no issues.

The fact of the matter, and it is indeed fact, is that if I pull the network cable for 1 second, I've introduced 1,000,000,000 ns of jitter, but somehow the playback system has managed to deal with it, and deal with it to the point that, blinded, you couldn't tell me if your life depended on it.
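The buffering argument can be sketched in a few lines: as long as arrival jitter never drains the buffer to empty, the playback clock sees a perfectly steady stream. This is a toy model, not any particular streamer's implementation:

```python
import random

def simulate(buffer_frames=44100, arrival_jitter_s=0.5, seconds=60, seed=1):
    """Toy model: audio frames arrive in one-second bursts, each delayed by a
    random network jitter, while playback drains at a fixed 44.1 kHz.
    Returns True if the buffer never underruns, i.e. playback timing
    is completely unaffected by the arrival jitter."""
    rng = random.Random(seed)
    rate = 44100
    buffered = buffer_frames                 # start with a full buffer
    for _ in range(seconds):
        late = rng.uniform(0, arrival_jitter_s)
        buffered -= int(rate * late)         # playback drains while we wait
        if buffered < 0:
            return False                     # underrun: only now is timing audible
        buffered += rate                     # the burst arrives, refilling
        buffered -= int(rate * (1 - late))   # drain for the rest of the second
        buffered = min(buffered, buffer_frames)  # buffer capacity is finite
    return True

print(simulate())                       # True: 1 s buffer absorbs 0.5 s of jitter
print(simulate(arrival_jitter_s=1.5))   # False: jitter exceeds the buffer depth
```

The point of the toy model is that network jitter only becomes audible at the extreme of an underrun (an audible dropout); below that threshold, the output clock never sees it.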

While this timing difference may exist in the D/A converter circuits, that's not the same as Ethernet, which is bursty and asynchronous by nature.

I am referring to timing jitter at the point of D/A conversion. And I am referring to the possibility that cable differences may affect the characteristics of RF noise that may bypass (i.e., find its way **around**) the ethernet interface, buffers, etc. and **to** the circuitry that performs D/A conversion.

Regarding disconnection of the cable, and putting aside the possible significance of airborne RFI, doing so would of course work in the direction of reducing noise that may be coupled from the input circuit to the point of D/A conversion. The 1,000,000,000 ns of jitter you referred to has no relevance to that.

Putting it all very basically, responses by those claiming ethernet cables won’t make a difference nearly always focus just on the intended/nominal signal path. The basic point to my earlier posts is that in real world circuitry parasitic/unintended signal paths also exist (via grounds, power supplies, parasitic capacitances, the air, etc.), which may allow RF noise to bypass the intended signal path to some extent, and therefore may account for some or many of the reported differences.

Regards,
-- Al

I purchased a 25 ft Supra Cat 8 about 6 months ago. I installed it for a bit and did not really notice an uptick in SQ, so I took it out, as I did not feel like getting under the house to do a permanent install. I recently updated my cable loom to the Audience Au24 SX series. I reinstalled the Supra after a couple of months of breaking in the Audience and heard a very nice bump in SQ: deeper soundstage, more defined percussion, better high end. Pretty easy to hear the improvement.
So the answer is that if you have a very resolving system, an upgraded Ethernet cable will definitely help. Just proved it in my system.
I am referring to timing jitter at the point of D/A conversion. And I am referring to the possibility that cable differences may affect the characteristics of RF noise that may bypass (i.e., find its way **around**) the ethernet interface, buffers, etc. and **to** the circuitry that performs D/A conversion.

At best that's a design issue of the connected hardware, not a cabling issue, IMO, where the cable meets or exceeds spec. The reason behind my thinking is that RF noise generated by, say, an impedance mismatch in a cable is due more to length and to twisted pairs not staying in mechanical balance than to differences between typical 12-15 foot patch cables. Not to mention the horizontal run is most likely some junk CCA (copper-clad aluminum).

At worst, and if we take your interpretation, I would say from what I've seen, most of the incredibly expensive CATX cabling doesn't pass IEEE/TIA spec and introduces noise, and audiophiles don't understand that what they are enjoying is a degradation of their playback chain. That's a stretch for me, though.

Bottom line, it would be measurable, as the outputs of a DAC are voltage-output devices.

Regarding disconnection of the cable, putting aside the possible significance of airborne RFI doing so would of course work in the direction of reducing noise that may be coupled from the input circuit
Again, this is measurable. It's also why I like WiFi: it's low latency, high throughput, and no measurement (either instrumented or human) shows harmonic components of RFI frequencies showing up.

I think you just said that removing the plug from the back of the client would work in the direction of reducing noise... So with that said, I would encourage a blinded evaluation session where the Ethernet cable is removed during playback of a track and the listener is able to successfully indicate that removal or insertion.

Let's even paint a scenario where that's actually the case. That noise component is most likely going to be buried in (or be a component of) the noise floor of the DAC, in the -130 dB range on a competently designed piece of gear. You can't hear anything that low even if it's there. And if it's not, it's not.
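For scale, -130 dB works out to a minuscule voltage ratio. This is just generic dB-to-ratio math, not a measurement of any specific DAC:

```python
def db_to_ratio(db: float) -> float:
    """Convert a level in dB to a voltage (amplitude) ratio: 10^(dB/20)."""
    return 10 ** (db / 20)

# -130 dB relative to full scale is roughly a 3.2e-7 voltage ratio,
# i.e. about 0.3 microvolts per volt of full-scale output.
print(f"{db_to_ratio(-130):.2e}")
```

For comparison, 16-bit CD audio has a theoretical dynamic range of about 96 dB, so a -130 dB artifact sits far below even the quantization floor of the source material.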

Putting it all very basically, responses by those claiming ethernet cables won’t make a difference nearly always focus just on the intended/nominal signal path. The basic point to my earlier posts is that in real world circuitry parasitic signal paths also exist (via grounds, power supplies, parasitic capacitances, etc.), which may allow RF noise to bypass the intended signal path to some extent, and therefore may account for some or many of the reported differences.

Your response sounds like a guess. I have another theory for all the differences: it's sighted bias, or really poorly designed, often expensive, equipment.

I've been recommending either WiFi (Ubiquiti) or wired (Intel server PCIe) NICs; left and right, they are available NIB or as new-but-pulled for ~$25. They seem impervious to whatever cabling I've thrown at them.


@almarg   Thanks, as always for sharing your thoughts on these matters. +1 on this and your other posts in the thread.

@benzman  Thanks for sharing your findings. You make an important point. I've also had your Audience SX and the Supra CAT8 in my system and concur with your results.