Ethernet Cables, do they make a difference?


I stream music via TIDAL, and the only cable in my system that is not an "Audiophile" cable is the one going from my Gateway to my PC; it is a CAT6 cable. The question is: do "Audiophile" Ethernet cables make any difference/improvement in sound quality?

Any and all feedback is most appreciated, especially if you noted improvements in your streaming audio SQ with a High-End Ethernet cable.

Thanks!
grm
Homes don’t have these challenges. So the shielded designs don’t help, but they could hurt if the shield ties endpoints to chassis and creates a ground loop. A floated shield would be fine, however, and the cost is minimal if it makes the audiophile feel better.
I agree that shielding is not needed in a home environment to assure that communications on the ethernet link are robust and reliable. However, I wouldn’t rule out the possibility that it could make a difference with respect to RF noise that may be coupled **from** the cable **or** from the input circuit of the receiving device to circuit points that are downstream of the ethernet interface in the receiving device. Such as to D/A converter circuits, where timing jitter amounting to far less than one nanosecond is recognized as being audibly significant. (See the section entitled "Jitter Correlation to Audibility" near the end of this paper.)
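To put a rough number on the jitter claim, here is a back-of-envelope sketch (my own illustration, not taken from the linked paper) of the worst-case amplitude error that sampling-clock jitter produces on a full-scale sine wave:

```python
import math

def jitter_error_db(freq_hz, jitter_s):
    """Worst-case error of a full-scale sine sampled with clock jitter,
    relative to full scale. The signal's maximum normalized slew rate is
    2*pi*f, so a timing error dt displaces a sample by up to 2*pi*f*dt."""
    err = 2 * math.pi * freq_hz * jitter_s
    return 20 * math.log10(err)

# 1 ns of clock jitter on a 20 kHz full-scale tone:
print(round(jitter_error_db(20_000, 1e-9), 1))  # -78.0 (dB relative to full scale)
```

That −78 dB worst-case error sits well above the roughly −96 dB quantization floor of a 16-bit system, which is why sub-nanosecond jitter at the converter can, in principle, matter.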

In addition to the effects of shielding on radiated emissions, shielding would presumably also affect the bandwidth, capacitance, and other characteristics of the cable, in turn affecting signal risetimes and falltimes (the amount of time it takes for the signals in the cable to transition between their two voltage states), in turn affecting the spectral composition of RF noise that may find its way past the ethernet interface in the receiving device. Also, small differences in waveform distortion that may occur on the rising and falling edges of the signals, as a result of less than perfect impedance matches, will affect the spectral composition of that noise while not affecting communication of the data.
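For a feel of the numbers involved, the standard first-order rule of thumb relates a 10–90% rise time to the bandwidth it implies (my illustration; the 4 ns edge is an assumption, and actual Ethernet edge rates vary by PHY and cable):

```python
def knee_frequency_hz(rise_time_s):
    """Approximate bandwidth implied by a 10-90% rise time, using the
    common first-order rule of thumb f ~= 0.35 / t_rise."""
    return 0.35 / rise_time_s

# An assumed 4 ns edge puts significant signal energy out to roughly:
print(knee_frequency_hz(4e-9) / 1e6)  # 87.5 (MHz)
```

Slower edges pull that knee frequency down and faster edges push it up, which is one mechanism by which cable characteristics could shift the spectrum of whatever noise leaks past the interface, without affecting the data at all.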

When someone tees up a track in Tidal on their 100 Mb/s cable modem, and they pull the Ethernet cable and the song still plays, what is actually happening from a cable perspective at that point?

Obviously noise that may find its way to circuitry of the receiving device that is downstream of its ethernet interface, as a consequence of the signal it is receiving, will be eliminated. On the other hand, airborne RFI may increase, since the cable would no longer be connected to a termination that would absorb the signal energy. Which of those effects may have audible consequences, if in fact either of them does in some applications, figures to be highly component- and system-dependent, and to have little if any predictability, as I indicated in my previous posts.

I’ve personally had Ethernet cabling from $27 a foot to $233 a foot and compared directly to 315 feet of BerkTek CAT5e. No difference.

I don’t doubt your experience. However, I also don’t doubt experiences that have been reported by members such as DGarretson, Bryoncunningham, Grannyring, and others here who are similarly thorough when assessing a change.

Regards,
-- Al

jinjuku


I’ve personally had Ethernet cabling from $27 a foot to $233 a foot and compared directly to 315 feet of BerkTek CAT5e. No difference.
That’s interesting. Can you please tell us more about your comparison? Was it based solely on measurements? If you conducted listening tests, can you please tell us how they were conducted? Which specific cables did you evaluate?
Such as to D/A converter circuits, where timing jitter amounting to far less than one nanosecond is recognized as being audibly significant. (See the section entitled "Jitter Correlation to Audibility" near the end of this paper).

This is where we have a problem. Where in the playback system are you referring to this jitter?

The entire point, and I will keep with Tidal as the example, is that once the local buffer is filled up, and buffers are indeed static storage, any timing variance ceases to exist. It's why I can watch Netflix 4K streamed with no issues.
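The buffering argument can be sketched in a few lines (an illustration with made-up chunk numbers, not real player code): the network side fills the buffer in irregular bursts, while the playback side drains it at a steady rate governed by its own local clock.

```python
from collections import deque

playout_buffer = deque()

def network_burst(chunks):
    # Packets arrive in irregular bursts, far faster than real time.
    playout_buffer.extend(chunks)

def playback(n):
    # The output side drains at a steady rate set by the local clock;
    # arrival-time jitter on the network side never reaches this loop.
    return [playout_buffer.popleft() for _ in range(n)]

network_burst(range(100))                 # whole track cached in one burst
print(playback(100) == list(range(100)))  # True: bit-identical, in order
```

However bursty the arrivals, the bytes handed to the output stage are identical and in order; any timing question then concerns the clock on the output side, not the cable.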

The fact of the matter, and it is indeed FACT, is that if I pull the network cable for 1 second I've introduced 1,000,000,000 ns of jitter, but somehow the playback system has managed to deal with this, and deal with it to the point that if you are blinded you couldn't tell me if your life depended on it.

While this timing difference may be in the D/A converter circuits, that's not the same as Ethernet, which is bursty in nature and asynchronous.

In addition to the effects of shielding on radiated emissions, shielding would presumably also affect the bandwidth, capacitance, and other characteristics of the cable, in turn affecting signal risetimes and falltimes (the amount of time it takes for the signals in the cable to transition between their two voltage states), in turn affecting the spectral composition of RF noise that may find its way past the ethernet interface in the receiving device.

If your playback equipment is susceptible to standards-compliant Ethernet cabling affecting playback, I would say your equipment is defective.

Again using Tidal: when I was at WGUtz's place, the 100-foot, $13 cable allowed Tidal to cache the entire track just as quickly as the 15-foot boutique cable.

Modern PHYs put the interface into a low- or no-power state when not transmitting. An 11-minute song was cached, in its entirety, in about 15 seconds.
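The arithmetic behind that observation is straightforward (the bitrate here is my assumption for CD-quality FLAC; Tidal's actual rate varies with content):

```python
track_seconds = 11 * 60      # 11-minute track
bitrate_bps = 1_000_000      # ~1 Mb/s assumed for CD-quality FLAC
link_bps = 100_000_000       # 100 Mb/s link

# Time to transfer the whole track at full line rate:
download_seconds = track_seconds * bitrate_bps / link_bps
print(download_seconds)  # 6.6
```

At full line rate the transfer finishes in under 7 seconds; the observed ~15 seconds presumably reflects server-side pacing, which only reinforces the point that the cable is nowhere near a bottleneck.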

Obviously noise that may find its way to circuitry of the receiving device that is downstream of its ethernet interface, as a consequence of the signal it is receiving, will be eliminated. On the other hand, airborne RFI may increase, since the cable would no longer be connected to a termination that would absorb the signal energy. Which of those effects may have audible consequences, if in fact either of them does in some applications, figures to be highly component- and system-dependent, and to have little if any predictability, as I indicated in my previous posts.
Then everyone is screwed. I don't believe that to be the case. I can build a world-class client/server setup for $700 to feed a DAC.

I don’t doubt your experience. However, I also don’t doubt experiences that have been reported by members such as DGarretson, Bryoncunningham, Grannyring, and others here who are similarly thorough when assessing a change.

You should doubt me and everyone else. The difference being I actually showed up at a members house and we went through this where the perceived changes disappeared once sighted bias was controlled for. 

I have no problem doing this elsewhere. 
That’s interesting. Can you please tell us more about your comparison? Was it based solely on measurements? If you conducted listening tests, can you please tell us how they were conducted? Which specific cables did you evaluate?

Nordost Heimdall 2, one meter, $699. BerkTek Hyper 5e, 98 meters, $90.

Cary Audio DMS-500

RME UFX A/D duties fed to my laptop

ARTA for measurements. Nothing in the spectral components, nothing in the 11.25 kHz jitter test, nothing in the linearity testing, nothing in the noise floor. I also did a cascade plot with my measurement mic. Nothing changed other than room ambient noise from one run to the next; that would be slightly different even using the same cable on multiple runs.

I also captured the tracks and posted them elsewhere for people to download and evaluate. During playback I swapped out the cabling and simply asked people to tell me when the cabling was swapped. Only a handful tried, and all failed.


Where in the playback system are you referring to this jitter?

The entire point, and I will keep with Tidal as the example, is that once the local buffer is filled up, and buffers are indeed static storage, any timing variance ceases to exist. It’s why I can watch Netflix 4K streamed with no issues.

The fact of the matter, and it is indeed FACT, is that if I pull the network cable for 1 second I’ve introduced 1,000,000,000 ns of jitter, but somehow the playback system has managed to deal with this, and deal with it to the point that if you are blinded you couldn’t tell me if your life depended on it.

While this timing difference may be in the D/A converter circuits, that’s not the same as Ethernet, which is bursty in nature and asynchronous.
I am referring to timing jitter at the point of D/A conversion. And I am referring to the possibility that cable differences may affect the characteristics of RF noise that may bypass (i.e., find its way **around**) the ethernet interface, buffers, etc. and **to** the circuitry that performs D/A conversion.

Regarding disconnection of the cable, and putting aside the possible significance of airborne RFI, doing so would of course work in the direction of reducing noise that may be coupled from the input circuit to the point of D/A conversion. The 1,000,000,000 ns of jitter you referred to has no relevance to that.

Putting it all very basically: responses by those claiming ethernet cables won’t make a difference nearly always focus just on the intended/nominal signal path. The basic point of my earlier posts is that in real-world circuitry, parasitic/unintended signal paths also exist (via grounds, power supplies, parasitic capacitances, the air, etc.), which may allow RF noise to bypass the intended signal path to some extent, and therefore may account for some or many of the reported differences.

Regards,
-- Al