I am referring to timing jitter at the point of D/A conversion. And I am referring to the possibility that cable differences may affect the characteristics of RF noise that bypasses the Ethernet interface, buffers, etc., i.e., that finds its way **around** them and **to** the circuitry that performs D/A conversion.
At best that’s a design issue of the connected hardware, not a cabling issue, IMO, where the cable meets or exceeds spec. The reason behind my thinking is that RF noise generated by, say, impedance mismatch in a cable is due more to length and twisted pairs not staying in mechanical balance than to differences between typical 12-15 foot patch cables. Not to mention the horizontal run is most likely some junk CCA (copper-clad aluminum).
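To put rough numbers on "impedance mismatch," here's a back-of-the-envelope sketch of how much of an RF signal actually reflects when a twisted pair drifts off its nominal 100 ohm differential impedance. The 105/110/115 ohm figures are made-up illustrations, not measurements of any particular cable:

```python
import math

# Back-of-the-envelope: how much signal reflects from a small impedance mismatch.
# Assumes the nominal 100-ohm differential impedance of Ethernet twisted pair;
# the deviations below are hypothetical illustrations, not measured values.

def return_loss_db(z_actual: float, z_nominal: float = 100.0) -> float:
    """Return loss in dB for a cable segment whose impedance deviates from nominal."""
    gamma = abs(z_actual - z_nominal) / (z_actual + z_nominal)  # reflection coefficient
    return -20 * math.log10(gamma)

for z in (105.0, 110.0, 115.0):
    print(f"{z:.0f} ohms -> return loss ~{return_loss_db(z):.1f} dB")
```

Even a 15 ohm deviation reflects only a small fraction of the signal (return loss in the low-20s of dB), which is why I'd look at length and mechanical balance before patch-cable brand.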
At worst, and if we take your interpretation, I would say from what I’ve seen that most of the incredibly expensive CATx cabling doesn’t pass IEEE/TIA spec, introduces noise, and audiophiles don’t understand that what they are enjoying is a degradation of their playback chain. That’s a stretch for me, though.
Bottom line, it would be measurable, since DACs are voltage-output devices.
Regarding disconnection of the cable: putting aside the possible significance of airborne RFI, doing so would of course work in the direction of reducing noise that may be coupled from the input circuit.
Again, this is measurable. It’s also why I like WiFi: it’s low latency, high throughput, and no measurement (either instrumented or human) shows harmonic components of RFI frequencies showing up.
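For what it’s worth, this is roughly the measurement I mean. A minimal sketch, assuming you’ve already captured the DAC’s analog output with an ADC into a file (the file name, normalization, and the frequencies checked are all placeholders):

```python
# Sketch: FFT the captured DAC output and compare spurs at suspect RFI-related
# frequencies against a crude noise-floor estimate.
import numpy as np
from scipy.io import wavfile

rate, data = wavfile.read("capture.wav")   # hypothetical capture of the DAC's analog output
x = data[:, 0].astype(np.float64) if data.ndim > 1 else data.astype(np.float64)
peak = np.max(np.abs(x))
if peak > 0:
    x = x / peak                           # normalize to full scale

window = np.hanning(len(x))
spectrum = np.abs(np.fft.rfft(x * window))
freqs = np.fft.rfftfreq(len(x), d=1.0 / rate)
db = 20 * np.log10(spectrum / spectrum.max() + 1e-20)   # dB relative to the largest bin

noise_floor = np.median(db)                # crude noise-floor estimate
for f in (50.0, 60.0, 1000.0, 8000.0):     # hypothetical frequencies of interest
    idx = np.argmin(np.abs(freqs - f))
    print(f"{f:8.1f} Hz: {db[idx]:7.1f} dB (floor ~{noise_floor:.1f} dB)")
```

If a cable swap were injecting RFI-related spurs, they should show up here as bins rising well above that floor; in my experience they don’t.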
I think you just said that removing the plug from the back of the client would work in the direction of reducing noise... So with that said, I would encourage a blinded evaluation session where the Ethernet cable is removed during playback of a track and the listener is able to reliably indicate that removal or insertion.
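If anyone actually runs such a session, scoring it is trivial. A minimal sketch, assuming a simple correct/incorrect tally per trial (the trial counts below are placeholders, not results from any real test):

```python
# Score a blinded cable-pull session: how likely is this hit count under pure guessing?
from scipy.stats import binomtest

n_trials = 20    # hypothetical number of blinded removal/insertion trials
n_correct = 14   # hypothetical number the listener identified correctly

result = binomtest(n_correct, n_trials, p=0.5, alternative="greater")
print(f"{n_correct}/{n_trials} correct, p = {result.pvalue:.3f} under chance guessing")
# A small p-value (e.g. < 0.05) would suggest the listener is doing better than coin-flipping.
```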
Let’s even paint a scenario where that’s actually the case. That noise component is most likely going to be buried in (or be a component of) the DAC’s noise floor, down in the -130 dB range on a competently designed piece of gear. You can’t hear anything that low even if it’s there. And if it’s not there, it’s not.
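To put -130 dB in absolute terms, here’s the quick arithmetic, assuming a 2 Vrms full-scale output (that level is an assumption, though it’s a common line-level figure):

```python
# Quick arithmetic: what -130 dB relative to a full-scale output is in volts.
# The 2 Vrms full-scale level is an assumed typical line-level figure.
full_scale_vrms = 2.0
level_db = -130.0
noise_vrms = full_scale_vrms * 10 ** (level_db / 20)
print(f"-130 dB below {full_scale_vrms} Vrms is about {noise_vrms * 1e6:.2f} microvolts rms")
# -> roughly 0.63 uV rms, far below anything audible at normal listening levels.
```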
Putting it all very basically, responses by those claiming Ethernet cables won’t make a difference nearly always focus only on the intended/nominal signal path. The basic point of my earlier posts is that in real-world circuitry parasitic signal paths also exist (via grounds, power supplies, parasitic capacitances, etc.), which may allow RF noise to bypass the intended signal path to some extent, and therefore may account for some or many of the reported differences.
Your response sounds like a guess. I have another theory for all the differences, and it’s sighted bias or really poorly designed, often expensive, equipment.
I’ve been recommending either WiFi (Ubiquiti) or wired (Intel server PCIe) NICs (left and right, they are available NIB or as new-but-pulled) for ~$25. They seem impervious to whatever cabling I’ve thrown at them.