Such as in D/A converter circuits, where timing jitter of far less than one nanosecond is recognized as audibly significant. (See the section entitled "Jitter Correlation to Audibility" near the end of this paper.)
This is where we have a problem. Where in the playback system are you saying this jitter occurs?
The entire point, and I will stick with Tidal as the example, is that once the local buffer is filled (and buffers are indeed static storage), any network timing variance ceases to exist. It's why I can stream Netflix 4K with no issues.
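To illustrate what I mean, here is a toy sketch (not any particular player's code, and the buffer sizes and timings are made-up numbers) of how a buffered player decouples bursty network arrival from the steady, locally clocked read that feeds the DAC:

```python
# Minimal sketch (hypothetical numbers): bursty network delivery on one side,
# a steady, locally clocked drain on the other. As long as the buffer never
# runs dry, the drain side never sees the arrival-time variance.
import queue
import threading
import time

buf = queue.Queue()            # the local playback buffer
CHUNKS = 200                   # total audio chunks in the "track"
DRAIN_PERIOD = 0.01            # steady consumer clock: one chunk every 10 ms

def network_producer():
    """Deliver chunks in irregular bursts, like a real TCP stream."""
    sent = 0
    while sent < CHUNKS:
        burst = min(50, CHUNKS - sent)      # a big burst of data...
        for _ in range(burst):
            buf.put(b"\x00" * 4096)         # dummy audio data
            sent += 1
        time.sleep(0.3)                     # ...then a long idle gap

def playback_consumer():
    """Pull one chunk per tick of the local clock; this cadence is what feeds the DAC."""
    for _ in range(CHUNKS):
        buf.get()                           # blocks only if the buffer underruns
        time.sleep(DRAIN_PERIOD)            # steady output cadence, independent of arrival timing

t = threading.Thread(target=network_producer)
t.start()
playback_consumer()
t.join()
print("every chunk played at a steady cadence despite bursty delivery")
```

The point of the sketch is simply that the output side is timed by the local clock, not by when packets happen to arrive.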
The fact of the matter, and it is indeed FACT, is that if I pull the network cable for 1 second I've introduced 1,000,000,000 ns of "jitter" at the network layer, yet somehow the playback system deals with this, and deals with it so completely that under blind conditions you couldn't tell me it happened if your life depended on it.
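Just to put the two timescales side by side (taking the sub-nanosecond audibility figure quoted above at face value):

```python
# Hypothetical comparison of the two timescales being discussed.
cable_pull_ns = 1_000_000_000    # a 1-second interruption, in nanoseconds
audible_jitter_ns = 1            # "far less than one nanosecond", rounded up
print(cable_pull_ns / audible_jitter_ns)   # ~1e9: nine orders of magnitude apart
```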
While that timing jitter may matter in the D/A converter circuits, that is not the same thing as Ethernet, which is bursty in nature and asynchronous.
In addition to its effect on radiated emissions, shielding would presumably also affect the bandwidth, capacitance, and other characteristics of the cable. That in turn would affect signal risetimes and falltimes (the time it takes the signals in the cable to transition between their two voltage states), which in turn would affect the spectral composition of any RF noise that finds its way past the Ethernet interface in the receiving device.
If your playback equipment is susceptible to audible effects from standards-compliant Ethernet cabling, I would say your equipment is defective.
Again using Tidal: when I was at WGUtz's place, the 100-foot, $13 cable allowed Tidal to cache the entire track just as quickly as the 15-foot boutique cable.
Modern PHYs put the interface into a low-power or no-power state when not transmitting. An 11-minute song was cached, in its entirety, in about 15 seconds.
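Run the numbers on that (I'm assuming 16-bit/44.1 kHz stereo FLAC at roughly 60% of the raw PCM rate; Tidal's actual bitrate may differ):

```python
# Back-of-the-envelope throughput implied by "11 minutes cached in ~15 seconds".
# The stream format is an assumption: 16-bit / 44.1 kHz stereo, FLAC at ~60% of raw size.
track_seconds = 11 * 60
raw_bps       = 44_100 * 16 * 2                  # ~1.41 Mbit/s uncompressed PCM
flac_bps      = raw_bps * 0.6                    # assumed FLAC compression ratio
track_bits    = track_seconds * flac_bps
cache_seconds = 15
required_mbps = track_bits / cache_seconds / 1e6
print(f"~{required_mbps:.0f} Mbit/s sustained")  # roughly 37 Mbit/s
```

Under those assumptions the burst works out to only a few tens of Mbit/s, which any standards-compliant 100BASE-T or gigabit link handles trivially; after that the PHY can sit idle until the next track.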
Obviously, noise that finds its way into circuitry downstream of the receiving device's Ethernet interface as a consequence of the signal it is receiving will be eliminated. On the other hand, airborne RFI may increase, since the cable would no longer be connected to a termination that absorbs the signal energy. Which of those effects may have audible consequences, if in fact either of them does in some applications, figures (as I indicated in my previous posts) to be highly component- and system-dependent and to have little if any predictability.
Then everyone is screwed. I don't believe that to be the case. I can build a world-class client/server setup to feed a DAC for $700.
I don’t doubt your experience. However, I also don’t doubt experiences that have been reported by members such as DGarretson, Bryoncunningham, Grannyring, and others here who are similarly thorough when assessing a change.
You should doubt me and everyone else. The difference is that I actually showed up at a member's house and we went through this, and the perceived changes disappeared once sighted bias was controlled for.
I have no problem doing this elsewhere.