> Where in the playback system are you referring to this jitter?

I am referring to timing jitter at the point of D/A conversion. And I am referring to the possibility that cable differences may affect the characteristics of RF noise that may bypass (i.e., find its way **around**) the ethernet interface, buffers, etc., and **to** the circuitry that performs D/A conversion.
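To put a rough number on why jitter at that point matters, here is a minimal simulation sketch. The sample rate, test tone, and 100 ps jitter figure are assumptions chosen for illustration, not measurements of any actual DAC:

```python
import numpy as np

# Hypothetical illustration: the effect of sample-clock jitter at the
# point of D/A conversion. All figures below are assumptions.
fs = 192_000          # sample rate, Hz
f0 = 10_000           # test tone, Hz
sigma_t = 100e-12     # assumed RMS clock jitter: 100 ps
n = 1 << 16

t_ideal = np.arange(n) / fs
t_jittered = t_ideal + np.random.normal(0.0, sigma_t, n)

# A DAC clocked with jitter effectively outputs the signal sampled at
# slightly wrong instants; the difference from the ideal output is noise.
ideal = np.sin(2 * np.pi * f0 * t_ideal)
jittered = np.sin(2 * np.pi * f0 * t_jittered)
err = jittered - ideal

snr_measured = 10 * np.log10(np.mean(ideal**2) / np.mean(err**2))
# Standard approximation for jitter-limited SNR: -20*log10(2*pi*f0*sigma_t)
snr_theory = -20 * np.log10(2 * np.pi * f0 * sigma_t)
print(f"simulated SNR: {snr_measured:.1f} dB, theory: {snr_theory:.1f} dB")
```

With these assumed numbers the jitter alone limits the SNR to roughly 104 dB; the approximation SNR ≈ -20·log10(2π·f0·σt) is the standard figure of merit for clock jitter.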
> The entire point, and I will stick with Tidal as the example, is that once the local buffer is filled up (and buffers are indeed static storage), any timing variance ceases to exist. It's why I can watch Netflix 4K streamed with no issues.
>
> The fact of the matter, and it is indeed FACT, is that if I pull the network cable for 1 second I've introduced 1,000,000,000 ns of jitter, but somehow the playback system has managed to deal with this, to the point that in a blind test you couldn't tell if your life depended on it.
>
> While this timing difference may be in the D/A converter circuits, that's not the same as Ethernet, which is bursty in nature and asynchronous.
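For what it's worth, the buffering argument above is easy to sketch in code, and I don't dispute it as far as it goes. The following is a toy model, not any actual streamer's implementation; the packet sizes, rates, and prefill depth are invented for the example. It shows that once the FIFO is pre-filled, output timing is set entirely by the local clock:

```python
import random
from collections import deque

# Toy model (not any real streamer): network packets arrive at bursty,
# irregular times; playback drains one sample per tick of the local clock.
random.seed(0)
buffer = deque()
PREFILL = 2000                       # samples buffered before playback starts

arrivals = []                        # (arrival_time, burst of samples)
t = 0.0
for _ in range(100):
    t += random.uniform(0.0, 2.0)    # bursty, asynchronous arrival times
    arrivals.append((t, [0] * 100))  # each packet carries 100 samples

clock_period = 0.01                  # local DAC clock: 1 sample per 0.01 units
play_t = PREFILL * clock_period
played, underruns, ai = 0, 0, 0
while played < 9000:
    while ai < len(arrivals) and arrivals[ai][0] <= play_t:
        buffer.extend(arrivals[ai][1])  # buffer absorbs arrival-time variance
        ai += 1
    if buffer:
        buffer.popleft()             # output instant = played * clock_period,
        played += 1                  # fixed by the local clock alone
    else:
        underruns += 1               # empty buffer = dropout, not jitter
    play_t += clock_period

print(f"played {played} samples, underruns: {underruns}")
```

Note the failure mode of the model: an underrun (an audible dropout), not a timing error at the conversion clock.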
Regarding disconnection of the cable: putting aside the possible significance of airborne RFI, doing so would of course work in the direction of reducing noise that may be coupled from the input circuit to the point of D/A conversion. The 1,000,000,000 ns of jitter you referred to has no relevance to that.
Putting it very basically: responses by those claiming ethernet cables won't make a difference nearly always focus only on the intended/nominal signal path. The basic point of my earlier posts is that in real-world circuitry parasitic/unintended signal paths also exist (via grounds, power supplies, parasitic capacitances, the air, etc.), which may allow RF noise to bypass the intended signal path to some extent, and which may therefore account for some or many of the reported differences.
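As one concrete, hypothetical example of such a mechanism (the slew rate and noise voltages below are assumptions for illustration only): noise that reaches a clock or logic line converts to timing error at each threshold crossing, to first order as Δt ≈ v_noise / slew rate:

```python
# Hypothetical back-of-the-envelope: RF noise reaching a DAC clock line
# converts to timing error at the threshold crossing of each edge,
# delta_t ~= v_noise / slew_rate. All figures below are assumptions.

slew_rate = 1.0e9     # assumed clock edge slew rate: 1 V/ns = 1e9 V/s
for v_noise in (0.001, 0.01, 0.05):  # assumed RMS noise on the clock line, V
    jitter_s = v_noise / slew_rate
    print(f"{v_noise*1000:5.1f} mV of noise -> ~{jitter_s*1e12:6.1f} ps of jitter")
```

Whether noise coupled via such paths is large enough to be audible in a given system is of course the contested question; the point is simply that "bits are bits" arguments about the nominal path do not address it.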
Regards,
-- Al