I don’t have an ethernet connection in my audio system, so I can’t answer that from experience. However, assuming (as I do) that the several highly experienced and widely respected audiophiles who have reported significant sonic benefits from changing one inexpensive ethernet cable to another are correct, and if the explanation of those benefits that I hypothesized in my post in this thread dated 3-27-2017 is correct, then the sound may or may not improve depending on the specific system.
As you will realize in reading that post, and **if** my hypothesis is correct, whether or not the sound improves would depend on the path(s) by which, and the degree to which, the signals in the cable reach and affect downstream circuit points that are ostensibly unrelated to the ethernet interface. It would also depend on how the content of the signal sent into the cable by the source component changes when the cable is disconnected, as a result of that component having nothing to talk to at the other end.
So flipping this on its head: it's not the cable, it's the device.
I'm able to show via FFT that an $18 Intel NIC lets three separate DACs produce unaltered output, regardless of whether I'm maxing out the driving voltage with a 315-foot generic cable or a 12-foot boutique cable.
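To make the comparison concrete, here's roughly how that kind of difference check can be done (a sketch only: the file names, FFT length, and the assumption of mono WAV captures at matching levels are placeholders, not my exact setup):

```python
# Sketch: compare the spectra of two DAC-output captures, one per cable.
# Assumes two mono WAV files recorded at the same level and sample rate.
import numpy as np
from scipy.io import wavfile

def spectrum_db(path, n_fft=65536):
    rate, data = wavfile.read(path)
    x = data[:n_fft].astype(np.float64)
    x *= np.hanning(len(x))                  # window to reduce spectral leakage
    mag = np.abs(np.fft.rfft(x)) / len(x)
    return rate, 20 * np.log10(mag + 1e-12)  # magnitude in dB

rate, ref = spectrum_db("generic_315ft.wav")    # hypothetical capture names
_, test   = spectrum_db("boutique_12ft.wav")
freqs = np.fft.rfftfreq(65536, d=1.0 / rate)

# If the cable mattered, the difference trace would show it; otherwise it
# sits at the capture's noise floor across the audio band.
print("max spectral difference: %.2f dB" % np.max(np.abs(ref - test)))
```

If the two traces differ by nothing more than the noise floor of the capture chain, the cable swap isn't doing anything the DAC can hear.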
The Ethernet spec allows a 328 ft / 100 m single segment length. We're talking runs that, I would guess, are 10 feet or less in 95% of consumer installations.
So if really expensive equipment is actually susceptible to this, then the EEs designing it don't know what they're doing. They may want to re-read Ott.