Ethernet cables carry only 1's and 0's (high or low voltage), and they do so at roughly 0.8 times the speed of light. If your digital device sends the message 0011010110 and the digital receiver reads those bits intact, it makes no difference whatsoever which cable carried them.
The ethernet standard allows no more than 1 faulty bit in 10,000,000,000 (1 part in 10^10). If the fidelity were not this high, computer networks would be basically useless; computers do not tolerate data errors. The ethernet protocol essentially guarantees complete data fidelity: if a data packet is misread, it is re-sent until it arrives correct. At the error rate spec'd above, that works out to roughly one re-sent packet per hour, assuming 96-bit data depth and 40 kHz sampling (which is well beyond anything in the audio industry). The digital error rates in recording are many times higher, by the way.
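If you want to sanity-check that "one per hour" figure yourself, here is a minimal back-of-envelope sketch in Python using the same assumptions as above (the 10^-10 bit error rate and the deliberately exaggerated single-channel 96-bit / 40 kHz stream); the numbers are the ones quoted in this post, not measured values.

```python
# Back-of-envelope check of the "about one re-sent packet an hour" figure.
# Assumptions taken from the post above, not measurements.
bit_error_rate = 1e-10          # ethernet spec: at most 1 faulty bit per 10^10 bits
bits_per_sample = 96            # far deeper than any real audio format
samples_per_second = 40_000     # 40 kHz sampling, single channel

bits_per_second = bits_per_sample * samples_per_second   # 3,840,000 bit/s
errors_per_second = bits_per_second * bit_error_rate     # ~3.8e-4 errors/s
seconds_per_error = 1 / errors_per_second

print(f"Expected time between bit errors: {seconds_per_error / 60:.0f} minutes")
# Roughly 43 minutes, i.e. on the order of one errored (and automatically
# re-sent) packet per hour at this worst-case error rate.
```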
In short, it is simply not physically possible for your ethernet cables to make a difference in sound quality.