@svenjosh I missed the part where I said there is no noise in digital; of course there is. Crank up the ISO (ASA) speed on a digital camera and you'll see plenty of it. One of the biggest differences between your cellphone camera and a serious digital camera is the bit depth of the sensor, and thus its dynamic range. That makes cellphone cameras much more susceptible to noise, especially quantization noise: with fewer bits, the quantization steps are coarser, so more of the signal falls between representable levels. But none of this has anything to do with cables.
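To put a rough number on that, here's a minimal sketch (assuming Python with NumPy, nothing from any actual camera pipeline) of how quantization error shrinks as bit depth grows: the same ramp signal quantized at 8 bits versus 14 bits.

```python
import numpy as np

def quantize(signal, bits):
    """Quantize a signal in [0, 1) to 2**bits levels; return value and error."""
    levels = 2 ** bits
    q = np.round(signal * (levels - 1)) / (levels - 1)
    return q, signal - q

ramp = np.linspace(0, 1, 100_000, endpoint=False)
for bits in (8, 14):
    _, err = quantize(ramp, bits)
    # Each extra bit halves the step size, worth roughly 6.02 dB of dynamic range.
    rms = np.sqrt(np.mean(err ** 2))
    print(f"{bits}-bit: RMS quantization error = {rms:.2e} "
          f"(~{6.02 * bits + 1.76:.1f} dB ideal SNR for a full-scale sine)")
```

The 8-bit error comes out about 64x the 14-bit error, which is exactly the six missing bits at work.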
Within the digital domain, any change in the sound from a digital source must be accompanied by a corresponding change in the digital bitstream describing that sound. If a cable introduced such a change, it would alter the checksum and/or parity of the bitstream, be flagged as an error at the receiving end, and be corrected or retransmitted where the protocol allows. This is true for copper, fiber, or wireless Ethernet, for AES/EBU, and for USB.
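A toy demonstration of that detection, sketched in Python (binascii.crc32 here is just a stand-in; Ethernet, USB, and AES/EBU each define their own CRC or parity scheme):

```python
import binascii

frame = bytes(range(64))                 # pretend this is a frame of audio bitstream
sent_crc = binascii.crc32(frame)         # checksum computed by the sender

corrupted = bytearray(frame)
corrupted[10] ^= 0x04                    # a hypothetical cable-induced single-bit flip

received_crc = binascii.crc32(bytes(corrupted))
print("CRC match?", received_crc == sent_crc)   # False -> receiver flags the frame
```

Any change a cable makes to the payload shows up as exactly this kind of mismatch; it can't silently slide through as a "different sound."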
Changes in sound, and their corresponding changes in the bitstream, are prevalent before and after transport at Layer One: buffering, latency, jitter, the side effects of various digital filter algorithms, and noise introduced from all kinds of sources, including phase noise, shot and thermal noise from the electronics, noise carried in from imperfect grounding schemes, and RF noise; in fiber there are shot (quantum) noise, 1/f noise, and thermal noise. My personal favorite is dark current noise (eerie sounding, innit?), but I digress; that's a camera-sensor thing. None of these are caused by an in-spec interconnect.
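Of that list, jitter is the one that maps most directly onto numbers. For a full-scale sine at frequency f sampled with RMS clock jitter t_j, theory says SNR ≈ -20·log10(2π·f·t_j). A hedged little simulation (Python/NumPy again, with illustrative values of my own choosing) agrees:

```python
import numpy as np

fs, f, t_j = 96_000, 10_000, 100e-12     # sample rate, test tone, 100 ps RMS jitter
n = np.arange(1_000_000)
rng = np.random.default_rng(0)

ideal_t = n / fs                                      # perfect sample instants
jittered_t = ideal_t + rng.normal(0.0, t_j, n.size)   # clock wanders a little

clean = np.sin(2 * np.pi * f * ideal_t)
jittery = np.sin(2 * np.pi * f * jittered_t)
noise = jittery - clean                   # jitter shows up as amplitude error

snr_sim = 10 * np.log10(np.mean(clean ** 2) / np.mean(noise ** 2))
snr_theory = -20 * np.log10(2 * np.pi * f * t_j)
print(f"simulated SNR ≈ {snr_sim:.1f} dB, theory ≈ {snr_theory:.1f} dB")
```

Both land around 104 dB here, and crucially none of it depends on which in-spec cable carried the bits.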
The point is that attributing any such change to the cable runs contrary to the basic operation of frame- or packet-based digital signal transmission. Those audible changes may be occurring, just not during transmission of the in-band bitstream.