OK, I get what you were after, @dougthebiker ... The "square waves" defining the individual bits...
In fact they are not, and do not need to be, square at all. They "just" need to stay within specified tolerance windows for the high and low levels inside a specified clock window.
Both wifi and ethernet signals have to meet this, and both wireless and wired routers and switches have to comply with these window specs. Good ones do; really bad ones may not.
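If it helps, here is a rough Python sketch of the idea (completely made-up levels and noise, not any real PHY spec): the receiver only asks, once per clock window, whether the level is above or below a decision threshold, so a rounded, noisy waveform still decodes to exactly the same bits.

```python
import random

BITS = [1, 0, 1, 1, 0, 0, 1, 0]          # the bits we intend to send
HIGH, LOW, THRESHOLD = 1.0, 0.0, 0.5     # illustrative levels only

def noisy_level(bit: int) -> float:
    # Far-from-square waveform: rounded edges plus noise,
    # but still inside the tolerance window around high/low.
    ideal = HIGH if bit else LOW
    return ideal + random.uniform(-0.3, 0.3)

# Sample once per clock window and compare against the threshold.
received = [1 if noisy_level(b) > THRESHOLD else 0 for b in BITS]
print(received == BITS)   # True: the "shape" of the wave never mattered
```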
What I was trying to explain is that an error in the level of one or more bits (a 1 read as a 0, or vice versa) does not create distortion. It creates a numerical mismatch against the check bits, which is either corrected or results in a dropout, buffering, or really obvious noise, not harmonic distortion or an amplitude or phase shift.
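Here is a minimal sketch of that point in Python, using CRC-32 as a stand-in for whatever integrity check the actual protocol layer uses (the payload is made up): flip one bit in transit and the receiver's recomputed check value simply no longer matches, so the error is detected rather than heard as "distortion".

```python
import zlib

payload = bytes(range(32))          # pretend this is one chunk of audio data
sent_crc = zlib.crc32(payload)      # check bits sent along with the data

# Simulate a single bit error in transit (flip the lowest bit of byte 10).
corrupted = bytearray(payload)
corrupted[10] ^= 0x01

received_crc = zlib.crc32(bytes(corrupted))
print(sent_crc == received_crc)     # False -> error detected, data is re-sent or dropped
```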
Of course wifi is susceptible to range, signal strength, and bandwidth/speed issues. The point is that once you have a properly set up network with decent equipment and a stable connection (sufficient speed and strength), the data you get via wifi will be bit-for-bit identical to what you get via ethernet.
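You can even check "bit-for-bit identical" yourself with something like the sketch below (the URL is a placeholder, not a real endpoint): fetch the same file once over wifi and once over ethernet and compare the hashes.

```python
import hashlib
import urllib.request

def sha256_of(url: str) -> str:
    # Download the file and return a fingerprint of its exact bytes.
    with urllib.request.urlopen(url) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

url = "https://example.com/test-track.flac"   # hypothetical test file
print(sha256_of(url))  # run once on wifi, once on ethernet: the hashes match
```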
Low-quality or overly long ethernet cables can also push your bit signal outside the tolerance windows and cause transmission errors, due to out-of-spec resistance, capacitance, or inductance.
Think of streaming HD TV on your wifi Amazon Fire Stick vs. via digital cable. If you have enough bandwidth and signal strength, the picture on the same TV will be the same. The colors will be the same, the resolution will be the same, etc. The one on the Fire Stick will not be "distorted".
Same thing with audio. I took care to specify and locate my existing router and streamer so that there are no drops or buffering. At a minimum, I now need to replicate that. To my knowledge there are indeed subtle, incremental sound quality improvement opportunities in better routers due to clocking accuracy and electrical noise, and that's the reason for my original post. But that applies to both ethernet and wifi.
I believe we beat this horse to death.
Cheers!