I'm a Design Fellow Emeritus at Analog Devices, and for a number of years I designed retiming PLL and DLL loops for various interfaces, including USB.

The 1s and 0s you refer to are generally produced by current changes into a fixed load at the receiver; that load matches the characteristic impedance of the "transmission line" that is the USB cable, in order to minimize reflections. The data is also sent in packets and retimed at the receiver. This retiming occurs "perfectly" as long as the transitions from 1 to 0 and back occur within a given period of time, a so-called retiming window, whose purpose is to eliminate the effect of the finite bandwidth and jitter of the interconnect and transmitting system. The receiver adds its own imperfections to the reconstructed data stream, but as long as the cable, transmitter, and receiver comply with the standards, the resulting data stream will have a one-to-one correspondence to the source.

In addition, the data is checked upon reception using a CRC; if a packet is corrupted it will be dropped, and the host can be asked to resend it. So, obviously, the system is not a dumb one, and in reality spec-compliant cables CANNOT matter in an analog sense.
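To make that CRC step concrete, here is a minimal software sketch of the check a USB 2.0 receiver applies to a data packet payload. It assumes the CRC-16 parameters as I recall them from the USB 2.0 spec (generator polynomial x^16 + x^15 + x^2 + 1, all-ones seed, bits processed LSB first, result inverted); in real hardware this is a small shift-register circuit, so treat this purely as an illustration of the principle, not as the actual implementation.

```python
def usb_crc16(data: bytes) -> int:
    """CRC-16 over a data packet payload.

    Sketch only: reflected form of the USB generator polynomial
    x^16 + x^15 + x^2 + 1 (0x8005 -> 0xA001), all-ones seed,
    bits processed LSB first, result inverted.
    """
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc ^ 0xFFFF


# The receiver recomputes the CRC over the received payload and compares
# it with the CRC field that arrived with the packet; on a mismatch the
# packet is discarded rather than passed along with corrupted bits.
payload = bytes([0x00, 0x01, 0x02, 0x03])  # arbitrary example payload
print(hex(usb_crc16(payload)))
```

The point is that the check is a pure yes/no decision on the recovered bits: either the packet arrives intact or it is thrown away, with nothing in between.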
USB 2.0 is also able to transmit at 480 Mb/s, so audio bit streams are well below the maximum.
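As a rough sanity check on that margin (my own back-of-the-envelope numbers, not anything from a spec), even a heavy 24-bit/192 kHz stereo PCM stream uses only a couple of percent of the 480 Mb/s signaling rate:

```python
# Rough margin check: a "heavy" stereo PCM stream vs the USB 2.0 line rate.
# Illustrative only; protocol overhead reduces usable throughput, but the
# margin remains enormous either way.
channels, bits_per_sample, sample_rate = 2, 24, 192_000
audio_mbps = channels * bits_per_sample * sample_rate / 1e6   # ~9.2 Mb/s
usb2_line_rate_mbps = 480
print(f"{audio_mbps:.1f} Mb/s, about "
      f"{100 * audio_mbps / usb2_line_rate_mbps:.0f}% of the line rate")
```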
I've heard audio streams where the USB data was actually corrupted and packets were being lost.
The result is not subtle...