2. If the DAC is receiving a degraded waveform and the data recovery circuitry is working hard to sort through it, does this circuitry not increase the electrical noise in the DAC? If so, is it negligible or does that just depend on the DAC design?
It's a bit complicated because it's not like the USB data is hooked up directly to the DAC. The USB data has to be converted to a different hardware protocol before it gets to the actual DAC chip, and on top of that you have software controlling everything.
This could be a long explanation, especially for someone without a background in electrical engineering.
Now, there are two different USB DAC architectures - synchronous and asynchronous. Synchronous is an older architecture that was susceptible to jitter and clock problems. Most DACs nowadays probably use asynchronous. This architecture essentially, at least in theory, eliminates the effects of jitter, which means even if the input USB data has a lot of jitter, the DAC will not be affected. The details of why are a bit complicated, but the short version is that the DAC buffers the incoming data and clocks the samples out using its own local clock, so the timing of the output no longer depends on the timing of the input.
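To make that concrete, here is a toy simulation (not real USB audio code - all the names and numbers are made up for illustration) of the asynchronous idea: packets arrive with heavily jittered timing, land in a FIFO buffer, and the DAC drains samples at intervals set purely by its own local clock. The output sample spacing comes out perfectly uniform no matter how jittery the arrivals are.

```python
import random

SAMPLES_PER_PACKET = 48            # e.g. 48 samples per ~1 ms USB frame
LOCAL_CLOCK_PERIOD_US = 1000 / 48  # DAC's own clock sets the output spacing

def simulate(jitter_us, n_packets=100):
    """Packets arrive at ~1000 us intervals with +/- jitter_us of jitter;
    the DAC buffers them in a FIFO and clocks samples out on its LOCAL clock."""
    fifo = []
    arrival = 0.0
    output_times = []
    t_out = 2000.0  # start draining only after a small buffer has built up
    for _ in range(n_packets):
        # Jittery arrival time of the next USB packet
        arrival += 1000.0 + random.uniform(-jitter_us, jitter_us)
        fifo.extend([0] * SAMPLES_PER_PACKET)  # sample values don't matter here
        # Drain every sample whose local-clock tick has already passed
        while fifo and t_out <= arrival:
            fifo.pop(0)
            output_times.append(t_out)
            t_out += LOCAL_CLOCK_PERIOD_US  # spacing set by the DAC clock only
    return output_times

# Even with heavy input jitter, the output spacing is one single value:
times = simulate(jitter_us=300.0)
gaps = {round(b - a, 6) for a, b in zip(times, times[1:])}
print(gaps)  # every gap equals the local clock period
```

The whole argument for asynchronous mode is in the last line of the loop: `t_out` advances by the local clock period and nothing else, so input jitter only moves data in and out of the buffer, it never reaches the analog output timing. (In a real asynchronous DAC there is also a feedback endpoint telling the host to send data slightly faster or slower so the buffer never over- or underruns; that part is omitted here.)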
Now my turn to pose the question (especially to wynpalmer4, since he claims to have a design background): if the asynchronous architecture essentially eliminates the effects of jitter, why then would the USB cable make any difference? (Assuming you still get bit-perfect data, which is not an unreasonable assumption.)