Two problems with that argument; bear in mind I design these things, both in T&M and audio. The first is that not all DACs do a perfect re-clocking: many, so as not to over- or under-run their buffers, start from the input timing and then reduce timing variations rather than replacing the clock outright. The second is that, like many things in audio, listening tests tell me some issues remain even after competent isolation and re-clocking.
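For what it's worth, that "track the input timing to avoid over/under-run" behavior can be sketched as a toy fill-level servo. This is a hypothetical simulation, not any particular DAC's loop; the gains and jitter figure are made up for illustration:

```python
import random

def simulate(jitter_amp=0.01, kp=0.05, ki=0.0001, steps=20000, seed=0):
    """Toy model of a DAC that servos its output clock to the FIFO
    fill level so the buffer never over/under-runs.  The loop acts as
    a low-pass filter on input timing variations: jitter is reduced,
    not eliminated, so some residual input timing leaks through."""
    rng = random.Random(seed)
    fill = 0.0       # FIFO fill deviation from midpoint, in samples
    out_rate = 1.0   # output consumption rate (samples per tick)
    rates = []
    for _ in range(steps):
        in_rate = 1.0 + jitter_amp * rng.uniform(-1.0, 1.0)  # jittered input
        fill += in_rate - out_rate        # buffer integrates the rate error
        out_rate += kp * (in_rate - out_rate) + ki * fill    # PI-style servo
        rates.append(out_rate)
    return rates, fill

rates, fill = simulate()
# After settling, the output-rate peak-to-peak deviation is well below
# the input's +/-0.01 jitter, but it is not zero.
ptp = max(rates[10000:]) - min(rates[10000:])
```

The point of the sketch: because the output clock must follow the long-term input rate to keep the buffer centered, low-frequency input timing variation passes through attenuated rather than removed, which is consistent with what I hear.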
Now, I have posted several times that when I have built my own USB interfaces (isolation, power, FIFO, etc.) and applied them to legacy DACs, the dependence on a good input signal is far smaller. But it's not zero. You can deduce what you wish; I don't have all the answers, but at least I listen, and then ask questions.
Do I think Ethernet switches make a difference? No, I don't. Ethernet is isolated anyway, and the packets are queued anyhow. (I suppose queue handling might matter; I assume that is handled by the router, but maybe not.) But clearly adding a bridge between server and endpoint helps, a great USB interface helps, and providing a good input signal helps.
I do agree that most of the benefit comes from competent design of the USB interface. I also know that most designs are "data-sheet engineered" rather than ideal.
But be careful with blanket statements. Not only can they mislead given real-world equipment, but they also turn off people who have heard "it's perfect" too many times before, only to find out otherwise, and then watch the industry fix the issues (you know, those silly guys at AD and Burr-Brown).
Thanks for clarifying. I assumed we were not 100% in agreement, though probably more yes than no.
G