@steakster - I think what @rixthetrick was getting at is that any signal on a cable is inherently analog. While it represents digital data, the signal does not instantaneously transition from one voltage to another to represent 0s and 1s. The interface and cable must maintain good signal integrity to properly convey the digital data.
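To put a rough number on that "not instantaneous" point: a simple first-order RC model (the values here are illustrative, not from any real USB transceiver) shows the voltage ramping toward the new logic level rather than jumping:

```python
import math

# First-order (RC) model of a "digital" edge: the line voltage can't jump,
# it charges exponentially toward the new level. Values are illustrative.
def edge_voltage(t_ns, rc_ns=1.0):
    """Normalized voltage (0..1) t_ns after a 0->1 transition through an RC of rc_ns."""
    return 1.0 - math.exp(-t_ns / rc_ns)

for t in (0.5, 1.0, 2.0, 5.0):
    print(f"{t} ns after the edge: {edge_voltage(t):.2f}")
```

The receiver has to decide 0-or-1 while the edge is still mid-flight, which is exactly why cable quality and signal integrity matter at all.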
However, USB is a fairly robust interface. It's designed to transfer digital data reliably in very low-cost implementations. It can start to have problems with long cable lengths, but within reasonable limits, it does a great job of reliably transferring data and can easily handle the requirements of high-resolution audio.
As has been pointed out, it is not optimized for minimum noise transfer between devices. It’s designed to be a reliable, inexpensive interface between digital devices. So care must be taken in the design and implementation of the server/streamer, interface cable, and DAC to minimize the effects of any electrical noise generated by the source or picked up along the way.
That doesn't make it a bad interface. In most regards, all other digital audio interfaces have the exact same issues. It's true that USB carries a power connection, but this can easily be ignored/dealt with by the DAC. The big advantage of USB (and Ethernet) over older digital audio interfaces (S/PDIF, optical, AES3) is that they are asynchronous and therefore won't introduce audio sample jitter into the mix.
Sure, you can put a lot of engineering effort and cost into the digital source to reduce jitter on these older interfaces (S/PDIF, etc.), but the interfaces themselves make it impossible to achieve results as good as the same effort/cost applied in the DAC itself, where clocks and transmission-line impedances are much easier to control.
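For a sense of why sample-clock jitter matters, the worst-case amplitude error from a timing error is roughly the signal's slew rate times the timing error. This little sketch (a back-of-the-envelope estimate, not a model of any particular DAC) computes that for a full-scale sine:

```python
import math

# Worst-case amplitude error from sample-clock jitter on a full-scale sine:
# error ~= slew rate * timing error = 2*pi*f * A * dt  (with A normalized to 1)
def jitter_error_db(freq_hz, jitter_s):
    """Approximate peak error, in dB relative to full scale, for a sine at
    freq_hz sampled with a timing error of jitter_s seconds."""
    err = 2 * math.pi * freq_hz * jitter_s
    return 20 * math.log10(err)

# e.g. a 20 kHz tone with 1 ns of timing error
print(round(jitter_error_db(20_000, 1e-9), 1))  # about -78 dB
```

So nanosecond-scale jitter on the sample clock is already audible-range territory for high-resolution audio, which is why it's so much easier to keep the critical clock inside the DAC (as asynchronous USB does) than to recover it from an S/PDIF stream.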
There is certainly value in reducing the noise that is conveyed on the USB interface, since this just makes the DAC's job easier. Using a USB source device that has a good low-noise power supply and using a good cable can often help. This doesn't (or certainly shouldn't) have any effect on the actual data that arrives at the DAC, but it can reduce the amount of electrical noise that the DAC has to deal with.