I would refer you back to Hansen's white paper on data clocking and how SPDIF clocking compares with the asynchronous USB mode that Almarg mentioned earlier.
The only wrinkle with all of this is that some DAC chips are designed to be very tolerant of jitter because they already buffer the data on-chip. With the picoseconds of jitter typical of SPDIF, the data is effectively de-jittered by the chip itself. Some video chip designers do the same thing.
So the impact of jitter is not absolute; it depends on the chip and the implementation. Having said that, the best way around any jitter issue is to buffer the data and reclock it with a new master clock. As long as the device feeding the data (the computer in this instance) can keep up with the buffer requests, the DAC never has to deal with a loss-of-data condition.
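For anyone curious what that buffer-and-reclock scheme looks like in the abstract, here is a minimal C sketch. Everything in it (the FIFO size, the fifo_push/fifo_pop names, the burst sizes) is made up for illustration and not taken from any real DAC or driver: the source side delivers samples in jittery bursts whenever the FIFO asks for more, while the output side pulls exactly one sample per tick of its own master clock, so the ragged arrival timing never reaches the output.

```c
/* Illustrative sketch only: a FIFO fed with jittery delivery, drained at
 * a fixed "master clock" rate. Not code from any actual DAC. */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

#define FIFO_SIZE 1024

typedef struct {
    int16_t buf[FIFO_SIZE];
    size_t head, tail, count;
} fifo_t;

static int fifo_push(fifo_t *f, int16_t s) {
    if (f->count == FIFO_SIZE) return -1;   /* overrun: source too fast */
    f->buf[f->head] = s;
    f->head = (f->head + 1) % FIFO_SIZE;
    f->count++;
    return 0;
}

static int fifo_pop(fifo_t *f, int16_t *s) {
    if (f->count == 0) return -1;           /* underrun: source fell behind */
    *s = f->buf[f->tail];
    f->tail = (f->tail + 1) % FIFO_SIZE;
    f->count--;
    return 0;
}

int main(void) {
    fifo_t fifo = {0};
    int16_t sample = 0, out;
    int underruns = 0;

    /* Pre-fill half the FIFO so the output clock has headroom. */
    for (int i = 0; i < FIFO_SIZE / 2; i++) fifo_push(&fifo, sample++);

    /* Simulate 100000 ticks of the DAC's own master clock. */
    for (long tick = 0; tick < 100000; tick++) {
        /* Source side: when the FIFO drops below half full it "requests"
         * more data, and delivery arrives in a jittery burst of 0..4
         * samples, standing in for an irregular transport. */
        if (fifo.count < FIFO_SIZE / 2) {
            int burst = rand() % 5;
            for (int i = 0; i < burst; i++) fifo_push(&fifo, sample++);
        }

        /* Output side: exactly one sample per master-clock tick,
         * regardless of how raggedly the data arrived. */
        if (fifo_pop(&fifo, &out) != 0) underruns++;
    }

    printf("underruns: %d (0 means the reclocked output never starved)\n",
           underruns);
    return 0;
}
```

The point of the toy model is just that the output timing is set entirely by the consumer's own clock; the incoming jitter only shows up as fluctuation in the buffer fill level, which is harmless as long as the source keeps up with the requests.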