The Levinson DAC provides SPDIF, AES/EBU, and Toslink inputs (and AT&T ST optical, I believe). It does not support "custom" interfaces such as I2S, which attempt to deliver an accurate, jitter-free clock signal to the DAC. Problems with the transport, such as dropped data bits, are a separate issue from the effect of digital cables.

A back-of-the-envelope calculation shows that clock jitter needs to be less than about 100 picoseconds to keep all jitter artifacts at least 96 dB below a full-scale signal at 20 kHz (the most stressing case). Robert Harley notes in his book "The Complete Guide to High-End Audio" that several popular PLL-based clock-recovery chips produce about 3-4 nanoseconds of jitter, a factor of 30-40 above the 100 ps reference point, or roughly 30 dB worse, since jitter sidebands scale linearly with jitter amplitude. Delivering a 44.1 kHz clock to a DAC with jitter below 100 ps corresponds to a timing stability of about 4 parts per million (100 ps out of a 22.7 microsecond sample period), which is achievable with competent design. Devices such as the Digital Time Lens, which provide an SPDIF output, remain at the mercy of clock-recovery jitter at the DAC; the best they can hope to do is stabilize the transmit-end clocking as seen by the clock-recovery circuit.
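To make those numbers concrete, here is a small Python sketch of the back-of-the-envelope arithmetic (my own illustration of the calculation described above, not anything from the original post): the jitter-induced error floor for a full-scale sine is roughly 20*log10(2*pi*f*tj), and the same 100 ps is expressed as a fraction of the 44.1 kHz sample period.

import math

def jitter_error_floor_db(signal_freq_hz, jitter_s):
    # Worst-case error floor (dB relative to full scale) from sampling a
    # full-scale sine at signal_freq_hz with a timing error of jitter_s.
    # The error amplitude is about the signal slew rate times the timing
    # error, i.e. 2*pi*f*tj for a unit-amplitude sine.
    return 20 * math.log10(2 * math.pi * signal_freq_hz * jitter_s)

def jitter_as_ppm_of_sample_period(sample_rate_hz, jitter_s):
    # The same jitter expressed as parts per million of one sample period.
    return jitter_s * sample_rate_hz * 1e6

print(jitter_error_floor_db(20_000, 100e-12))           # ~ -98 dB: artifacts at least 96 dB down
print(jitter_as_ppm_of_sample_period(44_100, 100e-12))  # ~ 4.4 ppm of a 22.7 us sample period
print(jitter_error_floor_db(20_000, 3.5e-9))            # ~ -67 dB: roughly 30 dB worse than 100 ps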
Why do digital cables sound different?
I have been talking to a few e-mail buddies and have a question that isn't being satisfactorily answered thus far. So... I'm asking the experts on the forum to pitch in. This has probably been asked before, but I can't find any references for it. Can someone explain why one DIGITAL cable (coaxial, BNC, etc.) can sound different from another? There are also similar claims for Toslink. In my mind, we're just trying to move bits from one place to another. Doesn't the digital stream get reconstituted and re-clocked on the receiving end anyway? Please enlighten me and maybe send along some URLs for my edification. Thanks, Dan
- 291 posts total