JH's point regarding frequency is that a crystal clock is subject to transient fluctuations in output frequency caused by power-supply variations & ripple, whereas the output frequency of a rubidium clock is inherently stable. He states that a transient variation in clock output frequency is "the very definition of jitter." Timing errors in the clock translate directly into a misshapen waveform and amplitude errors in the reconstructed analog signal, as well as the introduction of spurious sideband frequencies harmonically unrelated to the original signal.
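The mechanism behind that claim can be illustrated numerically: to first order, a timing error dt produces an amplitude error of roughly x'(t)·dt, so the error grows with the signal's slew rate. The sketch below (my own illustration, not from the letter; the sample rate, jitter figure, and function names are assumptions) samples a sine wave with a Gaussian-jittered clock and compares the result against ideal sampling.

```python
import math
import random

def jitter_amplitude_error(freq_hz, amp, rms_jitter_s, n=10000, seed=0):
    """Estimate the RMS amplitude error when sampling a sine wave with a
    clock whose timing error is Gaussian with the given RMS, versus an
    ideal jitter-free clock. Illustrative only."""
    rng = random.Random(seed)
    fs = 44_100.0  # assumed CD-rate sample clock
    err_sq = 0.0
    for k in range(n):
        t = k / fs
        dt = rng.gauss(0.0, rms_jitter_s)  # clock timing error for this sample
        ideal = amp * math.sin(2 * math.pi * freq_hz * t)
        jittered = amp * math.sin(2 * math.pi * freq_hz * (t + dt))
        err_sq += (jittered - ideal) ** 2
    return math.sqrt(err_sq / n)

# First-order theory: RMS error ~= amp * 2*pi*f * sigma / sqrt(2)
# (slew-rate of a sine, averaged over a cycle, times the RMS timing error).
e = jitter_amplitude_error(10_000.0, 1.0, 1e-9)  # 10 kHz tone, 1 ns RMS jitter
predicted = 1.0 * 2 * math.pi * 10_000.0 * 1e-9 / math.sqrt(2)
```

For a full-scale 10 kHz tone and 1 ns of RMS jitter, the error works out to a few parts in 10^5, and because the error is correlated with the signal's slew rate it appears as sidebands around the tone rather than as benign broadband noise, which is consistent with the sideband point above.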
While I accept the benefits of ultra-low-jitter clocks, I wonder whether some jitter is reintroduced by the clock-link cables that connect the G-ORb to the transport & DAC.