Maybe the paper referenced below will shed some light on the complexity of jitter for those who believe that a simple value in some clock or DAC datasheet can trump this effect for good.
The audibility of some forms of jitter in DACs and ADCs has been investigated, but I believe the improvements most of us hear when jitter is reduced indicate that audibility thresholds are not so easy to define. They depend greatly not only on the technology used inside the chips, but also on the circuitry implemented around those chips (e.g. power supply management).
My 2 cts worth, /patrick
https://statics.cirrus.com/pubs/whitePaper/WP_Specifying_Jitter_Performance.pdf
A good start. I agree with:
"It follows that specs such as "Jitter 200 ps RMS" are
practically meaningless. Jitter specs should always
identify what measure of jitter they are referring to,
as in "Period jitter 200 ps RMS" for example."
All of my jitter measurements are direct and of the period.
"Period jitter was introduced in section 3.1.2. Unlike
wideband jitter and baseband jitter, it can be measured
directly in the time domain, i.e. without filter hardware.
You simply use a scope, and examine the waveform
one period after the trigger point. Many scopes can plot
period jitter histograms and extract RMS values."
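For anyone without a scope that plots these histograms, the same period-jitter RMS figure can be computed from raw edge timestamps. A minimal sketch (the function name and the timestamp-list input are my own illustration, not from the paper):

```python
import math

def period_jitter_rms(edge_times):
    """RMS period jitter from a list of rising-edge timestamps (seconds).

    Each period is the time between consecutive edges; period jitter is
    the deviation of each period from the mean period.
    """
    periods = [t1 - t0 for t0, t1 in zip(edge_times, edge_times[1:])]
    mean = sum(periods) / len(periods)
    return math.sqrt(sum((p - mean) ** 2 for p in periods) / len(periods))
```

A perfectly regular clock gives zero; alternating short/long periods give the RMS of the deviations, which is what the scope histogram's spread represents.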
I do not agree with this however:
"We saw in section 3.1.2 that period jitter is entirely
appropriate for some purposes. We see here that it is
entirely inappropriate as a general measure [14]. This
is because it is basically blind to low-frequency jitter."
This depends on the measurement system and how it measures the jitter. Mine measures the jitter of the data, not the clock, so it accounts for the fact that the period changes: it selects one period and locks onto it.
"it can be useful to make N-period jitter
measurements with very large N. Modern digital scopes
are excellent for such measurements."
I do not believe my measurement system can do this easily, but it is important.
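The point of large-N measurements is that low-frequency wander accumulates over many periods even when each individual period looks clean. A small sketch of the idea (synthetic timestamps with slow sinusoidal wander; the function and parameters are my own illustration):

```python
import math

def n_period_jitter_rms(edge_times, n):
    """RMS deviation of spans covering n consecutive clock periods."""
    spans = [edge_times[i + n] - edge_times[i]
             for i in range(len(edge_times) - n)]
    mean = sum(spans) / len(spans)
    return math.sqrt(sum((s - mean) ** 2 for s in spans) / len(spans))

# A clock with slow wander (period ~1000 cycles): single-period jitter
# barely sees it, but a span of 500 periods catches the full excursion.
t = [i + 0.01 * math.sin(2 * math.pi * i / 1000) for i in range(2000)]
```

With these numbers, the N=500 measurement comes out orders of magnitude larger than the N=1 measurement, which is exactly the "blind to low-frequency jitter" problem the paper is describing.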
"A key point is that it is not just the basic audio signal
that gets modulated. It is everything that crosses the
boundary between the continuous-time domain and
sampled-signal domain. This can include out-of-band
interference (in ADCs), incompletely attenuated images
(in DACs), and "zero-input" internal signals such as
shaped quantization noise and class-D carriers."
"Even low-level components can
cause problems if they are up at high frequencies."
"Jitter bites equipment designers most deeply when it
causes a converter that should have more than 100 dB of
dynamic range to deliver e.g. only 80 dB. In such cases
the jitter is interacting not with the audio signal but with
an internal signal such as shaped quantization noise.
Early one-bit DACs were particularly sensitive to this.
More-recently the inclusion of switched-capacitor filters
and the move to multi-bit designs has eased things.
Above ~200 kHz, the quantization noise is largely white
at its point of injection. When you factor in the DAC's
sin(x)/x frequency response and the effect of the internal
switched-capacitor filter stage, its spectrum becomes
more like the upper trace in figure 10 (taken from [17]).
By applying the already-mentioned 6dB/octave tilt,
one can estimate the region of greatest jitter sensitivity.
It is typically somewhere around ~0.5 or ~1 MHz for
DACs that use high-order noise shaping."
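As a rough numeric illustration of that tilt argument (purely illustrative: the second-order filter shape and 1 MHz corner are my own assumptions, not figures from the paper), multiplying a filtered noise spectrum by the 6 dB/octave tilt does yield a peak near the filter corner:

```python
import math

def sensitivity(f, corner=1e6):
    """Relative jitter sensitivity at frequency f (Hz), illustrative only.

    Assumes white noise shaped by a second-order low-pass (stand-in for
    the sinc response plus switched-capacitor filter), then tilted
    +6 dB/octave because jitter sensitivity scales with frequency.
    """
    lowpass = 1.0 / (1.0 + (f / corner) ** 2)  # assumed filter shape
    tilt = f / corner                           # 6 dB/octave tilt
    return lowpass * tilt

# Scan 200 kHz .. 5 MHz and find the most jitter-sensitive region.
freqs = [f * 1e3 for f in range(200, 5001, 10)]
peak = max(freqs, key=sensitivity)
```

With the assumed 1 MHz corner the peak lands at the corner frequency, consistent with the paper's "somewhere around ~0.5 or ~1 MHz" estimate.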
"The jitter performance differences that we have seen
relate entirely to signal components that are above the
audio band."
So, as you can see, the DAC itself is sensitive to jitter components far above the audio band.
Steve N.
Empirical Audio