Why does my DAC sound so much better after upgrading my digital SPDIF cable?


I like my Playback Designs MPS-5 SACD/CD player, but I also use it as a DAC so that I can use my OPPO as a transport to play 24/96 and other high-res files I burn to DVD-Audio discs.

I was using a Nordost Silver Shadow digital SPDIF cable between the transport and my DAC, as I felt it was more transparent and had better treble than a higher-priced AudioQuest digital cable a dealer had me audition.

I recently received the new Synergistic Research Galileo SX UEF digital cable. Immediately I recognized that I was hearing far better bass, soundstage, and instrument separation than I had ever heard with high-res files (non-SACD).

While I am obviously impressed with this high-end digital cable and strongly encourage others to audition it, I am puzzled as to how the cable transporting digital information from my transport to my DAC makes such a big difference.

The DAC takes the digital information and shapes the sound, so why should the cable providing it the info be so important? I would think any competently built digital cable would be adequate. I get that the cables from the DAC to the preamp and from the preamp to the amp matter, but I would think the cable to the DAC would be much less important.

I will now experiment to see if using the external transport to send Red Book CD data to my Playback Designs MPS-5 sounds better than using the transport inside the MPS-5 itself.

The MPS-5 sounds pretty great for CD and awesome with SACD, so I doubt an external transport will be an improvement for Red Book CDs.


karmapolice
Also, here are some measurements of different Toslink cables: http://archimago.blogspot.com/2013/05/measurements-toslink-optical-audio.html?m=1

Including a freebie one, an intentionally poorly made one, an all-glass one, etc.

Guess what? They are 99.9% identical, with jitter components all below -100 dBFS. So yeah, I further stand by my claim that the poster simply fell prone to placebo.


Worthless measurements.  Why? Because they are woefully insufficient to characterize the system, much less the jitter difference.

Most classical analog measurements are insufficient to show small differences in dynamics or soundstage.  This is a fact supported by hundreds of reviews in Stereophile where the measurements were very poor and yet the review with music was stellar.

The ONLY accurate way to make jitter measurements on a digital source is to do it directly, not through a DAC or analog system. This requires a 5-10 GHz bandwidth measurement system, not an Audio Precision analyzer.
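A rough, back-of-the-envelope sketch of why the bandwidth matters, not a description of any particular measurement rig: the usual rule of thumb is that a front end can resolve edges no faster than about 0.35 divided by its bandwidth (the ~1 MHz analyzer figure below is only an assumption for illustration).

# Rule-of-thumb relation between measurement bandwidth and the fastest edge
# the front end can follow: t_rise ~ 0.35 / BW.  Bandwidth values are examples.
def resolvable_rise_time_s(bandwidth_hz: float) -> float:
    return 0.35 / bandwidth_hz

for label, bw in [("audio analyzer (~1 MHz)", 1e6),
                  ("5 GHz scope", 5e9),
                  ("10 GHz scope", 10e9)]:
    print(f"{label:24s}: ~{resolvable_rise_time_s(bw) * 1e12:,.0f} ps rise time")

With only ~1 MHz of bandwidth the resolvable edge is hundreds of nanoseconds, far too coarse to see picosecond-level timing directly.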

Steve N.

Empirical Audio


Am I to understand that the cable is causing jitter to a flashing pulse of light and that the digital info being received by the amp is not totally correct? Or does jitter have to do with extra devices, connections, and distance between the source and the amp?

The info is correct, but the timing of the info is not optimum. Everything adds a little to the jitter: the optical-to-electrical converters, the cable, and every active device inside the components. If the cable delivers a less-than-optimum signal, this will affect jitter because the receiver will have a slower risetime in response to the optical signal transitioning states.
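To get a rough feel for why picosecond-level timing errors matter at all, here is a minimal sketch using the standard jitter-limited SNR relation for a full-scale sine wave; the frequency and jitter values are purely illustrative.

import math

# Minimal sketch, assuming a full-scale sine wave:
# SNR ~ -20 * log10(2 * pi * f * sigma_t)
def jitter_limited_snr_db(signal_hz: float, rms_jitter_s: float) -> float:
    return -20 * math.log10(2 * math.pi * signal_hz * rms_jitter_s)

for jitter in (1e-9, 100e-12, 10e-12):   # 1 ns, 100 ps, 10 ps RMS
    print(f"{jitter * 1e12:6.0f} ps -> {jitter_limited_snr_db(20_000, jitter):5.1f} dB")

Roughly: 1 ns of RMS jitter caps a 20 kHz tone at about 78 dB, 100 ps at about 98 dB, and 10 ps at about 118 dB.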

Why don't you just try this excellent inexpensive cable and hear the difference:

https://btpa.com/TOSLINK-XXX.html

Steve N.

Empirical Audio

I’ve got a comparably modest digital source setup:

Allo DigiOne RPi
to Schiit Modi Uber 2
to NAD C316BEE
to Tekton Lore speakers

Did an A/B test with each cable upgrade:
no-name cheapo RCA analog interconnect to Zu Audio Mission from DAC to amp = huge improvements.

Then the same with digital RCA:
Monoprice 1.5 meter to Amplifier Surgery 1.5 meter from the DigiOne RPi to the DAC = huge improvements.


Here are jitter measurements of your DigiOne. I also have one, though not the latest premium version, BTW:

https://www.audiocircle.com/index.php?PHPSESSID=g252r6cln0acu9kqv4f29356n6&topic=154299.0

Steve N.

Empirical Audio

Maybe the paper referenced below will shed some light on the complexity of jitter for those who believe that a single value in some clock or DAC datasheet can trump this effect for good.
The audibility of some forms of jitter in DACs and ADCs has been investigated, but I believe the improvements that most of us hear when jitter is reduced indicate that audibility thresholds are not so easy to define: they depend greatly not only on the technology used inside the chips, but also on the circuitry implemented around those chips (e.g. power-supply management).
My 2 cts worth, /patrick

https://statics.cirrus.com/pubs/whitePaper/WP_Specifying_Jitter_Performance.pdf


A good start.  I agree with:

"It follows that specs such as "Jitter 200 ps RMS" are

practically meaningless. Jitter specs should always

identify what measure of jitter they are referring to,

as in "Period jitter 200 ps RMS" for example."

All of my jitter measurements are direct and of the period.

"Period jitter was introduced in section 3.1.2. Unlike

wideband jitter and baseband jitter, it can be measured

directly in the time domain, i.e. without filter hardware.

You simply use a scope, and examine the waveform

one period after the trigger point. Many scopes can plot

period jitter histograms and extract RMS values."

I do not agree with this, however:

"We saw in section 3.1.2 that period jitter is entirely

appropriate for some purposes. We see here that it is

entirely inappropriate as a general measure [14]. This

is because it is basically blind to low-frequency jitter."

This depends on the measurement system and how it measures the jitter.  Mine measures the jitter of the data, not the clock, so it factors in the fact that the period changes.  It selects one period and locks onto this.

"it can be useful to make N-period jitter

measurements with very large N. Modern digital scopes

are excellent for such measurements."

I do not believe my measurement system can do this easily, but it is important.
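For anyone curious, here is a generic sketch (not a model of any particular measurement system) of how period jitter and N-period jitter can be computed from a list of edge timestamps; the clock rate and jitter amounts are hypothetical.

import numpy as np

def period_jitter_rms(edge_times_s):
    # RMS deviation of each single period from the mean period.
    return np.diff(edge_times_s).std()

def n_period_jitter_rms(edge_times_s, n):
    # RMS deviation of the span of n consecutive periods; a large n exposes
    # slow (low-frequency) timing drift that single-period jitter misses.
    spans = edge_times_s[n:] - edge_times_s[:-n]
    return spans.std()

# Hypothetical clock: 6.144 MHz edges with 20 ps white jitter plus 100 Hz wander.
rng = np.random.default_rng(0)
ideal = np.arange(100_000) / 6.144e6
edges = (ideal
         + 20e-12 * rng.standard_normal(ideal.size)
         + 500e-12 * np.sin(2 * np.pi * 100 * ideal))
print(period_jitter_rms(edges))            # ~28 ps: only the fast, white part shows
print(n_period_jitter_rms(edges, 10_000))  # much larger: the slow wander now shows up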

"A key point is that it is not just the basic audio signal

that gets modulated. It is everything that crosses the

boundary between the continuous-time domain and

sampled-signal domain. This can include out-of-band

interference (in ADCs), incompletely attenuated images

(in DACs), and "zero-input" internal signals such as

shaped quantization noise and class-D carriers."

"Even low-level components can

cause problems if they are up at high frequencies."

"Jitter bites equipment designers most deeply when it

causes a converter that should have more than 100 dB of

dynamic range to deliver e.g. only 80 dB. In such cases

the jitter is interacting not with the audio signal but with

an internal signal such as shaped quantization noise.

Early one-bit DACs were particularly sensitive to this.

More-recently the inclusion of switched-capacitor filters

and the move to multi-bit designs has eased things.

Above ~200 kHz, the quantization noise is largely white

at its point of injection. When you factor in the DAC's

sin(x)/x frequency response and the effect of the internal

switched-capacitor filter stage, its spectrum becomes

more like the upper trace in figure 10 (taken from [17]).

By applying the already-mentioned 6dB/octave tilt,

one can estimate the region of greatest jitter sensitivity.

It is typically somewhere around ~0.5 or ~1 MHz for

DACs that use high-order noise shaping."

"The jitter performance differences that we have seen

relate entirely to signal components that are above the

audio band."

So as you can see, the DAC itself is sensitive to jitter that is way out-of-audio band.
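A hedged numeric illustration of that point, with made-up component levels: the error that a timing error can inject into a sine component scales with the component's slew rate (2*pi*f*A), which is where the 6 dB/octave tilt comes from, so a low-level but high-frequency internal signal can out-contribute a full-scale in-band tone.

import math

# Hedged sketch (component levels are invented): worst-case error injected by
# clock jitter into one sine component ~ slew rate * timing error = 2*pi*f*A*sigma_t.
def jitter_error_peak(freq_hz, amplitude_fs, rms_jitter_s):
    return 2 * math.pi * freq_hz * amplitude_fs * rms_jitter_s

jitter = 50e-12                                          # 50 ps RMS, illustrative
inband = jitter_error_peak(1_000, 1.0, jitter)           # full-scale 1 kHz tone
outband = jitter_error_peak(1_000_000, 0.01, jitter)     # -40 dBFS component at 1 MHz
print(f"in-band error:     {20 * math.log10(inband):6.1f} dBFS")
print(f"out-of-band error: {20 * math.log10(outband):6.1f} dBFS")
# The 1 MHz component sits 40 dB lower, yet its jitter-induced error comes out
# 20 dB larger, because the sensitivity scales with frequency.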

Steve N.

Empirical Audio

Am I right to think that gold connectors for digital cables are pointless?

The contacts are the important part in a BNC or RCA. If both the shield and center-conductor contacts are gold-plated, this is good enough. Any non-oxidizing conductor material will do. The shield is usually not gold-plated, nor does it make 360-degree contact, unless you get a high-end connector like the Neutrik BNC.

Steve N.

Empirical Audio