Does the quality of a digital signal matter?


I recently heard a demonstration where a CD player was played with and without being supported with three Nordost Sort Kones. The difference was audible to me, but did not blow me away.

I was discussing the Sort Kones with a friend of mine who is an electrical engineer and also a musical audio guy. It was his opinion that these items could certainly make an improvement in an analogue signal, but shouldn't do anything for a digital signal. He said that as long as the component receiving the digital signal can recognize a 1 or a 0, the signal is successful. It's a pass/fail situation and doesn't rely on levels of quality.

An example he gave me was that we think nothing of using a cheap CD-RW drive to duplicate a CD, with no worry about the quality being reduced. If the data isn't read in full, an error is reported, so we know that the entire signal has been transferred.

I believe he said that it's possible to show that a more expensive digital cable is better than another, but the end product doesn't change.

There was a test done with HDMI cables that tested cables of different prices. The only difference in picture quality was noted when a cable was defective and there was an obvious problem on the display.

I realize that most people use analogue signals, but for those of us who use a receiver for our D/A, does the CD player's quality matter? Any thoughts?
mceljo
The points Kijanki made about timing, jitter, and reflections on impedance boundaries merit added emphasis and explanation, imo.

The S/PDIF and AES/EBU interfaces which are most commonly used to transmit data from transport to dac are inherently prone to jitter, meaning short-term random fluctuations in the amount of time between each of the 44,100 samples which are converted by the dac for each channel in each second (for redbook cd data).

As Kijanki stated, "Jitter creates sidebands at very low level (in order of <-60dB) but audible since not harmonically related to root frequency. With music (many frequencies) it means noise. This noise is difficult to detect because it is present only when signal is present thus manifest itself as a lack of clarity."

One major contributor to jitter is electrical noise riding on the digital signal. Another is what are called VSWR (voltage standing wave ratio) effects, which come into play at high frequencies (such as the frequency components of digital audio signals) and result in reflection of some of the signal energy back toward the source whenever an impedance match (between connectors, cables, output circuits, and input circuits) is less than perfect.

Some fraction of the signal energy that is reflected back from the dac input toward the transport output will be re-reflected from the transport output or other impedance discontinuity, and arrive at the dac input at a later time than the originally incident waveform, causing distortion of the waveform. Whether or not that distortion will result in audibly significant jitter, besides being dependent on the amplitude of the re-reflections, is very much dependent on what point on the original waveform their arrival coincides with.

Therefore the LENGTH of the connecting cable can assume major importance, conceivably much more so than the quality of the cable. And in this case, shorter is not necessarily better. See this paper, which as an EE strikes me as technically plausible, and which is also supported by experimental evidence from at least one member here whose opinions I respect:

http://www.positive-feedback.com/Issue14/spdif.htm
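The timing argument in that paper comes down to simple arithmetic, which can be sketched as follows. The velocity factor and the 10 ns risetime below are illustrative assumptions of mine, not numbers taken from the paper: the point is only that a reflection's round trip on a short cable can land while the edge is still transitioning, whereas on a longer cable it arrives after the edge is done.

```python
# Back-of-envelope check of the cable-length argument (assumed numbers).
C = 3.0e8                 # speed of light, m/s
VELOCITY_FACTOR = 0.66    # typical coax velocity factor (assumed)
RISE_TIME = 10e-9         # assumed transport output risetime, 10 ns

def round_trip_delay(length_m: float) -> float:
    """Time for a reflection to travel dac -> transport -> dac."""
    return 2.0 * length_m / (VELOCITY_FACTOR * C)

for length in (0.5, 1.5, 3.0):
    d = round_trip_delay(length)
    verdict = "lands ON the edge" if d < RISE_TIME else "clears the edge"
    print(f"{length:>4} m: {d * 1e9:5.1f} ns round trip -> {verdict}")
```

With these assumed numbers the half-meter cable's re-reflection arrives mid-transition, where it can shift the moment the receiver sees the logic threshold crossed (i.e., jitter), while the longer cables' reflections arrive after the edge has settled. Slower or faster risetimes move the crossover length, which is exactly why cable length interacts with the particular transport.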

Factors which determine the significance of these effects, besides cable length and quality, include the risetime and falltime of the output signal of the particular transport; the jitter rejection capabilities of the dac; the amount of electrical noise that may be generated by and picked up from other components in the system; ground offsets between the two components; the value of the logic threshold for the digital receiver chip at the input of the dac; the clock rate of the data (redbook or high rez); the degree of the impedance mismatches that are present; and many other factors.

Also, keep in mind that what we are dealing with is an audio SYSTEM, the implication being that components can interact in ways that are non-obvious and that do not directly relate to the signal path that is being considered.

For instance, physical placement of a digital component relative to analog components and cables, as well as the ac power distribution arrangement, can affect coupling of digital noise into analog circuit points, with unpredictable effects. Digital signals have substantial radio frequency content, which can couple to other parts of the system through cables, power wiring, and the air.

All of which adds up to the fact that differences can be expected, but does NOT necessarily mean that more expensive = better.

Regards,
-- Al

P.S: I am also an EE, in my case having considerable experience designing high speed a/d and d/a converter circuits for non-audio applications.

I may be oversimplifying this a bit, but it sounds like the proximity of the components that "read" the CD can have an effect on the analog signal created in the DAC. Would this be justification for a completely separate DAC?

How does this relate to a Toslink cable that is optical?
Jitter is not a problem with the "digital" part of digital (the robust part). Jitter is part of the analog problem with digital and can be regarded as a D-to-A problem (or, in the studio, an A-to-D problem). It is an analog timing problem whereby distortion can be introduced at the DAC/ADC stage because of drift in the clock. To accurately convert digital to analog, or analog to digital, requires an extremely accurate clock.
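"Extremely accurate" can be put in numbers with a quick back-of-envelope calculation (my own arithmetic, assuming a full-scale sine at the top of the audio band): the timing error that shifts a 16-bit sample by one LSB at the signal's steepest point is a fraction of a nanosecond.

```python
import math

BITS = 16
F_MAX = 20_000.0                      # top of the audio band, Hz

amplitude_lsb = 2 ** (BITS - 1)       # full-scale peak, measured in LSBs
max_slope = amplitude_lsb * 2 * math.pi * F_MAX   # steepest slope, LSB/s
dt = 1.0 / max_slope                  # timing error worth one LSB

print(f"{dt * 1e12:.0f} ps")          # roughly a quarter of a nanosecond
```

That picosecond-scale budget is why clock quality at the conversion stage matters even though the bits themselves are trivially easy to recover.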

I stand by my statement that you can copy a copy of a digital signal and repeat the copy of each subsequent copy 1000's of times with no degradation.

You cannot do this with any analog medium - within ten to twenty copies of a copy, the degradation becomes extremely audible (or visible, in the case of a VHS cassette).

The evidence is that digital signals are extremely robust compared to analog.
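That asymmetry is easy to demonstrate with a toy model (the noise level per analog generation below is an illustrative assumption, not a measurement): each analog copy adds a little noise that accumulates, while a digital copy of a copy is bit-for-bit identical no matter how many generations pass.

```python
import numpy as np

rng = np.random.default_rng(0)
master = np.sin(np.linspace(0, 20 * np.pi, 4096))

digital = np.round(master * 32767).astype(np.int16)   # 16-bit "master"
analog = master.copy()

for _ in range(1000):                 # a thousand digital generations
    digital = digital.copy()          # each copy is bit-for-bit identical

for _ in range(20):                   # twenty analog generations
    analog = analog + rng.normal(0.0, 0.01, analog.size)  # each adds noise

def snr_db(signal, copy):
    err = copy - signal
    return 10 * np.log10(np.mean(signal ** 2) / np.mean(err ** 2))

print("digital still identical:",
      np.array_equal(digital, np.round(master * 32767).astype(np.int16)))
print(f"analog SNR after 20 generations: {snr_db(master, analog):.1f} dB")
```

The digital chain survives a thousand generations untouched; the analog chain is down to roughly tape-hiss territory after twenty, because independent noise powers add with every pass.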
Almarg - Here is a response from my EE friend that I've been discussing this topic with at work.

"One of the most important factors discussed is "the value of the logic threshold for the digital receiver chip at the input of the dac" which, and this is important, supersedes ALL OTHERS in properly designed electronic equipment. If it didn't, the computer you are typing on would not work, the key-strokes would get lost, data you receive over the internet would be incomplete, pixels would be missing from the image in your video screen--ALL of which operate at WAY higher frequencies than any CD audio signal. Compared to modern computers, digital audio is simply rudimentary. If the audio equipment cannot transmit or identify logic signals that are above the background noise (all other elements discussed fall into this category) than the equipment in question is simply junk. I could, in the digital electronics lab at school, design and build a digital data transmission device and associated data receiver that would operate at 1MHz (far above any audio signal, but low frequency for digital electronics) and not lose a single bit of data.

Again, everything mentioned is real and true, but IS NOT A FACTOR in properly designed and built equipment. It is FAR more applicable to things like cell phone and computer design, and if the electronics industry were unable to overcome all the factors discussed in mere audio equipment, then a working cell phone and 3GHz processor would simply be pipe dreams.

As far as the SPDIF issue addressed in the linked article is concerned, it too is correct, but not a factor in your system. If you think it might be, switch to an optical cable or HDMI and see if you can hear a difference. I bet not. The information getting to the DAC in your amplifier will be bit for bit identical. If not, you have broken equipment."
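The noise-margin argument in that quote can be sketched as a toy simulation (the logic levels, threshold, and noise amplitude below are assumptions of mine, not his): as long as the noise stays inside the margin between the logic levels and the receiver's decision threshold, every bit is recovered exactly. Data recovery is pass/fail, not gradual.

```python
import numpy as np

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 10_000)    # the data we want to transmit

LOW, HIGH = 0.0, 1.0                 # assumed logic levels, volts
THRESHOLD = 0.5                      # receiver's decision point
noise = rng.uniform(-0.4, 0.4, bits.size)   # bounded, inside the 0.5 V margin

line_voltage = np.where(bits == 1, HIGH, LOW) + noise
received = (line_voltage > THRESHOLD).astype(bits.dtype)

print("bit errors:", int(np.sum(received != bits)))   # prints 0
```

Note what this model does and doesn't show: the bits come through perfectly, which supports his point about data integrity; but the *moment* each edge crosses the threshold still shifts with the noise, which is Almarg's jitter point. The two positions are about different failure modes, and both can be true at once.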