Does the quality of a digital signal matter?


I recently heard a demonstration in which a CD player was played with and without three Nordost Sort Kones supporting it. The difference was audible to me, but it didn't blow me away.

I was discussing the Sort Kones with a friend of mine who is an electrical engineer and also a musical audio guy. It was his opinion that these items could certainly make an improvement in an analogue signal, but shouldn't do anything for a digital signal. He said that as long as the component receiving the digital signal can recognize a 1 or a 0, the signal is successful. It's a pass/fail situation and doesn't rely on levels of quality.

An example he gave me was that we think nothing of using a cheap CD-RW drive to duplicate a CD, with no worry about the quality being reduced. If the data isn't read in full, an error is reported, so we know that the entire signal has been sent.
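For what it's worth, this is easy to check yourself: hash the original rip and the copy and compare. Here's a minimal Python sketch (the file paths are hypothetical):

    import hashlib

    def file_digest(path):
        # SHA-256 of a file, read in 1 MB chunks.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # If the two digests match, the copy is bit-for-bit identical to the source.
    print(file_digest("original.wav") == file_digest("copy.wav"))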

I believe he said that it's possible to show that a more expensive digital cable measures better than a cheaper one, but the end product doesn't change.

There was a test done with HDMI cables of different prices. The only difference in picture quality was noted when a cable was defective and caused an obvious problem on the display.

I realize that most people use analogue signals, but for those of us who use a receiver for our D/A conversion, does the CD player's quality matter? Any thoughts?
mceljo
Jitter is not a problem with the "digital" part of digital (the robust part). Jitter is part of the analog problem with digital and can be regarded as a D-to-A problem (or, in the studio, an A-to-D problem). It is an analog timing problem whereby distortion can be introduced at the DAC/ADC stage because of drift in the clock. Accurately converting digital to analog, or analog to digital, requires an extremely accurate clock.
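To put rough numbers on that, here's a minimal Python sketch using the standard jitter-limited SNR approximation for a full-scale sine; the jitter values are just illustrative:

    import math

    def jitter_limited_snr_db(signal_freq_hz, rms_jitter_s):
        # Standard approximation: SNR ~ -20*log10(2*pi*f*tj) for a
        # full-scale sine at frequency f sampled with RMS clock jitter tj.
        return -20 * math.log10(2 * math.pi * signal_freq_hz * rms_jitter_s)

    print(jitter_limited_snr_db(10_000, 1e-9))     # 10 kHz tone, 1 ns jitter: ~84 dB
    print(jitter_limited_snr_db(10_000, 100e-12))  # same tone, 100 ps jitter: ~104 dB

In other words, the bits can be perfect and the clock can still limit the achievable resolution.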

I stand by my statement that you can copy a copy of a digital signal, and repeat the process with each subsequent copy thousands of times, with no degradation.

You cannot do this with any analog medium - within ten to twenty copies of a copy, the degradation becomes extremely audible (or visible, in the case of a VHS cassette)

The evidence is that digital signals are extremely robust compared to analog.
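A toy simulation makes the contrast vivid (Python; the analog loss model here is invented purely for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    signal = np.sin(2 * np.pi * 440 * np.linspace(0, 1, 44_100))

    # Digital: every generation is a bit-perfect duplicate.
    digital = signal.copy()
    for _ in range(1000):
        digital = digital.copy()

    # Analog (toy model): every generation loses a little level and gains noise.
    analog = signal.copy()
    for _ in range(20):
        analog = 0.98 * analog + rng.normal(0, 0.01, analog.size)

    print("digital error after 1000 copies:", np.max(np.abs(digital - signal)))  # 0.0
    print("analog error after 20 copies:  ", np.max(np.abs(analog - signal)))    # clearly audible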
Almarg - Here is a response from my EE friend with whom I've been discussing this topic at work.

"One of the most important factors discussed is "the value of the logic threshold for the digital receiver chip at the input of the dac" which, and this is important, supersedes ALL OTHERS in properly designed electronic equipment. If it didn't, the computer you are typing on would not work, the key-strokes would get lost, data you receive over the internet would be incomplete, pixels would be missing from the image in your video screen--ALL of which operate at WAY higher frequencies than any CD audio signal. Compared to modern computers, digital audio is simply rudimentary. If the audio equipment cannot transmit or identify logic signals that are above the background noise (all other elements discussed fall into this category) than the equipment in question is simply junk. I could, in the digital electronics lab at school, design and build a digital data transmission device and associated data receiver that would operate at 1MHz (far above any audio signal, but low frequency for digital electronics) and not lose a single bit of data.

Again, everything mentioned is real and true, but IS NOT A FACTOR in properly designed and built equipment. It is FAR more applicable to things like cell phone and computer design, and if the electronics industry were unable to overcome all the factors discussed in mere audio equipment, then a working cell phone and 3GHz processor would simply be pipe dreams.

As far as the SPDIF issue addressed in the linked article is concerned, it too is correct, but not a factor in your system. If you think it might be, switch to an optical cable or HDMI and see if you can hear a difference. I bet not. The information getting to the DAC in your amplifier will be bit for bit identical. If not, you have broken equipment."
Mceljo, with all due respect your friend seems to have missed my point.

My point was NOT that bit errors would occur in the link between transport and dac, due to logic threshold problems or due to any other reason. I would expect that any such interface that is not defective, and that is Walmart quality or better, will provide 100% accuracy in conveying the 1's and 0's from one component to the other.

My point in mentioning the logic threshold of the receiver chip was that variations in its exact value, within normally expectable tolerances, may affect whether or not the receiver chip responds to reflection-induced distortion that may be present on the edges of the incoming signal waveform. (By "edges" I mean the transitions from the 0 state to the 1 state, and from the 1 state to the 0 state). And thereby affect the TIMING of the conversion of each sample to analog.
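To make the mechanism concrete, here is a back-of-the-envelope sketch in Python; the rise time, signal swing, and distortion amplitude are assumed values for illustration, not measurements of any particular gear:

    # An edge with a finite rise time crosses the logic threshold at a time
    # that depends on the voltage there, so any distortion riding on the edge
    # (or any shift in the threshold itself) becomes a timing shift.
    rise_time_s = 25e-9                          # assumed 10-90% rise time
    swing_v = 0.5                                # nominal S/PDIF swing, 0.5 V p-p
    slew_v_per_s = 0.8 * swing_v / rise_time_s   # approximate mid-edge slope

    distortion_v = 0.02                          # hypothetical 20 mV of reflection residue
    timing_shift_s = distortion_v / slew_v_per_s
    print(f"{timing_shift_s * 1e9:.2f} ns of edge-timing shift")  # ~1.25 ns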

Signal reflections caused by impedance mismatches, as I explained and as the article describes, will propagate from the dac input circuit back to the transport output, and then partially re-reflect back to the dac input, where whatever fraction of the re-reflection that is not reflected once again will sum together with the original waveform.

If the cable length is such that the round trip results in the re-reflection returning to the dac input when the original waveform is at or near the mid-point of a transition between 0 and 1 (or 1 and 0), the result will be increased jitter, since the receiver's logic threshold is likely to be somewhere around that mid-point.
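The round-trip arithmetic is simple (Python; the velocity factor is a typical coax value and the cable lengths are arbitrary examples):

    C = 3e8                   # speed of light, m/s
    VELOCITY_FACTOR = 0.66    # typical for coaxial cable; varies with dielectric

    def round_trip_ns(length_m):
        # Time for a reflection to travel dac -> transport -> dac.
        return 2 * length_m / (VELOCITY_FACTOR * C) * 1e9

    # S/PDIF at 44.1 kHz: 64 time slots per frame, 2 biphase cells per slot.
    cell_period_ns = 1e9 / (44_100 * 64 * 2)   # ~177 ns per cell
    for length_m in (0.5, 1.0, 1.5, 5.0):
        rt = round_trip_ns(length_m)
        print(f"{length_m} m cable: {rt:5.1f} ns round trip "
              f"({rt / cell_period_ns:.2f} cell periods)")

Whether a given round trip lands on an edge depends on how those few nanoseconds compare with the transition's rise time, which is exactly the cable-length dependence described above.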

Again, no one is claiming that bits are not received by the dac with 100% accuracy. The claim is that the TIMING of the conversion of each sample to analog will randomly fluctuate. The degree of that fluctuation will be small, and will be a function of the many factors I mentioned (and no doubt others as well), but there seems to be wide acceptance across both the objectivist and the subjectivist constituents of the audiophile spectrum that jitter effects can be audibly significant.

If your friend disagrees with that, he should keep in mind two key facts, which he may not realize:

1) The S/PDIF and AES/EBU interfaces we are discussing convey both clock and data together, multiplexed (i.e., combined) into a single signal (see the sketch below).

2) The timing of each of the 44,100 conversions that are performed each second by the dac is determined by the clock that is extracted from that interface signal.
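To illustrate point 1, here is a minimal sketch of biphase-mark coding, the line code S/PDIF uses; this is illustrative Python, not a full S/PDIF framer:

    def biphase_mark(bits, level=0):
        # Every bit cell begins with a transition (this is what the receiver's
        # clock-recovery circuit locks onto); a 1-bit adds a second transition
        # mid-cell. Clock and data therefore travel as a single waveform.
        cells = []
        for bit in bits:
            level ^= 1            # guaranteed cell-start transition (clock)
            cells.append(level)
            if bit:
                level ^= 1        # extra mid-cell transition encodes a 1 (data)
            cells.append(level)
        return cells

    print(biphase_mark([1, 0, 1, 1, 0]))
    # -> [1, 0, 1, 1, 0, 1, 0, 1, 0, 0]

Any timing error in those recovered transitions is precisely the jitter being discussed.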

Best regards,
-- Al
I believe he said that it's possible to show that a more expensive digital cable measures better than a cheaper one, but the end product doesn't change.

There was a test done with HDMI cables of different prices. The only difference in picture quality was noted when a cable was defective and caused an obvious problem on the display.
Mceljo 7-19-10

Disagree.... A digital cable can and will make a difference.
http://www.tnt-audio.com/accessories/digitalcables_e.html

Not all transports sound alike. An example.....
http://www.stereophile.com/features/368/index8.html#
Entire text:
http://www.stereophile.com/features/368/index.html