Does the quality of a digital signal matter?


I recently heard a demonstration in which a CD player was played with and without three Nordost Sort Kones supporting it. The difference was audible to me, but it did not blow me away.

I was discussing the Sort Kones with a friend of mine who is an electrical engineer and also a musical audio guy. It was his opinion that these items could certainly make an improvement in an analogue signal, but shouldn't do anything for a digital signal. He said that as long as the component receiving the digital signal can recognize a 1 or a 0, the signal is successful. It's a pass/fail situation and doesn't rely on levels of quality.

An example that he gave me was that we think nothing of using a cheap CD-RW drive to duplicate a CD, with no worry about the quality being reduced. If the data can't be read in full, an error is reported, so we know that the entire signal has been sent.
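To make the pass/fail idea concrete: a ripped disc image can be checked against its copy bit for bit. A minimal Python sketch of that kind of verification (the file names here are hypothetical; any two files would do):

import hashlib

def sha256_of(path):
    # Hash the file in chunks so a large disc image fits in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Pass/fail: either every bit survived the copy, or it didn't.
if sha256_of("original.iso") == sha256_of("copy.iso"):
    print("bit-perfect duplicate")
else:
    print("copy differs from original")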

I believe he said that it's possible to show that one digital cable measures better than another, but the end product doesn't change.

There was a test done with HDMI cables at various price points. The only difference in picture quality appeared when a cable was defective and there was an obvious problem on the display.

I realize that most people use analogue connections, but for those of us who use a receiver for our D/A conversion, does the CD player's quality matter? Any thoughts?
mceljo
That's exactly the question. I would suspect that the vast majority of audiophiles use analogue connections to avoid ever having to make the transition into a digital format.

If a CD player is doing nothing more than reading the digital data (i.e., ones and zeros) on the disc and sending the information to another component for the D/A conversion, what could affect the signal?

Do you insist on having a $1,000 CD drive in your computer to ensure an accurately copied disc? How about burning a disc on an external drive over a USB cable: is there any risk of not getting a perfect duplicate of the original without an error being reported?
From How Stuff Works:

"In analog technology, a wave is recorded or used in its original form. So, for example, in an analog tape recorder, a signal is taken straight from the microphone and laid onto tape. The wave from the microphone is an analog wave, and therefore the wave on the tape is analog as well. That wave on the tape can be read, amplified and sent to a speaker to produce the sound.

In digital technology, the analog wave is sampled at some interval, and then turned into numbers that are stored in the digital device. On a CD, the sampling rate is 44,100 samples per second. So on a CD, there are 44,100 numbers stored per second of music. To hear the music, the numbers are turned into a voltage wave that approximates the original wave."

What this means is that until the numbers of the digital signal are converted back to an analog voltage wave via a D/A converter, the only thing that matters is that the numbers be transferred intact.
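As a rough illustration of what those stored numbers are, here is a sketch of the sampling step in Python (pure illustration; real analog-to-digital converters also filter and dither the signal):

import math

SAMPLE_RATE = 44_100      # CD sampling rate, samples per second
FREQ = 1_000.0            # a 1 kHz test tone
FULL_SCALE = 32_767       # largest 16-bit signed sample value

samples = []
for n in range(SAMPLE_RATE // 100):            # 10 ms of audio
    t = n / SAMPLE_RATE                        # time of this sample
    analog = math.sin(2 * math.pi * FREQ * t)  # the analog wave
    samples.append(round(analog * FULL_SCALE)) # the stored number

print(samples[:8])   # the first few of the 44,100 numbers per second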

Maybe this is why my CD player recommends using a digital connection to my receiver rather than analog. At my equipment level, preserving the analog signal can't match bypassing the component altogether and "shortening" the analog signal path.
This is something laypeople never seem to grasp. The entire advantage of digital is just that: signal quality matters much less than it does with analog.

This is a FACT.

The idea of representing information as 1's and 0's means that information can be stored and transmitted with no loss - something that is IMPOSSIBLE to achieve with analog.
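The reason is that discrete symbols can carry redundancy that lets errors be detected and corrected. A toy sketch of the principle in Python, using a simple 3x repetition code (real systems are far stronger; CDs use Reed-Solomon coding):

def encode(bits):
    # Send every bit three times.
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    out = []
    for i in range(0, len(coded), 3):
        group = coded[i:i + 3]
        out.append(1 if sum(group) >= 2 else 0)  # majority vote
    return out

message = [1, 0, 1, 1]
sent = encode(message)
sent[4] ^= 1                      # a read error flips one channel bit
print(decode(sent) == message)    # True: the error is corrected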

So I would say YES - signal quality is much less of a factor in digital than in analog.

In fact the biggest source of quality differences with digital audio is the conversion back to analog - that is where differences are audible, in the quality of the D-to-A converter.

You can copy a CD with a cheap drive 1000 times (a copy of a copy of a copy) and it will be the same; however, it will sound better through a dedicated high-quality DAC or a good-quality CD player.
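A toy simulation of those 1000 generations (the analog noise model here, a small random error added on each pass, is an illustrative assumption, not a measurement of any real tape deck):

import random

random.seed(0)

digital = [1, 0, 1, 1, 0, 0, 1, 0]        # bits on the "disc"
analog = [float(b) for b in digital]      # the same signal as voltages

for generation in range(1000):
    digital = list(digital)                               # bits copy exactly
    analog = [v + random.gauss(0, 0.01) for v in analog]  # each pass adds noise

print("digital after 1000 copies:", digital)              # unchanged
print("analog after 1000 copies: ", [round(v, 2) for v in analog])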
CDs are not stored as literal 0s and 1s. They are pits pressed into the metal layer; the laser measures their lengths, and the player converts those lengths into a digital bitstream. Then the data is run through a digital-to-analog converter.
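That reading step can still be described in bits, though. On a CD, each transition between pit and land is read as a 1 and each clock period of unchanged surface as a 0. A sketch of that decoding in Python (the run lengths are illustrative; the real EFM code constrains them to 3-11 clock periods):

# Alternating pit/land run lengths, in clock periods (illustrative values).
run_lengths = [3, 4, 3, 5, 3]

channel_bits = []
for run in run_lengths:
    channel_bits.append(1)                # the edge of a pit or land reads as 1
    channel_bits.extend([0] * (run - 1))  # flat surface in between reads as 0s

print(channel_bits)
# -> [1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0]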
If Shadorne's argument is correct, then why should the quality of the DAC or CD player matter? Even cheap ones seem to measure very well. "Signal quality much less of a factor than in analog"? No wonder my LPs sound so much better. Why do transports make such a difference? A friend of mine didn't believe they would until he heard different ones on his system. Why do the best CD playback systems cost so much unless they are susceptible to degradation, just as analog is?