Does the quality of a digital signal matter?


I recently heard a demonstration in which a CD player was played with and without the support of three Nordost Sort Kones. The difference was audible to me, but it did not blow me away.

I was discussing the Sort Kones with a friend of mine who is an electrical engineer and also a musical audio guy. It was his opinion that these items could certainly make an improvement in an analogue signal, but shouldn't do anything for a digital signal. He said that as long as the component receiving the digital signal can recognize a 1 or a 0, then the signal is successful. It's a pass/fail situation and doesn't rely on levels of quality.
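
His pass/fail point can be sketched in a few lines of Python (my own illustration, not anything from the demo; the noise level and threshold are made-up numbers): as long as the analog degradation stays comfortably below the decision threshold, the receiver recovers every bit exactly, so "cleaner" analog quality changes nothing in the recovered data.

```python
import random

random.seed(42)

# 1000 random bits, transmitted as nominal 0.0 V / 1.0 V levels.
bits = [random.randint(0, 1) for _ in range(1000)]

# Add small analog degradation (Gaussian noise, made-up amplitude).
noisy = [b + random.gauss(0, 0.08) for b in bits]

# Receiver: a simple decision against the midpoint threshold.
recovered = [1 if v > 0.5 else 0 for v in noisy]

print(recovered == bits)  # every bit survives the degraded analog levels
```

Of course, this only models the data values; it says nothing about *when* each bit arrives, which is where the jitter discussion later in the thread comes in.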

An example he gave me was that we think nothing of using a cheap CD-RW drive to duplicate a CD, with no worry about the quality being reduced. If the signal isn't read in full, an error is reported, so we know that the entire signal has been sent.

I believe he said that it's possible to show that a more expensive digital cable is better than another, but the end product doesn't change.

There was a test done with HDMI cables that tested cables of different prices. The only difference in picture quality was noted when a cable was defective and there was an obvious problem on the display.

I realize that most setups use analogue signals, but for those of us who use a receiver for our D/A conversion, does the CD player's quality matter? Any thoughts?
mceljo
This is another one of those issues/questions that comes up now and then (like double-blind testing, differences in cables, etc.), and gets talked about a lot for a while. The things that always seem true with these threads include: 1) very few people agree; and 2) people make fairly bold statements one way or the other (often without actual personal experience, e.g., having compared cables under *controlled* conditions).

If the question is "have you heard differences in the same system and same room, using transport A vs. transport B?", my answer is "yes... definitely". (If one wants to disagree or argue with what I experienced, that's a dead end I see no point in going down.) If you are asking "why?", "how big a difference?", "is it worth it?", etc., well, those are different questions.

p.s. While the question speaks of digital, the OP seems to forget (or not know?) that analog is involved in a CD player, at least one that is not using an external DAC.
p.s. I had another thought : )

It seems to me that, when it comes to audio, we clearly don't know everything yet !
It seems to me that the bit stream speed is independent of the bit content. If this is correct, then shouldn't the jitter be either constant or possibly a function of the disc itself (like radial position or burn/pressing quality)?

That was assumed when CD players were first invented. However, many things can affect the accuracy of the clock signal in the DAC. And even the bitstream is variable - error bursts and misreads may be cyclical, and perhaps only the digital "preamble" is fairly consistent - so the data may vary in certain repeating patterns.

Provided jitter is random, it is in general a negligible problem. However, when patterns occur - such as power supply oscillations due to cyclical laser servo movements to track the pits on the rotating disc - then we get non-random jitter. Another major cause of non-random jitter may be the Phase Locked Loop between the master and slave clock - in this case, the very act of trying to keep the slave clock in time with the master causes oscillatory patterns as the slave hunts back and forth trying to keep in time.

These repetitive patterns in clock timing errors cause new oscillatory audio signals to appear in the analog music coming out of the DAC - sometimes called sidebands - non-harmonically related signals. It is these very small (around -40 dB) but 'correlated' sounds that become audible - usually as hash or a lack of clarity in the upper midrange and HF (although this may significantly affect the perceived sound of percussive instruments with low frequencies - like piano or drums - due to the way we "hear").
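
The sideband effect described above can be simulated. Here is a rough sketch (my own toy model; the tone, jitter frequency, and 5 µs jitter amplitude are arbitrary and deliberately exaggerated): sampling a pure tone with a sinusoidal timing error creates non-harmonic components at f0 ± fj that are absent when the clock is perfect.

```python
import math, cmath

fs = 48000.0    # sample rate (Hz)
f0 = 1000.0     # test tone (Hz)
fj = 100.0      # jitter modulation frequency, e.g. a servo/PLL cycle (Hz)
A  = 5e-6       # 5 microseconds peak timing error (deliberately exaggerated)
N  = 4800       # 0.1 s of samples; f0, fj and f0+fj all land on exact bins

def tone(jitter_amp):
    # Sample a pure sine at instants perturbed by sinusoidal timing error.
    out = []
    for n in range(N):
        t = n / fs + jitter_amp * math.sin(2 * math.pi * fj * n / fs)
        out.append(math.sin(2 * math.pi * f0 * t))
    return out

def dft_mag(x, f):
    # Magnitude of the DFT of x evaluated at a single frequency f.
    s = sum(v * cmath.exp(-2j * math.pi * f * n / fs) for n, v in enumerate(x))
    return abs(s) / len(x)

clean, jittered = tone(0.0), tone(A)
for name, sig in (("clean", clean), ("jittered", jittered)):
    print(name, "level at f0 + fj:", round(dft_mag(sig, f0 + fj), 6))
```

With these made-up numbers the sideband sits roughly 36 dB below the carrier - the same ballpark as the -40 dB figure above - while the clean signal shows essentially nothing at f0 + fj.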

Anyway - jitter is an analog problem - it only appears upon conversion to analog or, up front, when converting analog to digital.

If you have a perfect clock then you will not have jitter.
DACs have evolved to have better clocks. Early designs like Meitner used patterns in the digital data called the "preamble" to try to achieve a more accurate clock. Others, like Lavry, used algorithms to maintain a very slow correction pattern on the slave clock that could be filtered out. Since about 2002 the problem has been substantially addressed by "asynchronous" DACs - basically, these DACs ignore the master clock altogether, and in these designs the jitter is totally determined by the clock quality in the DAC alone and nothing upstream of the DAC.
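The asynchronous idea can be caricatured in a few lines (my own toy model, not any vendor's actual design; all the numbers are invented): incoming samples arrive with heavy timing jitter, but a FIFO buffer read out on a stable local clock means the output timing depends only on that local clock.

```python
import random
from collections import deque

random.seed(1)
T = 1.0 / 48000                         # nominal sample period (48 kHz)

# Upstream transport: the data are correct, but arrival times are jittery.
samples = list(range(1000))
arrival_times = []
t = 0.0
for _ in samples:
    t += T + random.gauss(0, 0.1 * T)   # ~10% RMS timing error (invented)
    arrival_times.append(t)

fifo = deque(samples)                   # the buffer absorbs arrival jitter

# DAC side: read strictly on the stable local clock after a startup delay.
start = 0.002                           # 2 ms of buffered headroom
out_times = [start + k * T for k in range(len(fifo))]
out_samples = [fifo.popleft() for _ in out_times]

in_gaps = [b - a for a, b in zip(arrival_times, arrival_times[1:])]
out_gaps = [b - a for a, b in zip(out_times, out_times[1:])]
print("worst input timing error: ", max(abs(g - T) for g in in_gaps))
print("worst output timing error:", max(abs(g - T) for g in out_gaps))
```

The data come out in the same order they went in, but the output intervals are set entirely by the local clock; in a real design the remaining engineering problem is keeping the buffer from running empty or overflowing.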
One thing to note is that the information in one of the long and detailed articles linked in this discussion is 17 years old. My EE friend pointed out that a 2x CD player was a big deal 17 years ago. I would hope that many of the problems described have been reduced or solved by now. When it comes to electronics, 17 years is a very long time for technology to develop.

I picked up an SACD player from a friend today to borrow for a few days. We'll see how much difference there is once I get copies of a single album in both formats. It'll probably be Norah Jones, since I already have the CD and know it's a quality recording.
Apples and oranges......

I suggest you compare the two players just using the CDs you have now. You should be able to hear a difference between the two players.

Post back your findings.