To the OP: Take it from another EE; your friend is correct. Everything that matters to signal quality occurs before and after digitization. You can't do anything about what happens before the signal is digitized, so if you want to improve quality, concentrate on what happens afterwards. Moving the digital signal around is not where to spend your effort.
Does the quality of a digital signal matter?
I recently heard a demonstration in which a CD player was played with and without being supported by three Nordost Sort Kones. The difference was audible to me, but it did not blow me away.
I was discussing the Sort Kones with a friend of mine who is an electrical engineer and also a musical audio guy. It was his opinion that these items could certainly improve an analogue signal, but shouldn't do anything for a digital signal. He said that as long as the component receiving the digital signal can recognize a 1 or a 0, the signal is successful. It's a pass/fail situation and doesn't depend on degrees of quality.
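His pass/fail point is easy to demonstrate in a few lines. Below is a minimal sketch (plain Python/NumPy; the noise level is an arbitrary illustration, not a measurement of any real transport) that corrupts ideal logic levels with analog noise and then thresholds them at the midpoint, the only question a digital receiver asks:

    import numpy as np

    # 10,000 random bits as ideal 0 V / 1 V logic levels, then analog noise on top.
    rng = np.random.default_rng(0)
    bits = rng.integers(0, 2, 10_000)
    noisy = bits.astype(float) + rng.normal(0, 0.08, bits.size)

    # The receiver asks only one question: above or below the midpoint?
    recovered = (noisy > 0.5).astype(int)
    print("bit-perfect:", bool((recovered == bits).all()))  # True at this noise level

Raise the noise until it crosses the decision margin and errors appear abruptly; there is no in-between "slightly degraded" bit.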
Another example he gave me: we think nothing of using a cheap CD-RW drive to duplicate a CD, with no worry about the quality being reduced. If the data can't be read in full, an error is reported, so we know the entire signal has been transferred.
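That verification step is easy to reproduce yourself: checksum the source data and the copy and compare. A minimal sketch (the file names are hypothetical):

    import hashlib

    def checksum(path, algo="sha256"):
        # Hash the file in chunks so a full CD image fits in memory.
        h = hashlib.new(algo)
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Identical digests mean the copy is bit-for-bit identical to the source.
    print(checksum("original_rip.bin") == checksum("burned_copy.bin"))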
I believe he said that it's possible to show that one digital cable measures better than another, but the end product doesn't change.
There was also a test of HDMI cables at different price points. The only difference in picture quality appeared when a cable was defective, and then the problem on the display was obvious.
I realize that most people use the analogue outputs, but for those of us who use a receiver for our D/A conversion, does the CD player's quality matter? Any thoughts?
It's not just 0s and 1s.....
Error checking and correction is very loose in CD players, since the disc has to be read in real time despite dust and scratches. There are programs like MAX for Mac that read a music CD as data, going back to the same sector multiple times until the right checksum is found. My CDP plays, and iTunes rips, CDs that MAX refuses to read or takes an extremely long time on.

Digital data coming out of a CDP is jittery; jitter is noise in the time domain. Jitter creates sidebands at a very low level (on the order of -60 dB and below), but they are audible because they are not harmonically related to the root frequency. With music, which contains many frequencies, this amounts to noise. It is difficult to detect because it is present only when the signal is present, so it manifests itself as a lack of clarity. Jitter can be suppressed by asynchronous upsampling DACs (like the Benchmark DAC1) or by reclocking devices.

Jitter also depends on the quality of the CDP's transport and power supply. A typical digital transition from a CDP is on the order of 25 ns, making it susceptible to noise (a slow crossing of the threshold). High-quality transports can switch many times faster, reducing noise coupling but creating problems with reflections at characteristic-impedance boundaries in the cable (and therefore requiring a better digital cable). Jitter in D/A playback can be suppressed, but jitter recorded during the A/D process stays forever; for some early A/D conversions the only option is to convert again, if the analog tapes still exist.
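The sideband/noise claim is straightforward to simulate. The sketch below (plain Python/NumPy; the 10 ns rms jitter figure is purely illustrative, not a claim about any particular transport) samples a roughly 10 kHz tone with and without random timing error and reports the total non-tone energy relative to the tone. With these numbers it lands near -64 dB, the same order of magnitude as the figure above:

    import numpy as np

    fs = 44_100                      # CD sample rate, Hz
    n = 1 << 16                      # FFT length
    k = round(10_000 * n / fs)       # whole number of cycles, so no window is needed
    f0 = k * fs / n                  # test tone, close to 10 kHz
    t = np.arange(n) / fs

    rng = np.random.default_rng(1)
    sigma_j = 10e-9                  # 10 ns rms sampling jitter (illustrative)
    clean = np.sin(2 * np.pi * f0 * t)
    jittered = np.sin(2 * np.pi * f0 * (t + rng.normal(0, sigma_j, n)))

    def noise_db(x):
        # Total energy in every bin except the tone, in dB relative to the tone.
        power = np.abs(np.fft.rfft(x)) ** 2
        tone = np.argmax(power)
        signal = power[tone]
        power[tone] = 0.0
        return 10 * np.log10(power.sum() / signal)

    print(f"clean tone: {noise_db(clean):7.1f} dB")    # numerical floor, far below audible
    print(f"jittered:   {noise_db(jittered):7.1f} dB") # about -64 dB with these numbers

Because that noise rides on the signal itself (it scales with the tone), it vanishes in silence, which matches the "present only when signal is present" observation.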
Yes, it matters. All the bits have to be retrieved and transmitted accurately. Then the bits comprising each sample have to be converted to the proper analog voltage by the DAC at precisely the right time. Variations in these two fundamental operations will affect sound quality to some extent. The good news is that the technology needed to do this reliably, within the tolerances needed to produce good results (at least with Redbook CD), is quite mature and not radically expensive. Different devices will still produce different results, however, and the differences are often audible.
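To put a number on "precisely the right time": a standard back-of-envelope model treats sampling jitter on a full-scale sine at frequency f as phase noise, giving SNR ≈ -20·log10(2π·f·σ_j). A couple of lines turn that into a clock-jitter budget; the 16-bit (-98 dB) target at 20 kHz is my assumption for illustration:

    import math

    # Keep jitter noise below the 16-bit quantization floor (~ -98 dB,
    # from 6.02*16 + 1.76) at the top of the audio band.
    snr_target_db = 98.0
    f_max = 20_000.0  # Hz

    # Invert SNR = -20*log10(2*pi*f*sigma_j) for the allowed rms jitter.
    sigma_j = 10 ** (-snr_target_db / 20) / (2 * math.pi * f_max)
    print(f"allowed rms jitter: {sigma_j * 1e12:.0f} ps")  # about 100 ps

Tolerances on that order are routine for competent modern clocks, which fits the point that the technology is mature and not radically expensive.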