Kthomas, I agree with you about cheap and effective. But not perfect. Jitter is introduced at every stage of the digital-to-analog process. For digital-to-digital, I was not referring to the error encoding of the original being changed. I'm actually talking about the clock in the burner that governs the placement of each copied bit on the CD-R media. When it is not dead-on perfect (parts per billion or better), the decoded signal on playback has phase errors in the analog output. Ideally, even an imperfect clock that repeated its errors reliably, used both to burn the CD-R and to play it back, would produce no phase error in the analog output. But you don't get that.
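To put a rough number on it, here's a back-of-envelope sketch (my own illustration, not from any spec): a clock that runs off by some parts-per-billion accumulates a timing drift, and that drift shows up as a phase error proportional to the signal frequency. The function name and figures here are made up for illustration; real jitter is the random, non-repeating part of the timing error, which this constant-offset model only approximates.

```python
import math

def phase_error_degrees(freq_hz, clock_error_ppb, elapsed_s):
    # Timing drift accumulated by a clock running off by `clock_error_ppb`
    # parts per billion over `elapsed_s` seconds of playback.
    dt = elapsed_s * clock_error_ppb * 1e-9
    # A timing offset dt on a tone of frequency f is a phase shift of
    # 360 * f * dt degrees.
    return 360.0 * freq_hz * dt

# A 1 kHz tone, clock off by 100 ppb, after 60 s of playback:
print(phase_error_degrees(1000, 100, 60))  # about 2.16 degrees
```

The point of the sketch: even tiny fractional clock errors translate into measurable phase shifts once you multiply by frequency and elapsed time, which is why the clock has to be dead-on at the parts-per-billion level.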
Also, one other issue: no player reads every bit perfectly on the original pass. That's why error encoding is built into the process. So to make a perfect copy you have to read every bit correctly (roughly 700 MB worth) and then place them back on the CD-R without a single one out of place. If the CD-R has a flaw in its manufacture, or the laser varies in intensity or position, you get errors.
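For anyone curious how that error encoding works in principle: CDs actually use CIRC (cross-interleaved Reed-Solomon coding), which is too long to show here, so this is a toy Hamming(7,4) code instead, just to demonstrate the idea that extra parity bits let the player detect *and correct* a flipped bit on read. Everything below is my own illustrative sketch, not the actual CD format.

```python
def hamming74_encode(d):
    # d: 4 data bits -> 7-bit codeword with 3 parity bits (positions 1, 2, 4)
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def hamming74_decode(c):
    # Recompute the parity checks; the syndrome is the 1-based position
    # of any single flipped bit (0 means no error detected).
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1  # correct the single-bit error in place
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
word = hamming74_encode(data)
word[4] ^= 1                      # simulate a misread bit
print(hamming74_decode(word))     # recovers [1, 0, 1, 1]
```

That's the principle: the reader doesn't need every raw bit to come off the disc perfectly, because the redundancy lets it reconstruct the original data, up to some error budget. Past that budget (a bad flaw, a weak laser), correction fails and you get interpolation or audible errors.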