Do CD-Rs sound the same as originals?


Does a burned copy of a CD sound the same as the original?
soundwatts5b9e
Well, just to be sure we keep it all straight, nobody has suggested that a CDR could "better" the original CD (aluminum, gold or otherwise) under any conditions. Rather, the basic premise is that you can make a copy of a CD onto a CDR and have the CDR be a perfect bit-image of the original. It's also very easy to make a "copy" that isn't a perfect bit-image of the original. It appears that a lot of people making comparisons between original CDs and CDR copies aren't sure whether the CDR is the former or the latter. It seems safe to say that, regardless of how much difference somebody hears between the perfect bit-image copy and the original, the non-perfect bit-image will sound more different, undoubtedly for the worse. (A quick way to check which kind of copy you actually have is sketched after this post.)

I agree with you that beauty is in the ears of the beholder, and that's what makes this a great hobby. If you place significant value on bits stored on a particular kind of CD over identical bits on another, more power to you.

I do have to disagree on the comment about the Pioneer fellow - if he listens to my system and says he hears no difference, I'm gonna tell him he's full of it - my mother could hear a difference :-) Now, if he says the difference isn't worth the cost differential, I'm not going to try to change his mind about that.
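For anyone who wants to test the "perfect bit-image" claim above for themselves rather than argue about it, here is a minimal sketch (my own, not from any of the posts here) of how you might compare two rips on a computer. It assumes both the original and the CD-R have already been extracted to WAV files with whatever ripper you trust; the file names are made up for illustration.

```python
# Sketch: checking whether a CD-R copy is a bit-perfect match of the original.
# Assumes both discs were already extracted to WAV files; names are placeholders.
import hashlib

def audio_md5(path, chunk_size=1 << 20):
    """Hash a file's contents in chunks so large rips don't have to fit in RAM."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Note: strictly speaking only the PCM audio data matters; some rippers write
# slightly different WAV headers even when the audio samples are identical.
original = audio_md5("original_track01.wav")
copy = audio_md5("cdr_copy_track01.wav")
print("bit-identical rips" if original == copy else "rips differ")
```

If the hashes match, the two discs carry identical audio data, and any remaining audible difference would have to come from playback-side effects such as the jitter issues discussed further down the thread.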

Many respect the opinions of M. Fremer of Stereophile. In the Jan. 2001 issue, Fremer compares a "consumer grade" (list $600) JVC CD recorder and a more high-end Marantz ($1,600, I think). When copying CDs at 1X speed with either of these recorders, he could detect "no significant difference from the originals". But when recorded at 2X he noticed some "hardening of sound". For doing analog-to-digital recording, the Marantz was better, i.e., better A/D conversion. You need to read the review to get all the details, but it made me feel good about my Pioneer W739 ($600 list). The Pioneer W739 is very similar in price and features to the JVC, and as I've stated above, at 1X recording I can't tell CD-R copies from originals. Hope this isn't beating a "dead horse". Cheers. Craig
For those so inclined, I have copied a posting from Audio Asylum that I coincidentally ran into a couple of days before looking at this post. Some interesting information from audio maestro Jon Risch...

When the digital data from a CD is copied to a hard drive, instead of the data having to flow off of a CD-ROM drive (which all differ in their ability to cleanly extract the audio data, as they are optimized for computer file data and are not specifically geared toward the detection of, and lock to, the audio header blocks), it comes directly off of the hard drive, the data flow is much smoother and more consistent, and there is less chance of missing or erroneous data. It is demands on the power supply, and the resultant LIM, that cause the data to be recorded with any jitter in the first place, and data flow from a hard drive is also less demanding than data flow from a CD-ROM. See my repost below on jitter:

REPOST of Jitter reply:

A very common misconception about digital signal transmission with respect to audio is that if the signal does not get corrupted to the point of losing or changing the 1's and 0's, nothing else can go wrong. If the transmission system had been designed with cost no object, and by engineers familiar with all the known foibles and problems of digital transmission of audio signals, then this might be substantially true; no differences could rear their ugly head. Unfortunately, the systems we ended up with DO NOT remain unaffected by such things as jitter, where the transition from a 1 to a 0 is modulated with respect to time.

There are many ways that jitter can affect the final digital-to-analog conversion at the DAC. Jitter on the transmitted signal can bleed or feed through the input receiver and affect the DAC. How? Current drain on the power supplies due to the changing signal content, and the varying demands made on the power supply to the logic chips and the DAC. Modulate the power supply rails, and the DAC will convert at slightly different times. HOWEVER the power supply gets modulated, it will affect the DAC. One version of this has been popularly referred to as LIM, or Logic Induced Modulation, by the audiophile press. See: "Time Distortions Within Digital Audio Equipment Due to Integrated Circuit Logic Induced Modulation Products", AES Preprint Number 3105, Convention 91 (1991-10), Authors: Edmund Meitner & Robert Gendron.

Many of the logic chips in a digital audio system behave very poorly with respect to dumping garbage onto the rails and, even worse, onto the ground reference point. Even as I post, logic manufacturers such as TI are advertising the benefits of their latest generation of logic chips that reduce ground bounce. The circuitry itself generates its own interference, and this can be modulated by almost anything that also affects the power supply or ground.

Who cares what the power supply rails or the ground is doing? The DAC cares, because it is told to convert a digital signal value at a certain time. This time is determined by the master clocking oscillator and by when the DAC has determined that a transition from logical one to zero, or logical zero to one, has in fact occurred. The point at which the DAC decides this has occurred depends on the absolute value of the power supply rails near the moment of detection/conversion. The purity of the master oscillator signal is also affected by PS and ground variations, as well as sound vibrations and the activity of the various subsystems within the CD player/DAC box.
If this master oscillator signal is not perfectly pure and free from noise, phase jitter, and other artifacts, then even if the DAC were totally unaffected by PS perturbations (virtually impossible to accomplish), the master oscillator signal itself would cause jitter.

The amount of jitter that it takes to affect the analog output of the signal used to be thought of as fairly high, somewhere on the order of 500 to 1,000 pS. Now, the engineers on the cutting edge claim that in order for jitter to be inaudible and not affect the sound of the signal, it may have to be as low as 10 to 20 pS. That's for 16-bit digital audio. That's a very tiny amount of jitter, and well below what almost all current equipment is capable of.

Computer systems never convert the 1's and 0's to time-sensitive analog data; they only need to recover the 1's and 0's, and any timing accuracy only has to preserve the bits, not how accurately they arrive or are delivered. So in this regard, computer systems ARE completely different from digital audio systems. Look into digital audio more thoroughly, and realize that the implementations are not perfect or ideal, and are sensitive to outside influences. Just because they could have been and should have been done better or more nearly perfect does not mean they were! People are not hearing things; they are experiencing the result of products designed to a cost point, products that perform the way they do in the real world because of the design limitations imposed by the consumer-market price consciousness all the mid-fi companies live and die by.

Jitter read from a CD will affect how well the read servo stays locked, and how much the read servo makes irregular power supply demands. Just about everything and anything affects the power supply, so reduce the jitter read from the disc and it will affect the accuracy of the playback event.

With digital cables, there are three things that are paramount: proper impedance, proper cable termination, and wide bandwidth. It may be that a particular cable more nearly matches a system's actual impedance. The second factor, proper termination, includes, but is not limited to, the actual electrical termination inside the components, as well as the connector on the end of the cable. If the connector is NOT a perfect 75 ohm, 110 ohm, or whatever, it will cause minor reflections in the cable, which makes our old friend JITTER raise its ugly head again. The third factor, bandwidth, is only an issue because both the AES/EBU and the S/PDIF interface formats were designed before Sony/Philips knew all there was to know about digital problems, and they require PERFECT, unlimited-bandwidth cables in order for the transmission systems to be free of jitter. The more you limit the bandwidth, the more jitter. This is a known engineering fact, and an AES paper was given on this very subject not too long ago: "Is the AES/EBU/SPDIF Digital Audio Interface Flawed?", Preprint Number 3360, Authors: Chris Dunn & Malcolm O. J. Hawksford. The effective data rate of S/PDIF is about 3 MHz, and the design of the transmitters and receivers is abysmal. Maybe if everything else were done right, then cables, etc. wouldn't matter. But so much was done wrong, or cost-cut until it was screwed up, that they do come into the picture.

A good web source for info on jitter is located at: http://www.digido.com/jitteressay.html and a further web source is at: http://www.audioprecision.com/publications/jan96.htm#Digital Audio Transmission

Jon Risch
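To put some rough numbers on the jitter figures quoted above, here is a small back-of-the-envelope simulation (my own sketch, not part of Risch's post). It models jitter as a random timing error on each conversion instant of a full-scale 10 kHz tone and reports the resulting error in units of one 16-bit LSB; the tone frequency and the jitter values are just illustrative choices.

```python
# Rough sketch (my own, not from Risch's post): how much error does
# sampling-clock jitter add to a full-scale 10 kHz sine sampled at 44.1 kHz?
import math, random

FS = 44_100          # CD sample rate, Hz
FREQ = 10_000        # test tone, Hz (fast-moving signal, so near worst case)
LSB = 1.0 / 32768    # one 16-bit step for a +/-1.0 full-scale signal

def rms_error_for_jitter(jitter_rms_seconds, n=100_000):
    total = 0.0
    for i in range(n):
        t = i / FS
        ideal = math.sin(2 * math.pi * FREQ * t)
        # model jitter as a small Gaussian timing error on each conversion instant
        jittered = math.sin(2 * math.pi * FREQ * (t + random.gauss(0, jitter_rms_seconds)))
        total += (jittered - ideal) ** 2
    return math.sqrt(total / n)

for ps in (20, 500, 1000):
    err = rms_error_for_jitter(ps * 1e-12)
    print(f"{ps:5d} ps jitter -> RMS error ~ {err / LSB:.2f} LSB")
```

For a sine, the error scales roughly as 2*pi*f times the timing error, so jitter in the hundreds of picoseconds lands near one 16-bit LSB for this kind of tone, while tens of picoseconds stays well below it. That is at least consistent with the range of thresholds Risch says engineers are arguing over, without settling the audibility question either way.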
(From the above post:) "Jitter read from a CD will affect how well the read servo stays locked, and how much the read servo makes irregular power supply demands. Just about everything and anything affects the power supply, so reduce the jitter read from the disc and it will affect the accuracy of the playback event." (And likely the accuracy of the CD image file?)

I interpret this as also applying to making CD-R copies with a personal computer: the better the read software, the better the copy will be. Nero has a "jitter reduction" feature for when the CD to be copied gets read. The CD is read and re-read several times before the final CD image is created and stored on the hard drive. For me, it took nearly 4 hours to read a 74-minute CD. That said, the software costs $50, and I'd rather demo other free software before I buy, even though I doubt there is any better software than Nero, at least where accuracy is a concern. Adaptec is perfect for burning CDs from MP3 files.
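For what it's worth, the "read and re-read several times" behaviour described above is the same basic idea secure rippers use: pull the same sector more than once and only accept data that comes back consistent. Here is a minimal sketch of that idea; it is my own illustration, not Nero's actual algorithm, and read_sector is a hypothetical placeholder for whatever really gets raw audio frames off the drive.

```python
# Sketch of the general re-read-and-compare idea behind "jitter reduction" style
# ripping (my illustration; not Nero's actual algorithm).
from collections import Counter

def read_sector(drive, lba):
    """Placeholder: return the 2352 raw audio bytes at logical block 'lba'."""
    raise NotImplementedError("depends on your OS / drive-access library")

def secure_read(drive, lba, passes=5, required_agreement=3):
    """Re-read a sector until one result repeats often enough to be trusted."""
    seen = Counter()
    for _ in range(passes):
        data = read_sector(drive, lba)
        seen[data] += 1
        if seen[data] >= required_agreement:
            return data            # consistent result: accept it
    # no consensus: flag the sector so the rip can be retried or logged
    raise OSError(f"sector {lba}: reads never agreed after {passes} passes")
```

All that re-reading would also explain why an accuracy-first rip of a 74-minute disc can take hours rather than minutes: every disagreement between passes triggers still more reads.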
Centurymantra - your post focuses a lot on the transmission / reception of digital data to the DAC, and how the DAC processes it, etc. I'm not clear from reading it if you're suggesting that these problems are accentuated by using a CDR vs. the original CD, or you're just augmenting the general discussion of why people using different equipment might hear different results.

If you are suggesting that these other issues are affected by the use of a CDR vs. an original CD, to what do you attribute the effects? Something other than the CDR having different contents than the original CD?

I have found that RealJukebox Plus is perfect for creating bit-perfect images of audio CDs on a computer and it takes about 15 minutes plus 8 minutes for each additional copy I want.