Differences in CD ripping speeds: audio folklore?


I have often wondered why people claim that lower CD ripping speeds produce a higher-quality resulting WAV file. After all, wouldn't people avoid using CD-ROM drives that routinely produce errors? Computer data demands high accuracy, or else programs may not work correctly or data may be inaccurate. In addition, CDs are encoded with redundant data that allows the drive to automatically correct many errors, and to detect those it cannot correct. So why should reading an audio CD be any different?

So I conducted a test this morning. On one of my old machines, I used an older CD ripping program that let me choose the rip speed; I chose 1x. On my newer machine, I ripped the same disc with MusicMatch Jukebox, which averaged about 25x. I transferred both sets of files to my Unix machine and did a bitwise comparison on them. As expected, they are IDENTICAL.
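
For anyone who wants to repeat this, a bitwise comparison like the one above can be done with a few lines of Python. This is just a sketch; the filenames `rip_1x.wav` and `rip_25x.wav` are placeholders, and the example writes two identical stand-in files so it runs on its own.

```python
# Sketch of a bitwise comparison of two rips of the same CD track.
# rip_1x.wav / rip_25x.wav are placeholder names, not real rips.
import filecmp
import hashlib

# Create two identical stand-in files so the example is self-contained;
# in practice these would be the WAV files produced by each ripper.
data = b"RIFF....fake WAV payload...."
for name in ("rip_1x.wav", "rip_25x.wav"):
    with open(name, "wb") as f:
        f.write(data)

# Byte-for-byte comparison; shallow=False forces an actual content
# comparison rather than just comparing file metadata.
identical = filecmp.cmp("rip_1x.wav", "rip_25x.wav", shallow=False)
print("bit-identical:", identical)

# Checksums are handy when the two files live on different machines.
for name in ("rip_1x.wav", "rip_25x.wav"):
    with open(name, "rb") as f:
        print(name, hashlib.md5(f.read()).hexdigest())
```

If the two rips really are bit-identical, the checksums will match and `filecmp.cmp` will return `True`.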

So could the theory that lower CD ripping speeds sound better be yet another example of audio folklore?

Michael
I can't comment directly on whether the copies and the original are identical, because the usual way of assessing the original CD's data is to extract it onto a hard drive in the first place.

Since two different drives produced bit-identical results, I can say with high probability that the copies also match the original.

CDs have encoded ways of detecting and correcting data errors, and of detecting when the errors are too extensive to be corrected. So if there is considerable disc damage, shouldn't there be an error message?
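
The redundancy idea can be illustrated with a toy single-parity check. To be clear, this is only a sketch of the principle: real audio CDs use cross-interleaved Reed-Solomon coding, which is far stronger and can correct errors, not just detect them.

```python
# Toy illustration of redundancy-based error detection.
# A single parity bit can only detect an odd number of flipped bits;
# CDs use much stronger Reed-Solomon codes that also correct errors.

def add_parity(bits):
    """Append a parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def check_parity(codeword):
    """Return True if the codeword passes the even-parity check."""
    return sum(codeword) % 2 == 0

word = [1, 0, 1, 1, 0, 1, 0]
sent = add_parity(word)
print(check_parity(sent))        # True: no corruption

corrupted = sent.copy()
corrupted[2] ^= 1                # flip one bit, as disc damage might
print(check_parity(corrupted))   # False: the error is detected
```

A drive works the same way in spirit: as long as the damage stays within what the code can detect and correct, the data comes back clean; beyond that, the drive can at least know something went wrong.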

Michael
I've always been convinced that there is a lot of folklore around the handling of digital data, including the burning of CD-Rs at different speeds. There's plenty I quite possibly don't understand or know, but I've always been satisfied that I can prove I get bit-perfect copies, something you can verify with freeware or shareware programs.

I'm also convinced that if we can't handle digital music "perfectly", then the audio industry is dropping the ball. We can move much higher data rates essentially "perfectly" in any number of other applications, so if we can't do it in the audio environment, we're not trying hard enough.
From personal experience: many beat-up CDs don't seem to rip well at higher speeds. If I get errors, I will slow down the speed and usually it will go through.
If the original recording was really good, I can tell the difference between the original and the copy.
I never tried comparing copies made at different speeds, though.
Audphile1, you say that you can tell the difference between an original CD and a copy. How is this possible? After all, if copying a CD routinely introduced errors, we would not be using CDs for computer data. We'd still be using floppies or Zip disks or whatever, since high data accuracy is a must for computers.

My experience burning CDs for computers has been that it either works fine or it doesn't work at all (in which case the disc becomes a coaster). I've never had anything in between, with scattered data flaws.

Can someone explain this to me?

Michael