I found this in a search. It may help explain what is going on:
"CD-ROMS and CD-R discs are encoded with Cross-Interleaved Reed-Solomon Code (CIRC). This code uses interleaving to distribute errors and parity to correct them. With a bit rate of over 4.3 million bits per second, the need for robust error correction is obvious. The error rates in the low-level decoding strategy are monitored over two levels -- referred to by most hardware manufacturers as C1 and C2. A third level of "Extended" Error Correction (ECD/ECC) is used in many (but not all) CD-ROM formats.
A disc's "Block Error Rate" (BLER) is the sum of corrections and passes made in the C1 decoder. The C1 decoder is designed to locate and correct up to two bytes of information on a CD block. If more than two bytes are detected, the entire block is passed to the de-interleaving stage and the C2 decoder.
In most cases, only a small amount of BLER represents uncorrectable blocks. The Red Book allows for a raw error rate in the C1 decoder of up to 3 percent of the possible blocks in errors per second over a ten-second range.
BRST (Burst Error) is a localized group of missing data, perhaps caused by a speck of dust or a scratch -- a burst of errors in one spot. It is the same data as that tested for BLER, but unscrambled (de-interleaved) before it is checked. Interleaving is aimed at correcting BRST. It is easier to correct one bit out of 10 bytes than 10 bits out of one 16-bit word, which is why the data is encoded or interleaved across an entire block.
Often referred to as E32 or E42 errors, a disc's uncorrectable error count represents the number of blocks that could not be corrected by the de-interleaving and C2 decoding stage. The block errors corrected or passed through the C2 decoder by and large tend to represent non-random or physical flaws, which cause the most concern in CD-R testing. [These are the "C2" errors.] While CD-R discs frequently have lower BLER rates than pressed discs, they far exceed their replicated brethren in E32s and other uncorrectables, since by definition the Red Book specification does not allow any errors to pass the C2 decoder."
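To make the interleaving idea in that quote more concrete, here is a minimal Python sketch. It is not the real CIRC interleave schedule (which uses delay lines and 32/28-byte frames); it is just a toy column interleave with made-up frame sizes, showing how a contiguous burst on the disc turns into at most one damaged symbol per frame after de-interleaving, which a per-frame decoder can then handle:

```python
# Toy illustration of why interleaving helps with burst errors.
# Not the real CIRC schedule -- frame count and size are made up.

FRAMES = 8             # hypothetical number of frames in the interleave span
SYMBOLS_PER_FRAME = 6  # hypothetical frame size in symbols

# Build frames of known symbols: frame f holds symbols f*10 .. f*10+5.
frames = [[f * 10 + s for s in range(SYMBOLS_PER_FRAME)] for f in range(FRAMES)]

# Column-wise interleave: write symbol 0 of every frame, then symbol 1
# of every frame, and so on -- this is the order "on the disc".
on_disc = [frames[f][s] for s in range(SYMBOLS_PER_FRAME) for f in range(FRAMES)]

# A scratch wipes out a contiguous burst of symbols on the disc.
BURST_START, BURST_LEN = 10, 7
damaged = list(on_disc)
for i in range(BURST_START, BURST_START + BURST_LEN):
    damaged[i] = None   # None marks an unreadable symbol (an erasure)

# De-interleave back into frames and count damaged symbols per frame.
recovered = [[None] * SYMBOLS_PER_FRAME for _ in range(FRAMES)]
for s in range(SYMBOLS_PER_FRAME):
    for f in range(FRAMES):
        recovered[f][s] = damaged[s * FRAMES + f]

for f, frame in enumerate(recovered):
    bad = sum(sym is None for sym in frame)
    print(f"frame {f}: {bad} damaged symbol(s)")
# The 7-symbol burst lands as at most 1 damaged symbol in any one frame,
# well within what a per-frame Reed-Solomon decoder can correct.
```

The real C1/C2 stages are more elaborate, but the principle is the same: spread the burst across many codewords so no single codeword has to absorb it.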
Another quote from a CD mastering source:
"C1 Errors.
C1 Errors refer to the block error rate (BLER), which consists of bit errors at the lowest level. C1 errors are always expressed in errors per second. All CDs and CDRs contain C1 errors. They are a normal result of the write process. However, the maximum C1 error rate for a quality recording is an average of 220 errors per second based on 10 second samples.
C2 Errors.
C2 Errors refer to bytes in a frame (24 bytes per frame, 98 frames per block) and are an indication of a CD player's attempt to use error correction to recover lost data. C2 errors can be serious. A CD player may correct them, then again, it may not.
C2 errors are usually an indication of poor media quality, or a CD writer's failure to produce a quality burn (see conclusion).
CU Errors.
CU Errors refer to uncorrectable errors that are present after C2 error correction. No CU errors are allowed in a recorded disc. Generally, discs with CU errors cannot be played at all because they contain data that cannot be recovered.
Conclusion.
CD replicators consider a disc with an average of 220 C1 errors per second, "a good quality disc." Typically, our masters average less than 1 C1 error per second with absolutely no C2 or CU errors. We have our own standard which states that in addition to no C2 or CU errors, we will not ship any disc that averages more than 2 C1 errors per second."
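As a sanity check on the numbers in both quotes: the 220-per-second ceiling falls straight out of the Red Book frame rate. At 1x a player decodes 75 sectors per second, each containing 98 CIRC frames, so the C1 decoder sees 7350 frames per second, and roughly 3 percent of that is about 220. A small sketch of the arithmetic:

```python
# Where the "220 C1 errors per second" figure comes from.
SECTORS_PER_SECOND = 75    # Red Book 1x playback rate
FRAMES_PER_SECTOR = 98     # CIRC frames per sector

frames_per_second = SECTORS_PER_SECOND * FRAMES_PER_SECTOR
bler_ceiling = 0.03 * frames_per_second   # ~3% allowance from the first quote

print(frames_per_second)    # 7350 C1 frames decoded every second
print(round(bler_ceiling))  # ~220 -- the quoted maximum average BLER
```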
So, if you have a disc with thousands of C1 and C2 errors, the chances are pretty good that the errors will be audible.
If you have one with only 20 or so, chances are the sound will be mostly unaffected by the error correction in the CD player.
I am using the CD-ROM scanner to selectively weed out CDs that may need replacement, or ones that I would not use as a ripping source for archiving.
One could use one's ears alone for this, but I prefer some quantitative data or measurements to support the process.
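For what it's worth, here is a rough sketch of that weeding step in Python, assuming the scanner can export per-second C1/C2/CU counts. The function and data layout are made up for illustration, not taken from any particular scanning tool:

```python
def grade_disc(c1_per_second, c2_per_second, cu_per_second):
    """Return a coarse verdict from per-second error counts (hypothetical format)."""
    if not c1_per_second:
        return "no data"
    if any(cu_per_second):
        return "uncorrectable (CU/E32) errors -- replace; do not use as a rip source"
    if any(c2_per_second):
        return "C2 errors present -- may or may not be fully corrected; rip with caution"
    # Red Book-style check: average C1 (BLER) over 10-second windows <= 220/s.
    worst = 0.0
    for start in range(0, max(1, len(c1_per_second) - 9)):
        window = c1_per_second[start:start + 10]
        worst = max(worst, sum(window) / len(window))
    if worst > 220:
        return f"worst 10 s C1 average is {worst:.0f}/s -- out of spec, consider replacing"
    return f"OK -- worst 10 s C1 average {worst:.0f}/s, no C2 or CU errors"

# Example with made-up per-second counts from a hypothetical scan log.
c1 = [3, 5, 2, 4, 1, 0, 2, 3, 150, 240, 180, 6, 4]
c2 = [0] * len(c1)
cu = [0] * len(c1)
print(grade_disc(c1, c2, cu))   # OK -- worst 10 s C1 average 59/s, no C2 or CU errors
```

The thresholds simply mirror the quotes above: any CU error is a dealbreaker, any C2 makes the disc suspect, and the C1 average over 10-second windows is compared against the 220-per-second spec limit (you could tighten that toward the mastering house's 2-per-second figure for archival rips).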
"CD-ROMS and CD-R discs are encoded with Cross-Interleaved Reed-Solomon Code (CIRC). This code uses interleaving to distribute errors and parity to correct them. With a bit rate of over 4.3 million bits per second, the need for robust error correction is obvious. The error rates in the low-level decoding strategy are monitored over two levels -- referred to by most hardware manufacturers as C1 and C2. A third level of "Extended" Error Correction (ECD/ECC) is used in many (but not all) CD-ROM formats.
A disc's "Block Error Rate" (BLER) is the sum of corrections and passes made in the C1 decoder. The C1 decoder is designed to locate and correct up to two bytes of information on a CD block. If more than two bytes are detected, the entire block is passed to the de-interleaving stage and the C2 decoder.
In most cases, only a small amount of BLER represents uncorrectable blocks. The Red Book allows for a raw error rate in the C1 decoder of up to 3 percent of the possible blocks in errors per second over a ten-second range.
BRST (Burst Error) is a localized group of missing data, perhaps caused by a speck of dust or a scratch -- a burst of errors in one spot. It is the same data as that tested for BLER, but unscrambled (de-interleaved) before it is checked. Interleaving is aimed at correcting BRST. It is easier to correct one bit out of 10 bytes than 10 bits out of one 16-bit word, which is why the data is encoded or interleaved across an entire block.
Often referred to as E32 or E42 errors, a disc's uncorrectable error count represents the number of blocks that could not be corrected by the de-interleaving and C2 decoding stage. The block errors corrected or passed through the C2 decoder by and large tend to represent non-random or physical flaws, which cause the most concern in CD-R testing.[These are the "C2" errors] While CD-R discs frequently have lower BLER rates than pressed discs, they far exceed their replicated brethren in E32s and other uncorrectables, since by definition the Red Book specification does not allow any errors to pass the C2 decoder."
Another quote from a CD mastering source:
"C1 Errors.
C1 Errors refer to the block error rate (BLER), which consists of bit errors at the lowest level. C1 errors are always expressed
in errors per second. All CDs and CDRs contain C1 errors. They are a normal result of the write process. However, the maximum C1 error rate for a quality recording is an average of 220 errors per second based on 10 second samples.
C2 Errors.
C2 Errors refer to bytes in a frame (24 bytes per frame, 98 frames per block) and is an indication of a CD player's attempt to use error correction to recover lost data. C2 errors can be serious. A CD player may correct them, then again, it may not.
C2 errors are usually an indication of poor media quality, or a CD writer's failure to produce a quality burn (see conclusion).
CU Errors.
CU Errors refer to uncorrectable errors that are present after C2 error correction. No CU errors are allowed in a recorded disc. Generally, discs with CU errors cannot be played at all because they contain data that cannot be recovered.
Conclusion.
CD replicators consider a disc with an average of 220 C1 errors per second, "a good quality disc." Typically, our masters average less than 1 C1 error per second with absolutely no C2 or CU errors. We have our own standard which states that in addition to no C2 or CU errors, we will not ship any disc that averages more than 2 C1 errors per second."
So, if you have a disc with thousands of C1 and C2 errors, the chances are pretty good that the results will be audible.
If you have one with 20 or so, chances are it will sound will be mostly uneffected by error correction in the CD player.
I am using the CDROM scanner to selectively weed out CDs that may need replacement, or ones that I would not use as a ripping source for archiving.
On could use one's ears solely for this, but I prefer some quntatative data or measurements to support the process.