It isn't the bits, it's the hardware


I have been completely vindicated!

Well, at least there is an AES paper that leaves the door open to my observations. As some of you who follow me know (and some of you follow me far too closely), I’ve said for a while that the performance of DACs over the last ~15 years has gotten remarkably better. Specifically, Redbook (CD) playback is a lot better than it was in the past, so much so that high-resolution music and playback no longer make the economic sense they used to.

My belief about why high-resolution music sounded better has changed completely. I used to believe we needed the data. Over the past couple of decades my thinking has been radically and forever altered: now I believe WE don’t need the data, the DACs needed it. That is, the problem was never that we needed 30 kHz performance; the problem was always that the DAC chips themselves performed differently at different resolutions. Here is at least some evidence supporting this possibility.

Stereophile published a link to a meta-analysis of high-resolution playback. While the authors propose a number of issues and solutions, two things stood out to me: the section on hardware improvement, and the new filters (which is, in my mind, the same topic):



4.2
The question of whether hardware performance factors, possibly unidentified, as a function of sample rate selectively contribute to greater transparency at higher resolutions cannot be entirely eliminated.

Numerous advances of the last 15 years in the design of hardware and processing improve quality at all resolutions. A few, of many, examples: improvements to the modulators used in data conversion affecting timing jitter, bit depths (for headroom), dither availability, noise shaping and noise floors; improved asynchronous sample rate conversion (which involves separate clocks and conversion of rates that are not integer multiples); and improved digital interfaces and networks that isolate computer noise from sensitive DAC clocks, enabling better workstation monitoring as well as computer-based players. Converters currently list dynamic ranges up to ∼122 dB (A/D) and 126–130 dB (D/A), which can benefit 24b signals.
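The dynamic-range figures in that excerpt can be sanity-checked against the textbook rule of thumb for an ideal N-bit quantizer, DR ≈ 6.02·N + 1.76 dB (a theoretical ceiling; real converters fall short of it, which is exactly the point of the quoted numbers):

```python
def ideal_dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of an ideal N-bit quantizer (full-scale sine)."""
    return 6.02 * bits + 1.76

# 16-bit (Redbook) tops out near 98 dB; 24-bit near 146 dB.
# The 126-130 dB D/A figures quoted above thus still leave real
# hardware well short of the 24-bit theoretical ceiling.
for bits in (16, 24):
    print(f"{bits}-bit: ~{ideal_dynamic_range_db(bits):.1f} dB")
```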

Now if I hear "DAC X performs so much better with 192/24 signals!" I don't get excited. I think the DAC is flawed.
erik_squires
“There are no uncorrectable errors.” That’s precisely what the industry said ever since day one. Hel-loo! “Perfect Sound Forever.” The Reed Solomon codes and the laser servo feedback mechanism were supposed to take care of any errors. What a joke. Buffering doesn’t stop all laser reading errors, by the way, only certain specific ones. If it did portable Walkman CD players would be perfect. Buffering only puts off the inevitable for a few seconds.
Buffering doesn't stop any errors; it provides the mechanism to eliminate all jitter.
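To make that concrete, here is a toy model (my own illustrative numbers, not any particular player's design) of why a buffer plus a stable local clock makes incoming timing jitter irrelevant: samples arrive at irregular times, but as long as the buffer never underruns, the output times depend only on the local clock.

```python
import random

def reclock(arrival_times, n_samples, period, latency):
    """Toy model: jittery arrivals are buffered, then clocked out at a
    fixed period after an initial latency. Output timing depends only
    on the local clock, not on when the samples arrived."""
    out_times = []
    for i in range(n_samples):
        t_out = latency + i * period          # fixed local output clock
        if arrival_times[i] > t_out:          # sample not here yet: underrun
            raise RuntimeError("buffer underrun")
        out_times.append(t_out)
    return out_times

random.seed(0)
period = 1.0 / 44_100                          # one sample period at 44.1 kHz
# arrivals: nominal time plus up to 20% of a period of random jitter
arrivals = [i * period + random.uniform(0, 0.2 * period) for i in range(100)]
outs = reclock(arrivals, 100, period, latency=period)  # one period of buffering
# output intervals are perfectly uniform despite the jittery arrivals
intervals = {round(b - a, 12) for a, b in zip(outs, outs[1:])}
print(len(intervals))  # 1 -> a single, jitter-free output interval
```

The design point: more buffering (latency) buys tolerance for bigger input jitter, but the output interval never depends on it.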

No one ever said there were no uncorrectable errors, though on a properly manufactured disc, based on many industry tests, uncorrectable errors are quite rare. These tests are pretty easily recreated on any computer with a CD-ROM drive as well, so it is no "secret".

From a practical standpoint, if you treat your CDs the way the average audiophile does, uncorrected errors are not going to impact your listening experience. However, if you are just going to make stuff up, then I am not sure why you are participating in the conversation.
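For what it's worth, a key reason the CD's Reed-Solomon scheme (CIRC) handles scratches so well is cross-interleaving: a burst of consecutive bad bytes on disc gets scattered across many codewords, so each codeword sees few enough errors to correct. A minimal sketch, using illustrative block sizes rather than the actual CD format parameters:

```python
def interleave(data, depth):
    """Write row-by-row, read column-by-column (simple block interleaver)."""
    rows = [data[i:i + depth] for i in range(0, len(data), depth)]
    return bytes(rows[r][c] for c in range(depth) for r in range(len(rows)))

def deinterleave(data, depth):
    """Inverse of interleave: recover the original row-major order."""
    n_rows = len(data) // depth
    cols = [data[c * n_rows:(c + 1) * n_rows] for c in range(depth)]
    return bytes(cols[c][r] for r in range(n_rows) for c in range(depth))

DEPTH = 8     # bytes per codeword row (illustrative, not the CD spec)
N_ROWS = 16   # codewords per interleaver block (illustrative)

data = bytes(range(DEPTH * N_ROWS))
on_disc = interleave(data, DEPTH)

# simulate a scratch: a burst of 12 consecutive unreadable bytes on disc
burst_start, burst_len = 40, 12
bad_positions = set(range(burst_start, burst_start + burst_len))

# map each corrupted disc position back to its codeword (row) in the block
errors_per_row = [0] * N_ROWS
for p in bad_positions:
    col, row = divmod(p, N_ROWS)
    errors_per_row[row] += 1

print(max(errors_per_row))  # 1: no codeword sees more than one bad byte
```

A burst no longer than the interleaver depth lands at most one error in any codeword, which is well within what a real Reed-Solomon codeword can correct.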
No. I am using the proper definition for jitter as it applies to the output of the CD player, not what comes off the disc which is meaningless in a buffered and reclocked player, i.e. modern audiophile players. I have no idea what you are using.
So, modern CD players don’t use Reed Solomon codes or laser servo feedback? Reclocking and buffering takes care of everything?