The best system I’ve heard - by far - is one in which the Reed-Solomon error-correction subsection was disabled. Yes, I know what you’re thinking: is he out of his mind? And once you stabilize the CD, there is almost no need for the CD laser servo feedback system. Once you fix the underlying problems in the CD transport, there is no need for all the patchwork fixes. The original designers obviously knew they had some problems with CD playback; they just didn’t know what all of the problems were, or they ignored them. Do modern CD players just wish the scattered-light problem away? I’ve never heard anyone even address the issue. If you could hear what I’ve heard with my ears.
It isn't the bits, it's the hardware
I have been completely vindicated!
Well, at least there is an AES paper that leaves the door open to my observations. As some of you who follow me know (and some of you follow me far too closely), I’ve said for a while that the performance of DACs over the last ~15 years has gotten remarkably better. Specifically, Redbook (CD) playback is a lot better than it was in the past, so much so that high-resolution music and playback no longer make the economic sense they used to.
My belief about why high-resolution music sounded better has been completely overturned. I used to believe we needed the data. Over the past couple of decades my thinking has radically changed: now I believe WE don’t need the data, the DACs needed it. That is, the problem was never that we needed 30 kHz performance; the problem was that the DAC chips themselves performed differently at different resolutions. Here is at least some proof supporting this possibility.
Stereophile published a link to a meta-analysis of high-resolution playback, and while its authors propose a number of issues and solutions, two things stood out to me: the section on hardware improvements, and the new filters (which are, in my mind, the same topic):
Section 4.2:

"The question of whether hardware performance factors, possibly unidentified, as a function of sample rate selectively contribute to greater transparency at higher resolutions cannot be entirely eliminated.

Numerous advances of the last 15 years in the design of hardware and processing improve quality at all resolutions. A few, of many, examples: improvements to the modulators used in data conversion affecting timing jitter, bit depths (for headroom), dither availability, noise shaping, and noise floors; improved asynchronous sample rate conversion (which involves separate clocks and conversion of rates that are not integer multiples); and improved digital interfaces and networks that isolate computer noise from sensitive DAC clocks, enabling better workstation monitoring as well as computer-based players. Converters currently list dynamic ranges up to ~122 dB (A/D) and 126–130 dB (D/A), which can benefit 24b signals."
Now if I hear "DAC X performs so much better with 192/24 signals!" I don't get excited. I think the DAC is flawed.
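One item in the paper's list of hardware advances, dither, is easy to make concrete. Here is a minimal numpy sketch (the tone, sample rate, and word lengths are illustrative, not from the paper): requantizing a high-resolution signal to 16 bits with TPDF dither bounds the error and decorrelates it from the music, which is part of why well-engineered Redbook can sound so good.

```python
import numpy as np

rng = np.random.default_rng(0)

# A floating-point sine, standing in for a high-resolution master.
x = 0.5 * np.sin(2 * np.pi * np.arange(4096) * 1000 / 44100)

# Requantize to 16 bits with TPDF dither: adding triangular noise of
# +/- 1 LSB before rounding trades correlated quantization distortion
# for a benign, signal-independent noise floor.
lsb = 1.0 / 2**15
tpdf = (rng.uniform(-0.5, 0.5, x.size) + rng.uniform(-0.5, 0.5, x.size)) * lsb
x16 = np.round((x + tpdf) / lsb) * lsb

# The total error stays within ~1.5 LSB: 1 LSB of dither plus
# 0.5 LSB of rounding.
err = x16 - x
assert np.max(np.abs(err)) <= 1.5 * lsb
```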
erik_squires wrote:
"Given that the Mytek was better in all ways AND also had such a slim difference in performance, I concluded that maybe the problem was not the data, as we have so often thought, but how well the DACs behaved with Redbook."

cleeds replied:
That's certainly possible. There are some other variables - which some here have noted - including the synergy of the DACs and your subjective observation that the Mytek "was better in all ways."

erik_squires wrote:
"If upsampling works, at all, then it means the DAC does not perform equally at all resolutions. It has nothing to do with missing data."

cleeds replied:
It isn't clear how you've made the leap from "maybe the problem was not the data" to "It has nothing to do with missing data." That upsampling can be useful doesn't necessarily mean that more data won't improve results. It's risky to form an absolute conclusion from just a single test.
cleeds wrote:
"That upsampling can be useful doesn't necessarily mean that more data won't improve results."

erik_squires replied:
@cleeds But that's just it: with upsampling, you are not generating more data. There's no more clarity, resolution, or harmonics. There's not yet an AI that is listening to a trumpet and saying, "Oh, I know how a trumpet sounds at 384k, I can fill in those gaps." At best, upsampling is curve fitting. If we say that upsampling for a particular DAC is a significant improvement, then it's not the data contents, because those are largely the same; it is how well the DAC chip performs with more of it.
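The "curve fitting" point can be shown in a few lines. In this minimal numpy sketch (the tone and rates are made up for illustration), a 2x linear-interpolation upsampler computes every new sample purely from its neighbors, and discarding the interpolated points recovers the original exactly; no information was added.

```python
import numpy as np

# A short "Redbook-like" signal: a 1 kHz tone sampled at 44.1 kHz.
fs = 44100
t = np.arange(64) / fs
x = np.sin(2 * np.pi * 1000 * t)

# 2x upsampling by linear interpolation: each new sample is just a
# weighted average of the two existing samples around it.
t2 = np.arange(2 * len(x) - 1) / (2 * fs)
x2 = np.interp(t2, t, x)

# The original samples pass through unchanged, so dropping the
# interpolated points recovers the input exactly: no new data.
assert np.allclose(x2[::2], x)
```

A real DAC or player would use a long sinc-like filter instead of linear interpolation, but the principle is the same: the output is fully determined by the input samples.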