I studied a formula for jitter and how it relates to human perception some years back. I'd have to go look it up, as I don't recall it exactly, but the limiting number is related to the number of bits and the sample rate: increasing the bit depth and/or the sample rate makes the jitter requirement more critical. The reason jitter is so audible is that it affects the zero crossings of the music, something to which the ear is especially sensitive. For the Red Book standard at 44.1 kHz and 16 bits, the limit is less than 50 picoseconds. Clearly, we have a ways to go to make jitter a nonissue.
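I can't reproduce the exact formula referred to above, but a commonly quoted slew-rate bound shows the same scaling: more bits and a higher signal frequency both tighten the allowable clock jitter. Here is a minimal sketch, with the one-LSB error criterion and the Nyquist-frequency worst case as my own assumptions; the exact constant, and hence whether you land nearer 50 ps or a couple of hundred, depends on the criterion chosen.

```python
import math

def jitter_limit_seconds(bits: int, f_signal_hz: float) -> float:
    """Slew-rate form of the bound: keep the error a timing offset causes on a
    full-scale sine below one LSB at the steepest point (the zero crossing).
        error ~= 2*pi*f*A*t_j  <  2A / 2**bits   =>   t_j < 1 / (pi * f * 2**bits)
    """
    return 1.0 / (math.pi * f_signal_hz * 2 ** bits)

# 16-bit audio, taking the worst case at Red Book's 22.05 kHz Nyquist frequency:
print(jitter_limit_seconds(16, 22_050))   # ~2.2e-10 s, a couple of hundred ps
# More bits and/or wider bandwidth tighten the requirement, as noted above:
print(jitter_limit_seconds(24, 48_000))   # ~4e-13 s, well under a picosecond
```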
I can tell you the reason we have jitter problems, besides the fact that the basic CD clocks are not all that accurate: the sample clock is embedded in the data stream. The clock does not travel on a separate signal path from the data, which makes jitter an inherent problem in the system. Whether this was known, or considered an issue, when the CD system was originally conceived is a good question.
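To make that concrete, here is a toy sketch of why an embedded clock is vulnerable. This is not actual S/PDIF receiver logic, and the frame structure and the 2 ns edge-noise figure are assumptions for illustration only; the point is simply that a receiver which derives its timing from data transitions inherits whatever timing noise those transitions carry, unless a PLL filters it out.

```python
import random
import statistics

# Toy model: the receiver has no separate clock wire, so it times itself from
# the transitions of the biphase-mark-coded data. Noise on those transition
# times is indistinguishable from clock movement.
CELL = 1.0 / (44_100 * 64)     # one bit cell of a 44.1 kHz, 64-bit-per-frame stream
EDGE_NOISE_RMS = 2e-9          # assume 2 ns RMS of noise on each detected edge

def biphase_mark_edges(bits, cell=CELL):
    """Ideal transition times: an edge at every cell boundary, plus an extra
    mid-cell edge for each '1' bit (the biphase-mark rule)."""
    t, edges = 0.0, [0.0]
    for b in bits:
        if b:
            edges.append(t + cell / 2)
        t += cell
        edges.append(t)
    return edges

bits = [random.getrandbits(1) for _ in range(20_000)]
ideal = biphase_mark_edges(bits)
noisy = [t + random.gauss(0.0, EDGE_NOISE_RMS) for t in ideal]

# How far each recovered edge-to-edge interval deviates from its ideal length:
deviation = [(nb - na) - (ib - ia)
             for na, nb, ia, ib in zip(noisy, noisy[1:], ideal, ideal[1:])]
print("recovered timing jitter (RMS):", statistics.pstdev(deviation))
# ~2.8 ns here: the edge noise goes straight through to the recovered timing.
```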
When Sony designed the CD, a number of weaknesses were built into the design because of the size limitations of the disc. Sony's president, whose name I have forgotten, wanted it to fit easily into a car stereo and also wanted Vivaldi's Four Seasons to fit on a single disc without flipping it over or inserting another one. That put a ceiling on the sample rate, not to our advantage, and on the number of bits, also not to our advantage, since all the music had to fit on a small platter. The original concept was to make the CD the same size as an LP, since the stores were already shelved and geared for that size.
To be fair, though, at the time the CD was designed, our technology and semiconductor processes were really pushed to produce a good-quality, low-distortion, inexpensive DAC at 16 bits and 44.1 kHz. I believe 18 bits and 50 kHz was about the limit, given the cost constraints. I sure wish we had that in a CD, though!
As for measuring jitter and tuning-fork accuracy, we have time-base standards that can easily resolve better than 1x10^-14 seconds, way beyond what a human can perceive. They are pricey, but they can do it. Gosh, the digital time-base standard I have on my bench, which I bought for RIAA measurements, resolves better than 1x10^-6 seconds and is still in calibration, and that was surplus at $50!
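For what it's worth, the arithmetic a time-interval counter does is simple enough to sketch. This is a hypothetical example with made-up numbers, just to show that measuring period jitter amounts to timestamping edges and looking at the spread of successive intervals.

```python
import random
import statistics

def period_jitter_rms(edge_times_s):
    """RMS period jitter: the spread of successive edge-to-edge intervals,
    which is roughly what a bench time-interval counter reports."""
    periods = [b - a for a, b in zip(edge_times_s, edge_times_s[1:])]
    return statistics.pstdev(periods)

# Hypothetical 44.1 kHz word clock whose edges wander by 3 ns RMS:
nominal, t, edges = 1 / 44_100, 0.0, []
for _ in range(10_000):
    t += nominal
    edges.append(t + random.gauss(0.0, 3e-9))

print(period_jitter_rms(edges))   # ~4.2e-9 s (sqrt(2) * 3 ns): easily resolved
                                  # by instruments far cheaper than a lab standard
```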