Re the original post:
If you want the answer, this is it. I run an LP mastering operation so I have seen this first hand. When you are cutting an LP from a digital source, it should come as no surprise whatsoever that the digital source is the master tape or master file.
That file will thus play back with less bit loss; anyone who has issued a CD from a master tape knows that the biggest degradation occurs between the master tape and the final duplicated CD. Yes, I know, it's not supposed to happen that way, but it most certainly does.
OTOH, when you are cutting from the master file, these days it's very common for that file to be 24-bit and at a higher sample (scan) rate. When you play the LP back, you can actually have less distortion than you get on CD playback. While it is true that an LP can and usually does have more THD, it is also true that it has far less IM distortion. Of the two, the ear really does not like IM!
Where does the IM distortion on a CD come from? It is a product of intermodulation (inharmonic) with the scan (sampling) frequency. It's not a distortion listed when you see digital specs, but it should be, as it is the elephant in the room when it comes to problems in the digital recording/playback chain. The ear treats this distortion as brightness, BTW. That is why a CD can measure perfectly flat yet sound bright.
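To make the mechanism concrete (my sketch, not the poster's measurement): when content interacts with the sampling frequency, the products fold down to frequencies that are not harmonically related to the source, which is why they register as inharmonic rather than as ordinary overtones. A minimal Python/NumPy illustration, with an assumed 25 kHz tone sampled at the CD rate with no anti-alias filter:

```python
import numpy as np

fs = 44100          # CD sample rate (Hz)
f_in = 25000        # illustrative ultrasonic component, above Nyquist (22050 Hz)
n = 1 << 14         # number of samples

# Sample the tone with no anti-alias filtering: it folds down ("aliases").
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f_in * t)

# Find the strongest spectral peak in the sampled signal.
spectrum = np.abs(np.fft.rfft(x * np.hanning(n)))
freqs = np.fft.rfftfreq(n, 1 / fs)
peak = freqs[np.argmax(spectrum)]

print(round(peak))  # lands near fs - f_in = 19100 Hz, inharmonic w.r.t. 25 kHz
```

The 19.1 kHz product is not a harmonic of anything in the source, which is the kind of inharmonic byproduct the paragraph above is pointing at.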
When the industry made the transition to digital, the fact that the ear behaves this way was not clearly understood. In fact, if you are reading this you now have a leg up on a lot of audio engineers, as this phenomenon is still not well understood 30 years on. I think the industry does not like to talk about it...
Anyway, that is why the LP often sounds better than the CD even when they share the same master. Of course YMMV, as setup of the analog reproducer is paramount!