Hi RVPiano ...
I think I can help you here in a way few other A’goners can. :) I used to work in motion picture sound, and Dolby-A was the noise reduction system used for Dolby Surround. I’ve taken those cards apart, and reverse engineered other bits of Dolby gear from the purely analog era.
I wouldn’t call Dolby audiophiles at all, certainly not back then. They were into effects, and noise and distortion were not high on their agenda to prevent, let alone "microdynamics" or anything close to that. While I believe the idea behind Dolby-A was a good one, to compress low-level, high-frequency content so it would put less of a strain on tape, and later film, they always implemented their circuits with a huge number of parts and without particular care for anything beyond the main goal. They also weren’t quick to update their products with modern integrated parts, which would have given them lower-noise, lower-distortion results for less cash. Their product cycles lasted a good long time. Even using purely analog parts, Dolby-A would be absolutely trivial to implement today with a handful of ICs, but back then it took an electronics store’s worth of discrete parts.
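To make the encode/decode idea concrete, here's a deliberately simplified single-band 2:1 compander sketch in Python. This is NOT the real Dolby-A algorithm (which split the signal into four bands and only acted on low-level content); it just illustrates why boosting quiet material before tape, then applying the exact inverse on playback, pushes tape hiss down:

```python
# Illustrative 2:1 amplitude compander -- a simplification, not real Dolby-A.
import math

def encode(x):
    """Compress 2:1 in the dB domain: quiet samples are boosted
    well above the tape's noise floor before recording."""
    return math.copysign(math.sqrt(abs(x)), x)

def decode(y):
    """Expand: exact inverse of encode(), restoring original levels
    and pushing hiss added on tape back down with the quiet passages."""
    return math.copysign(y * y, y)

# A quiet sample survives a clean encode/decode round trip:
x = 0.01
assert abs(decode(encode(x)) - x) < 1e-12

# Hiss injected "on tape" (between encode and decode) is attenuated,
# because the expander turns gain down during quiet passages:
noise = 0.001                     # hiss the tape adds
plain = x + noise                 # no NR: hiss passes through at full level
nr = decode(encode(x) + noise)    # with NR: residual error is smaller
assert abs(nr - x) < abs(plain - x)
```

The catch, as noted above, is that the decode side must track the encode side exactly, which is why a sloppy analog implementation (drifting levels, poor supplies) audibly degrades the result even when the idea itself is sound.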
We’d have to get hold of those cards, of course, and run an encode/decode cycle to see exactly what was going on. But in general, based on the circuits I got to see and the performance I measured in the pro gear, I’d say the problem most likely wasn’t the idea of Dolby-A but the way Dolby implemented their circuits, going all the way back to the power supplies.
There were a number of poor choices in their film gear that left too much noise in and blurred too much of the detail that was in the tracks, so I can easily see how the same could have been true of the tape recording products.
While we can implement Dolby-A decoding digitally today and spare ourselves some of these problems, we can’t go back in time and undo the original encoding and whatever it did to the sound.
Best,
Erik