One thing I’ve noticed about the various characteristics that go by the term ‘warmth’ in the context of playback is that most or all of them seem like ADDITIONS to the signal. For example, ADDITIONAL low order harmonics, ADDITIONAL lower midrange/upper bass, ADDITIONAL ambience provided by the listening room, and so on.
Strictly speaking, any additions to the signal (other than gain) are deviations from accuracy. For that reason, I think many audiophiles, myself included, are tempted to eschew them. But lately I've been having second thoughts about that attitude. I’m starting to wonder about the relative merits of the following two characteristics:
1. Accuracy to the recording
2. Accuracy to the recorded event
The question of the relative merits of accuracy-to-the-recording vs. accuracy-to-the-recorded-event has occurred to me periodically ever since Al wrote this on the neutrality thread:
12-02-09: Almarg
A perfectly accurate system…would be one that resolves everything that is fed into it, and reproduces what it resolves with complete neutrality. Another way of saying that is perhaps that what is reproduced at the listener's ears corresponds precisely to what is fed into the system.
Which does not necessarily make that system optimal in terms of transparency. Since the source material will essentially always deviate to some degree and in some manner from being precisely accurate relative to the original event, then it can be expected that some deviation from accuracy in the system may in many cases be complementary to the inaccuracies of the recording (at least subjectively), resulting in a greater transparency into the music than a more precisely accurate system would provide.
Which does not mean that the goals of accuracy and transparency are necessarily inconsistent or in conflict. It simply means, as I see it, that the correlation between them, although substantial, is less than perfect.
Al makes his point about accuracy in terms of neutrality vs. transparency, in keeping with the nomenclature of that thread, but it is essentially the same distinction as accuracy-to-the-recording vs. accuracy-to-the-recorded-event, or Recording Accuracy vs. Event Accuracy, for short.
I agree with Al that Recording Accuracy correlates with Event Accuracy, but not perfectly so. In other words, I now believe that efforts to maximize Recording Accuracy sometimes come at the expense of Event Accuracy. That is a viewpoint I suspect more experienced audiophiles tend to adopt, but one that has taken me some time to appreciate. A turning point for me was an observation that Albert Porter made in an old digital vs. analog thread, which I read only recently:
09-12-08: Albertporter
The digital (or analog) master tape is not the issue here, the CD format is.
If any of you could hear a master digital tape (or hard drive) and compare that to CD or LP, you would realize how much we've been screwed. The problem with digital is when that great master is "moved" for public distribution…
Moving that master digital signal from one place to another and from one sample rate to another does it so much harm it cannot be repaired. Then to make matters worse, our only choice is an outdated format that's too low a sample rate to replicate what was on the master…
With CD, you get a severely downsampled format that's only a shadow of what could be if the format had evolved this last 25 years.
This observation resonated with me, as I have had the experience of recording, editing, and mixing with high quality professional equipment to create a master recording I was proud of. I then watched - dismayed - as my master recording was compressed, downrez'd, and finally transferred to its delivery format. Even on a very high quality playback system, the delivery version was a shadow of its former self. Albert Porter's observations about CD recordings undergoing this process of diminishment as a matter of routine highlight the many respects in which the recordings available to consumers deviate dramatically from their master recordings, to say nothing of how the master recordings themselves deviate from the recorded events. Taken together, these two deviations create a gulf between the live event and its consumer playback, a gulf that some audiophiles try to fill with ADDITIVE measures. And that brings me back to the point of this post...
It now seems to me that ADDITIVE measures can be a means of filling, to whatever extent possible, the gulf between the live event and the (in many cases) severely diminished recordings available to consumers. IMO, that provides a plausible rationale for sacrificing a small measure of Recording Accuracy for the sake of potentially greater Event Accuracy. Put another way, it provides a rationale for the ADDITIVE approach to playback.
Just which types of additions are the right ones is another matter entirely.
Bryon