Why do amps sound different?


Hi folks, can anyone tell me why amps sound different? I know this seems like a trivial question, but it isn't as trivial as I previously thought. For example: one amp can sound "warm", while another sounds "lean" and a bit "cooler". These amps measure the same on the test bench, so why do they sound different? What causes the "warm" characteristic if the amp has pretty good measurements and frequency response? It is certainly not a measurable high-frequency roll-off; if it were, the amp would simply suck. Maybe one of the experts among us can elucidate this issue a bit. Thank you.

Chris
dazzdax
Atmasphere, you are correct; that distinction has been clearly made at the link you provided. Thank you, I found it very interesting and informative!
While I enjoy and value Atmasphere's posts on the subject, I will take issue with the major point in the paper he presented. I don't see that these two paradigms exist at all . . . except in a hypothetical world where there is a simple, binary choice in available loudspeakers: Apogees and Lowthers.

If you look at the symbiotic evolution of amplifier and speaker designs over the past eighty years or so, it's commonly accepted that an increasing abundance of amplifier power enabled loudspeaker designers to trade efficiency for other factors, such as smaller cabinet size and improved linearity. But it has been the loudspeaker designers who, in turn, have consistently demanded more "current impervious" performance from the amplifiers. This is why the hallowed amplifier designs of the pre-war era were triode designs: yes, for linearity, but just as importantly, for lower output impedance. Even an Altec VOT system and an Altec 604 duplex monitor would have presented very different impedance curves to the amplifier. And in either case, a flat frequency response from a linear amplifier was highly desired.

Even seventy years ago, loudspeaker designers were working with a voltage-source model, not a current-source model. While the reasons for it are my own speculation, they seem pretty obvious. First, high-frequency transducers almost always have a huge efficiency advantage over low-frequency ones. Second, advances in transducer technology are mostly advances in materials (diaphragm materials and suspensions, magnetic materials), and mathematical modeling (horns and lenses). Designing loudspeakers and crossovers to effectively take advantage of what the transducers have to offer is extraordinarily easier, and achieves better results, when working from a voltage-source model.
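
To make that "extraordinarily easier" point concrete, here's a quick numeric sketch of my own (the part values are made up, but plausible) showing why even the simplest crossover math assumes a voltage source. A first-order high-pass feeding a tweeter lands where the textbook formula says when the source impedance is near zero; add a few ohms of amplifier output impedance and the corner shifts and the level drops, so the designer can no longer treat the crossover in isolation:

import math

# First-order high-pass: series capacitor C feeding a tweeter modeled as a
# plain resistance r_load. A nonzero amplifier output impedance r_src adds
# in series, shifting the -3 dB corner and padding down the passband level.

def corner_hz(r_src, r_load, c):
    """-3 dB corner (relative to passband) of C driving r_src + r_load."""
    return 1.0 / (2 * math.pi * (r_src + r_load) * c)

def passband_loss_db(r_src, r_load):
    """Insertion loss of the r_src/r_load voltage divider in the passband."""
    return 20 * math.log10(r_load / (r_src + r_load))

C = 6.8e-6    # 6.8 uF series cap (hypothetical value)
R_LOAD = 8.0  # nominal tweeter impedance

for r_src in (0.1, 3.0):  # near-ideal voltage source vs. high output impedance
    print(f"r_src={r_src:3.1f} ohm: corner = {corner_hz(r_src, R_LOAD, C):4.0f} Hz, "
          f"passband loss = {passband_loss_db(r_src, R_LOAD):5.2f} dB")

With 0.1 ohm the corner sits near 2.9 kHz as designed; with 3 ohms it drops to about 2.1 kHz and the tweeter loses almost 3 dB of level.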

The presence/absence of multiple impedance taps on amplifiers is, for this discussion, a non sequitur. If one wanted to design a conventional transformer-coupled tube amp that put out 50 watts into 16 ohms, 100 watts into 8 ohms, 200 watts into 4 ohms, etc. from a single output tap, it could be done . . . there would simply be huge tradeoffs in terms of efficiency and performance into a given impedance. Very similar tradeoffs also exist in solid-state amplifier design . . . the difference is one of cost and benefit. If you already have an output transformer, then adding additional taps usually makes sense. If you don't . . . then it's of course a bit harder and costlier.
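
Incidentally, those 50/100/200-watt figures are exactly what an ideal voltage source does: halve the load impedance, double the power, all at the same output voltage swing. A quick back-of-envelope check (my own sketch):

import math

# P = V^2 / R for an ideal voltage source: halving the load doubles the power.
V_RMS = math.sqrt(50 * 16)  # the voltage implied by 50 W into 16 ohms (~28.3 V)

for r in (16, 8, 4):
    print(f"{r:2d} ohms: {V_RMS**2 / r:5.1f} W at a constant {V_RMS:.1f} V RMS")

The hard part, as noted, isn't the math . . . it's building an output stage and transformer that can actually hold that voltage while sourcing twice the current into each halving of the load.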

My point is that there really is no "Current Paradigm". The interface between high-fidelity amplifiers and their respective speaker systems has ALWAYS been based on a voltage model. (The term "high-fidelity" is meant to simplify the discussion by excluding things such as field-coil speakers and 70V distribution systems, not as a snub to anybody's amplifier design.) And high-fidelity amplifiers have always been expected to have reasonably "current impervious" operation. What "reasonably" means in absolute terms is a debate that has been around many years longer than solid-state amplifiers . . . but if an amplifier's output is intended for a "4-ohm" load, then I would expect it to be fairly "current impervious" over the range of current that a "nominal 4-ohm" loudspeaker would require, plus some extra for good measure. Most good conventional tube amps achieve this.
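
To put a rough number on "plus some extra for good measure" (these figures are my own assumptions, not from any particular speaker): a "nominal 4-ohm" loudspeaker commonly dips well below 4 ohms somewhere in its range, and the amplifier is expected to keep delivering its rated voltage there.

import math

# Peak current demanded from the amplifier. The drive voltage is set by the
# nominal 4-ohm power rating; the actual current depends on where the
# speaker's impedance curve really sits (ignoring phase angle, which only
# makes things harder).
def peak_amps(rated_power_w, z_actual_ohms):
    v_rms = math.sqrt(rated_power_w * 4.0)       # voltage for the 4-ohm rating
    return math.sqrt(2) * v_rms / z_actual_ohms  # peak current into the real load

for z in (4.0, 2.5):  # nameplate impedance vs. a plausible minimum dip
    print(f"100 W into a 4-ohm rating, load dips to {z} ohms: "
          f"{peak_amps(100, z):4.1f} A peak")

That's about 7 A peak at the nameplate impedance, and over 11 A when the curve dips to 2.5 ohms . . . and a reactive phase angle at the dip raises the demand further still.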

I maintain that a high output impedance, for a high-fidelity audio power amplifier, is ALWAYS a liability, period. Now it may be that some of these amplifiers have other performance aspects that outweigh it, and some speakers are tolerant of it (and a few even subjectively improved). But this idea that there's one branch of the speaker-design profession that optimizes their products to work with amplifiers that have high output impedances? I don't buy it. If there is, then exactly what is the output impedance that they're expecting?
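
To quantify that liability (impedance numbers here are made up, but typical of a multi-way dynamic speaker): the amplifier's output impedance and the speaker's impedance form a voltage divider, V_speaker = V_amp * Z / (Z + Z_out), so any swing in the speaker's impedance curve shows up directly as a frequency-response error.

import math

# Peak-to-peak response deviation across a speaker whose impedance swings
# between z_min and z_max, driven by an amplifier with output impedance z_out.
def response_error_db(z_out, z_min, z_max):
    at_min = 20 * math.log10(z_min / (z_min + z_out))
    at_max = 20 * math.log10(z_max / (z_max + z_out))
    return at_max - at_min

for z_out in (0.05, 0.5, 3.0):  # very low, moderate, and high output impedance
    print(f"Z_out = {z_out:4.2f} ohms: up to "
          f"{response_error_db(z_out, 3.0, 30.0):4.2f} dB response error")

A 0.05-ohm source barely notices a 3-to-30-ohm impedance swing (about 0.1 dB); with 3 ohms of output impedance the same speaker shows roughly a 5 dB response error, which is exactly why only impedance-tolerant (or deliberately matched) speakers sound right on such amps.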
Kirkus, How did the loudspeaker designers gain enough leverage to make demands on amplifier designers?
Well, Cyclonicman, legend has it that James Lansing, immediately prior to his untimely death, wrapped a piece of Alnico V in a largish bath towel and "went postal" on the electronics staff at Altec . . .

But seriously, they did it by designing speakers that people wanted to buy, and that were more demanding loads for the amplifier. Forty years ago, virtually all amplifiers had 16-ohm output taps; today, an amplifier's performance into a 16-ohm load isn't even a footnote. I'm guessing this is because, er, how many modern 16-ohm hi-fi speakers can you think of?

A great example is the Apogee full-range ribbons I alluded to. The two things that people remember about them are that they sounded amazing, and that they blew up amps. I have heard from a few sources about how these loudspeakers influenced Mark Levinson's amplifier designs . . . I'm not so sure the timeline works out for that to be true, but the Apogees definitely had a huge influence on the current output capability of "flagship" solid-state amps of the 1980s and 1990s.