Class A to XX Watts


I've only used pure Class A amps (Aleph J), never one that runs in Class A for just the first few watts. Wouldn't the transition out of Class A be audible? Seems to me that if it is, that's a bad thing. And if it isn't audible, why would anyone design it that way in the first place (marketing?)?
Pubul57, taken by itself that might appear to be the case, but speakers are another issue entirely. The net result of the interaction between amp and speaker isn't necessarily what it might seem. Speakers that are less efficient aren't that way for its own sake, but for advantages gained elsewhere.
Very true. Ahh, the tradeoffs. But I do think the A/B tradeoff may be, as you say, well worth it in a given system.
Depending on the speakers and listening levels, a high-biased Class AB amplifier may never switch out of Class A.
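To put rough numbers on that claim, here is a small sketch. It assumes an idealized push-pull output stage (which stays in Class A until the peak load current exceeds twice the quiescent current) and uses made-up figures for bias current, speaker impedance, sensitivity, listening level, and distance; it is not a model of any particular amplifier.

```python
import math

def class_a_watts(bias_current_a, load_ohms):
    """Class A power envelope of an idealized push-pull output stage.

    Both halves conduct until the peak load current exceeds twice the
    quiescent (bias) current, so the sine-wave average power that stays
    in Class A is (2*Iq)**2 * R / 2 = 2 * Iq**2 * R.
    """
    return 2.0 * bias_current_a ** 2 * load_ohms

def watts_for_spl(target_spl_db, sensitivity_db_1w_1m, distance_m=3.0):
    """Rough power needed for a target SPL at the listening seat.

    Assumes sensitivity is specified at 1 W / 1 m, ignores room gain,
    and lets level fall 6 dB per doubling of distance (free field).
    """
    needed_db = target_spl_db - sensitivity_db_1w_1m + 20.0 * math.log10(distance_m)
    return 10.0 ** (needed_db / 10.0)

if __name__ == "__main__":
    # All numbers below are illustrative, not measurements of any real amp.
    bias = 1.2   # quiescent current, amps
    load = 8.0   # nominal speaker impedance, ohms
    sens = 87.0  # speaker sensitivity, dB @ 1 W / 1 m
    spl = 85.0   # average listening level at the seat, dB
    dist = 3.0   # listening distance, metres

    envelope = class_a_watts(bias, load)
    demand = watts_for_spl(spl, sens, dist)
    print(f"Class A envelope: {envelope:.1f} W into {load:.0f} ohms")
    print(f"Power needed for {spl:.0f} dB SPL: {demand:.1f} W")
    print("Stays in Class A on average" if demand <= envelope else "Leaves Class A on average")
```

With these made-up numbers the envelope works out to about 23 W while the average level only needs about 6 W, which is why a generously biased AB amp into benign speakers can spend most of its life in Class A, even if loud peaks still push it out.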
Class A biasing is beneficial for sound quality, but it affects many other things as well, most notably reliability: constant thermal cycling leads to failures. Bias class is a surface-level spec that accounts for only a fraction of an amplifier's overall sound and character; it's a detail, not a foundation.

My Boulder 1060 operates up to 17 watts in Class A before switching to Class AB; however, Boulder spent a lot of time perfecting the circuit design and notching out crossover distortion. The Rowland 625, for example, runs in Class A up to a higher power level, although in reality this means very little: the biasing scheme used to keep an amplifier in a particular mode of operation can't make up for an overall amplifier design that isn't of the same calibre, and may actually be there to compensate for non-linearities in operation. In other words, if you start with a much better circuit design to begin with, you don't need to over-compensate with heavier biasing.
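For anyone curious what "notching out crossover distortion" refers to, below is a toy model, not any manufacturer's circuit: an idealized complementary follower with an assumed 0.6 V turn-on drop, a 1 kHz test tone, and a simple dead-zone transfer curve. It just shows how the crossover notch shrinks, and measured distortion falls, as the standing bias is raised.

```python
import numpy as np

def output_stage(vin, vbe=0.6, vbias=0.0):
    """Idealized complementary follower with a crossover dead zone.

    Each half conducts only when its drive exceeds the base-emitter
    drop; vbias is the standing bias that pre-opens the devices.
    With vbias = 0 (Class B) there is a dead zone of +/-vbe around
    zero; as vbias approaches vbe the notch closes (Class AB/A).
    """
    dead = max(vbe - vbias, 0.0)
    return np.where(vin > dead, vin - dead,
           np.where(vin < -dead, vin + dead, 0.0))

def thd(signal, fs, f0):
    """Rough THD (harmonics 2-10) of a sampled sine at frequency f0."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    bin0 = int(round(f0 * len(signal) / fs))
    fund = spectrum[bin0]
    harms = [spectrum[k * bin0] for k in range(2, 11) if k * bin0 < len(spectrum)]
    return np.sqrt(sum(h ** 2 for h in harms)) / fund

if __name__ == "__main__":
    fs, f0, n = 96000, 1000, 96000            # 1 kHz tone, 1 second
    t = np.arange(n) / fs
    vin = 2.0 * np.sin(2 * np.pi * f0 * t)    # 2 V peak drive (illustrative)

    for vbias in (0.0, 0.3, 0.55, 0.6):
        d = thd(output_stage(vin, vbias=vbias), fs, f0)
        print(f"bias {vbias:.2f} V -> THD {100 * d:.2f} %")
```

The point of the toy model is only that raising bias is one blunt way to shrink the notch; careful circuit design (and feedback) can address the same non-linearity without brute-force heat.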
Over the years of owning various classes of amplifiers, I have come to realize that excellence in amplifier power supply design is far more important than crossover distortion.