As a "generic" rule of thumb, Class A amps typically average about 40 - 50% efficiency. That means that you have 50 - 60% of the power being drawn at any given time being dissipated as heat.
As far as Class AB amps go, their efficiency levels will vary depending on how "richly" they are biased into Class A. Some AB amps might stay in Class A for a watt or two while others may not switch over until 5 - 15 watts. As such, efficiency suffers as long as the amp is run at low levels. Once the amp is pushed beyond the point of crossing over into Class B, efficiency rises somewhat. The harder the amp is driven, the more efficient it becomes ( in theory ). AB amps are typically considered to be about 65% efficient in terms of power drawn vs actual power output. As mentioned, this figure can go up or down somewhat depending on the overall bias level.
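For what it's worth, the textbook math backs up the "harder driven = more efficient" idea once the amp is past its Class A region: ideal Class B efficiency is ( pi / 4 ) x ( output swing / supply rail ), so it climbs more or less linearly with level and tops out around 78.5% right at clipping. A quick sketch of that relationship ( the rail voltage is just an assumed number for illustration, and real AB amps land below the ideal curve due to bias current and other losses ):

import math

def ideal_class_b_efficiency(v_peak, v_supply):
    # Textbook ideal Class B: eta = (pi / 4) * (Vpeak / Vsupply)
    return (math.pi / 4.0) * (v_peak / v_supply)

v_supply = 40.0  # assumed rail voltage
for fraction in (0.1, 0.25, 0.5, 1.0):
    eta = ideal_class_b_efficiency(fraction * v_supply, v_supply)
    print(f"driven to {fraction:.0%} of full swing: ~{eta:.0%} efficient ( ideal )")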
As far as switching or "digital" amps go, efficiency levels can be VERY high. Some of this will vary with how the driver circuits are set up to operate. Since minimal power is lost in most switching designs, there is very little heat build-up within the amp itself. This is due to the fact that the output devices are either switched fully on or fully off, spending only a very small percentage of time in their "lossy" partially-conducting state. The drawback to this gain in efficiency is that one runs into a massive increase in several different types of distortion and "typically" a loss of resolution ( especially at lower power levels ). The more that you "pile drive" a switching amp, the less noticeable these side effects become.
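To tie the three classes together, here's the same kind of back of the envelope comparison for a fixed amount of power actually delivered to the speakers ( the efficiency figures are the rough "typical" assumptions from above, not specs for any specific amp ):

assumed_efficiency = {"Class A": 0.45, "Class AB": 0.65, "Class D / switching": 0.90}

output_w = 50.0  # assumed power delivered to the speakers
for topology, eff in assumed_efficiency.items():
    drawn = output_w / eff     # power pulled from the wall
    heat = drawn - output_w    # whatever isn't delivered is dissipated as heat
    print(f"{topology}: draws ~{drawn:.0f} W to deliver {output_w:.0f} W, ~{heat:.0f} W of heat")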
As such, the hotter an amp runs, the less efficient it is. Having said that, it is "probably" also more "linear" than an amp of lower bias, all things being equal. Getting all of the variables "equal" is a whole 'nother ball of wax though... Sean