Most good-quality amps specify output as RMS watts over a particular frequency range, at a stated distortion level, with all channels driven simultaneously. It used to be common for mass-marketed AVRs to use a less stringent wattage specification; I think they may have called it something like "music power," but I don't recall. So watts may not be watts, depending on how the specification defines them. The wattage available for a 1 kHz signal with one channel driven at a time will be very different from that for 10-100,000 Hz with all channels driven simultaneously. Then there is distortion, which typically rises rapidly above the level stated in the output specification.

And then there's gain. Back in the days when I was concerned with tube amps, an MC 60 could deliver its 60 watts with a 0.5 volt input, whereas a Dynaco required a 1.5 volt input to deliver the same 60 watts. Wattage output is not the same as gain, and what amps actually provide is gain.
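To put rough numbers on that gain difference, here's a small sketch. The load impedance is my assumption (8 ohms, which the examples above don't state); the input sensitivities come from the text. Voltage gain in dB is 20·log10(Vout/Vin), where Vout = sqrt(P·R) is the RMS voltage across the load at rated power.

```python
import math

def output_voltage(power_w, load_ohms):
    # V = sqrt(P * R): RMS voltage across the load at a given power
    return math.sqrt(power_w * load_ohms)

def voltage_gain_db(power_w, load_ohms, input_v):
    # Voltage gain in dB = 20 * log10(Vout / Vin)
    return 20 * math.log10(output_voltage(power_w, load_ohms) / input_v)

# Assumed 8-ohm load; sensitivities from the examples above.
print(round(voltage_gain_db(60, 8, 0.5), 1))  # MC 60-style: ~32.8 dB
print(round(voltage_gain_db(60, 8, 1.5), 1))  # Dynaco-style: ~23.3 dB
```

Same 60 watts out, but roughly 9.5 dB difference in gain: the amp with the lower input sensitivity needs a hotter signal from the preamp to reach full output.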
db