How could a 100 watt Class A amp have more headroom than a 300 watt Class AB amp?


Put aside brand or make for a moment.
I put two amps to the test; both are high-end amps from the same manufacturer.
Both double their rated power into half the impedance load, and THD is about the same for each.
Regardless of the size and cost difference, from a pure science perspective, the 300 watt amp should in theory provide more headroom and sound more at ease when it reaches 100 dB. But the reverse is true: the Class A 100 watt amp seems to provide more headroom.
I tried another set of speakers that is much easier to drive and reached the same conclusion.
Can someone explain why?
Quality or quantity of watts, how do we determine it?
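
For reference, the raw power difference between the two amps is smaller than the numbers suggest, because loudness headroom scales logarithmically with power. A quick sketch of the standard dB arithmetic (generic formula, nothing specific to either amp):

```
import math

def power_ratio_db(p1_watts: float, p2_watts: float) -> float:
    """Difference in output capability, in dB, between two power levels."""
    return 10 * math.log10(p2_watts / p1_watts)

# 300 W vs 100 W buys less than 5 dB of extra headroom on paper.
print(f"{power_ratio_db(100, 300):.2f} dB")  # ~4.77 dB
```

So the 300 watt amp only buys about 5 dB over the 100 watt amp on paper; if the smaller amp behaves better near clipping or into low-impedance dips, that gap can easily be swamped.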
samnetw
ML is in the past, Gryphon is in the present. Bloody expensive stuff, though.
Quote
"At the system level, you might think of headroom as the extra dB above the average listening level that the system can play before non-linearity/distortion sets in, for example. This involves the amplifier, but all the other elements too, including the acoustics.

Headroom for amplifiers can mean extra power capability above the rated output, typically for shorter peak periods. But those aren't the only possible meanings."

I think audiophiles are looking for realism in music reproduction. If you turn the system up as loud as a live performance without a PA, you are much closer to the real thing.

To achieve this level of performance, your system needs to be capable of producing the same dB level as a snare drum being played.
If you have ever played a drum kit, you know it is difficult to turn the volume down, short of hitting the drum head more lightly.
Obviously, the better a system can perform at realistic dB levels without distortion, the more I would call it a true high-end system.
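
To put a number on this: a hard snare hit can reach roughly 110+ dB SPL at close range, and the power needed to reproduce that depends heavily on speaker sensitivity and listening distance. A rough sketch of the standard calculation (the sensitivity and distance figures below are made-up examples, not from any post in this thread):

```
import math

def required_watts(target_spl_db: float, sensitivity_db_1w_1m: float,
                   distance_m: float) -> float:
    """Estimate amplifier power needed for a target SPL at the listening seat.

    Assumes anechoic inverse-square falloff (-6 dB per doubling of distance);
    real rooms give some of that back via boundary reinforcement.
    """
    distance_loss_db = 20 * math.log10(distance_m / 1.0)
    db_above_1w = target_spl_db + distance_loss_db - sensitivity_db_1w_1m
    return 10 ** (db_above_1w / 10)

# Hypothetical example: an 86 dB/W/m speaker (typical for a low-sensitivity
# panel), 3 m listening distance, 110 dB peak for a snare hit.
print(f"{required_watts(110, 86, 3.0):.0f} W")  # ~2260 W per channel!
```

Two speakers playing together and room gain knock a fair chunk off that figure, but it shows why realistic drum peaks chew through watts so quickly.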

The question I want to ask is this: speakers that produce accurate timbre and dB levels with minimal distortion tend to have low impedance. A few examples that come to mind: Apogee, Maggie, Thiel, Revel, etc.

The best system I have ever owned was an Apogee driven by pure Class A 100 watt monoblocks.
However, I sold the monos and got something newer that provides 600 watts in stereo mode, yet that amp is the same size as just one of my 100 watt monos.
The differences between the two amps are huge. A blind test could easily distinguish between them.
My experience is that true monoblocks in pure Class A are way better than 600 watts of Class A/B.
If wattage is not the measure of an amp's capability, then what should we look for when we try to understand whether an amp can provide sufficient current?
What spec of the amp will tell you that?
Someone mentioned the Pass Labs First Watt design. Would I say a 20 watt First Watt amp will outperform a 100 watt Class A/B amp?
What is the logic?
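
One spec that does get at current capability is how the rated power scales as the impedance halves, since the current demanded for a given power rises as impedance falls. A rough sketch of the Ohm's-law relationship (generic arithmetic, not specs from either amp in question):

```
import math

def rms_current_amps(power_watts: float, impedance_ohms: float) -> float:
    """RMS current an amp must deliver to put a given power into a load."""
    return math.sqrt(power_watts / impedance_ohms)

# An amp that truly "doubles down" holds its voltage swing into lower
# impedance, so the current it supplies doubles each time the load halves.
for ohms, watts in [(8, 100), (4, 200), (2, 400)]:
    print(f"{watts} W into {ohms} ohm -> {rms_current_amps(watts, ohms):.1f} A RMS")
```

A 100 watt amp that keeps doubling all the way into 2 ohms is delivering roughly 14 A RMS into the dips, which is where stiff power supplies and high bias earn their keep. This is one reason a single peak-wattage number into 8 ohms tells you so little.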
I'm with @tablejockey 

Not really convinced that your speakers ever even drew 100 watts.  

You can actually measure how much they're pulling at those volumes using this technique:  https://forum.audiogon.com/discussions/how-much-power-do-i-need-find-out-using-this-method
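
For anyone who doesn't want to click through, methods like this generally boil down to measuring the voltage at the speaker terminals at your loudest listening level and converting to watts (I'm paraphrasing the general technique here, not the linked thread's exact procedure). A minimal sketch of that arithmetic, assuming an RMS voltage reading from a multimeter and a hypothetical nominal impedance:

```
def watts_from_voltage(v_rms: float, nominal_impedance_ohms: float) -> float:
    """Approximate power delivered, from measured RMS voltage at the terminals.

    Nominal impedance is a simplification; real speaker impedance varies
    with frequency, so treat the result as a ballpark figure.
    """
    return v_rms ** 2 / nominal_impedance_ohms

# e.g. 8 V RMS measured across an 8 ohm speaker at peak listening level:
print(f"{watts_from_voltage(8.0, 8.0):.0f} W")  # 8 W, nowhere near 100
```

Most home listening averages single-digit watts, with even loud peaks often landing well below an amp's rated output.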
From what I've seen in the Stereophile reviews, the Pass XA series amps are rated for their Class A power, not their peak power capability. It looks like they tend to keep doubling when run up to the limits of the power supply.

From the Stereophile XA60 review, it'll do 130 watts into 8 ohms and keep going up as the impedance is lowered; it's just running in AB at that point. The Class A rating is probably limited by heat dissipation, and the full power output by the power supply.
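
That reading of the measurements is consistent with the basic math: the clipping power into 8 ohms pins down the amp's voltage swing, and whether it then doubles into 4 ohms is purely a question of current/supply capacity. A quick sketch (using the 130 W figure from the review; the 4 ohm number is the ideal-case projection, not a measured value):

```
import math

# 130 W into 8 ohms implies this RMS voltage swing at clipping:
v_rms = math.sqrt(130 * 8)          # ~32.2 V RMS
print(f"{v_rms:.1f} V RMS")

# If the power supply can hold that swing into 4 ohms, power doubles:
ideal_4ohm_watts = v_rms ** 2 / 4   # ~260 W, only if the current holds up
print(f"{ideal_4ohm_watts:.0f} W into 4 ohms (ideal case)")
```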