How could a 100 Watt Class A amp have more headroom than a 300 Watt Class AB amp?


Put aside which brand or make.
I put two amps to a test; both are high-end amps from the same manufacturer.
Both double their power into half the impedance load, and THD is about the same.
Regardless of the size and cost difference, from a pure science perspective the 300 watt amp should in theory provide more headroom and more ease when it reaches 100dB. But the reverse is true: the 100 watt Class A amp seems to provide more headroom.
I have tried another set of speakers that is much easier to drive, and I reached the same conclusion.
Can someone explain why?
Quality or quantity of watts: how do we determine which matters more?
samnetw
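For context, here is a rough sketch of the power it takes to hit 100dB at the listening seat. The 87dB/W/m sensitivity, 3m distance, and simple free-field distance loss are assumptions for illustration, not figures from this thread; a real room and a stereo pair typically claw back several dB.

```python
import math

def watts_for_spl(target_spl_db, sensitivity_db_1w_1m, distance_m):
    """Estimate the amplifier power needed for a target SPL from one
    speaker, counting only free-field 20*log10(d) distance loss."""
    gain_needed_db = (target_spl_db - sensitivity_db_1w_1m
                      + 20 * math.log10(distance_m))
    return 10 ** (gain_needed_db / 10)

# Hypothetical 87dB/W/m speaker at 3m, 100dB peaks:
print(round(watts_for_spl(100, 87, 3)))   # ~180 W
# The same peak through an easier-to-drive 90dB/W/m speaker:
print(round(watts_for_spl(100, 90, 3)))   # ~90 W
```

With a stereo pair and normal room gain, both amps in the question can reach that level on paper, which is why the audible difference tends to come down to how each behaves near its limits rather than the number on the spec sheet.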
How could a 100 Watt Class A amp have more headroom than a 300 Watt Class AB amp?

If one were a Mosfet output stage and the other a BJT (bipolar), both well-engineered push/pull designs driving a heavy load such as the Wilson Alexia, which has an EPDR (equivalent peak dissipation resistance) of just 0.9 ohms at 65Hz, then the 100W BJT amp would drive it better than the 300W Mosfet.
The 100W BJT "could" in effect deliver around 800W into that 0.9 ohm load, whereas the Mosfet would probably fall below its rated 300W-into-8-ohm output.
Cheers George
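To put rough numbers on George's point, here is a small sketch of how an idealised amp's power delivery varies with load impedance when it is limited by its output voltage (implied by the 8 ohm rating) and by its output current. The 30A and 12A current limits are illustrative assumptions, not specs of any real amplifier.

```python
import math

def max_power(rated_watts_8ohm, current_limit_a, load_ohms):
    """Power an idealised amp can deliver into a load, limited either by
    its output voltage (from the 8 ohm rating) or by its current limit."""
    v_max = math.sqrt(rated_watts_8ohm * 8)            # max output voltage
    v_out = min(v_max, current_limit_a * load_ohms)    # current clamp at low impedance
    return v_out ** 2 / load_ohms

for load in (8, 4, 2, 0.9):
    big_current = max_power(100, 30, load)    # 100W amp with large current reserves
    limited = max_power(300, 12, load)        # 300W amp that runs out of current
    print(f"{load:>4} ohm: 100W/high-current ~{big_current:4.0f} W, "
          f"300W/current-limited ~{limited:4.0f} W")
```

On these made-up limits the 100W amp delivers roughly 810W into 0.9 ohms while the 300W amp falls to around 130W, well under its 8 ohm rating, which is the scenario George describes.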
Sounds like you are talking about two SS amps, so probably from Pass, but that is just a guess.
To your question, the Class A amp will likely have a stiffer power supply and more capacitance. Also, IME, I like the sound of Class A better, since to me Class A amps sound more fleshed out than the best AB amps I have heard, and so may sound more harmonically complete when pushed. A last observation based on experience: really good Class A amps do not get strident sounding the way Class AB amps can when they start to get stressed; the Class A amps simply run out of steam. At least, that is how the two different sets of Lamm hybrid monos I owned used to behave. My 300 wpc Class A Claytons have yet to run out of steam, even driving my new lower-efficiency (85dB) speakers.
To George's response, my listening has resulted in a preference for bipolar output stages over Mosfets. I believe some designers use Mosfets to try to emulate a "tube sound," but to me they do not control the output as well.
There are a few examples. So, as far as headroom: first, a 100 watt amp that is a true pure Class A amplifier won't play louder than a 300 watt A/B amp. Pure Class A means it will not switch out of Class A from the first watt until clipping, so it stays a Class A amplifier over its whole output range. In that sense, a 100 watt vs. a 300 watt amp is a no-brainer as far as how loud it will play: the 300 watt amp has about 4.8dB more headroom.
On the other hand, there are several amps today that are sliding Class A; they are biased to run in Class A for a percentage of their output. Threshold and Coda did this quite a bit. I think of the Aleph 3 or 30: they are rated at 30 watts Class A but are really more like 125 watts or so; they play their first 30 watts in Class A before switching to A/B, and above 30 watts they deliver the balance of their power in A/B.
I hope this helps,  Tim
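For reference, the headroom figure above is just the power ratio expressed in dB; a minimal sketch:

```python
import math

def headroom_db(power_hi_w, power_lo_w):
    """Extra headroom of the bigger amp, as a power ratio in dB."""
    return 10 * math.log10(power_hi_w / power_lo_w)

print(round(headroom_db(300, 100), 1))   # 4.8 dB for 300W vs 100W
print(round(headroom_db(400, 100), 1))   # a full 6.0 dB takes four times the power
```

That 4.8dB assumes both amps actually deliver their rated power into the speaker's real impedance, which ties back to George's point about current delivery.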
Not having any credentials as a manufacturer or scientist, I always refer to the basics. For me it adds even more confusion, because we're talking about a music signal that is ultimately heard by ears.

http://www.electronics-tutorials.ws/amplifier/amplifier-classes.html

I took a couple of analog/digital classes in the early 80s. An instructor who was an audiophile would blow the students' minds with endless physics/science theory behind audio. It was way over my head!

I did become aware that most of us really are listening to just a couple of watts of "linear" power.
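That last point is easy to sanity-check by flipping the earlier calculation around: given an amplifier power, what SPL does one speaker produce at the seat? The 87dB/W/m sensitivity and 3m free-field distance are again assumptions for illustration.

```python
import math

def spl_at_seat(watts, sensitivity_db_1w_1m=87, distance_m=3):
    """Rough SPL from one speaker, counting only free-field distance loss."""
    return (sensitivity_db_1w_1m + 10 * math.log10(watts)
            - 20 * math.log10(distance_m))

for w in (1, 2, 100, 300):
    print(f"{w:>3} W -> ~{spl_at_seat(w):.0f} dB SPL")
```

A couple of watts already lands around typical average listening levels; the rest of an amp's rating is there for the peaks, which is where this thread's headroom question lives.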