Amplifiers do not put out power; they provide a voltage source, just like a wall outlet. The load (speaker, resistor, arc welder, whatever) draws current from that voltage source, and if the source can maintain its voltage, the load consumes power according to P = V²/R. If the source cannot supply the current, the voltage sags and the power the load consumes drops. This is why lights dim when an air conditioner kicks on, or why a backup generator slows down if it is too small for the appliances connected to it.
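A minimal sketch of that behavior, assuming an ideal voltage source with a hypothetical hard current limit I_MAX (real supplies sag more gradually than this): below the limit the load sets the power; at the limit the voltage collapses to whatever the available current allows.

```python
# Hypothetical numbers: a source that tries to hold 40 V but can only
# supply 7 A. Below the limit, P = V^2 / R; above it, V sags to I_MAX * R.
V_SOURCE = 40.0  # volts the source tries to hold (assumed)
I_MAX = 7.0      # hypothetical current limit, in amps

def load_power(r_load: float) -> float:
    """Power consumed by a resistive load of r_load ohms."""
    if V_SOURCE / r_load <= I_MAX:
        return V_SOURCE ** 2 / r_load  # source holds up: the load sets the power
    v_sagged = I_MAX * r_load          # voltage sags to what the current limit allows
    return v_sagged ** 2 / r_load      # equivalently I_MAX**2 * r_load

for r in (8.0, 4.0, 2.0):
    print(f"{r:4.0f} ohms -> {load_power(r):6.1f} W")
# 8 ohms -> 200.0 W, 4 ohms -> 196.0 W, 2 ohms -> 98.0 W:
# once the supply runs out of current, halving the impedance
# no longer doubles the power.
```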
Same with an amplifier. In the first case, the amplifier sources 40 volts into both the 4 and 8 ohm loads. Since the 4 ohm load draws twice the current and the power supply voltage does not sag, the power consumed doubles (200 W becomes 400 W). In the second case, the amplifier sources 45 volts into the 8 ohm load but only 35 volts into the 4 ohm load: the transformer cannot supply the current that 45 volts into 4 ohms would demand, so the voltage sags to 35 volts, limiting the power consumption to roughly 300 W. Since the sag may or may not be linear, the power at 5 ohms isn't necessarily an interpolation between those two figures. But it is definitely less than 300 watts.
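Plugging the quoted voltages into P = V²/R reproduces those figures; nothing here beyond the formula:

```python
def power(v: float, r: float) -> float:
    """Power into a resistive load: P = V^2 / R."""
    return v ** 2 / r

# Case 1: stiff supply, 40 V into both loads -> power doubles into 4 ohms.
print(power(40, 8))  # 200.0 W
print(power(40, 4))  # 400.0 W

# Case 2: supply sags under the 4 ohm load's current demand.
print(power(45, 8))  # 253.125 W
print(power(35, 4))  # 306.25 W -- roughly the 300 W figure above
```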
The class of the amplifier has nothing to do with the current the load draws. Class is basically a measure of efficiency: power delivered to the load versus power drawn from the wall. A Class A amplifier will draw (about) four times as much power from the wall as the speaker draws from the amplifier, and all that excess is dissipated as heat. The heat dissipated inside the amplifier is, technically, the "power of the amplifier".
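Rough arithmetic behind that Class A claim, assuming the ~25% efficiency implied by the "four times" figure (actual efficiency varies with topology and drive level):

```python
EFFICIENCY = 0.25  # assumed Class A efficiency implied by the 4x figure

def class_a_budget(p_speaker: float) -> tuple[float, float]:
    """Return (wall power drawn, heat dissipated) for a given
    speaker power, under the assumed efficiency."""
    p_wall = p_speaker / EFFICIENCY
    return p_wall, p_wall - p_speaker

p_wall, p_heat = class_a_budget(100.0)  # 100 W delivered to the speaker
print(p_wall)  # 400.0 W drawn from the wall
print(p_heat)  # 300.0 W dissipated as heat inside the amplifier
```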