Hi geek,
OK, but can you explain how this increased power output happens? To get more power you need more power supply voltage and current, no matter which tube is being discussed; we can't argue about that. First, did you use a good distortion analyzer to monitor the amp's saturation point? If not, how did you monitor distortion? I use a Tektronix analyzer that measures down to 0.0001% THD and a Tektronix audio-frequency sine-wave generator with 0.00001% residual THD. Those are real specs: great test equipment, and ideal for this type of testing. They got a lot of use in my repair shop.
Typically, a power amp's limits are set by the power available from its supply, but not always. The exception is when the power supply has extra headroom; other circuit factors can then determine when the amp goes into saturation and distortion rises rapidly. That is not the case with my D250. The power supply is the limiting factor, as it is with most power amps, including those made by ARC.
I did check with ARC about the KT-120s. They made the point strongly that KT-120s will not, for example, increase the output power of my D250 Mk II unless the power supply voltage and current capacity are increased. I already knew this, so it was confirmation. Of course, changing the power supply voltage and current capacity is nearly impossible without a complete redesign. So, for most amps (all that I know of), the advantage of substituting KT-120s for 6550s is possibly (yet to be proven) longer tube life, not increased power.
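The point about the supply setting the power ceiling can be put in numbers. Here is a minimal back-of-the-envelope sketch for an idealized class-AB push-pull output stage: the peak plate swing can't exceed the B+ rail (minus the tube's saturation drop), and the conducting tube works into a quarter of the plate-to-plate load. All values below are hypothetical illustrations, not D250 specs; the one thing the sketch shows is that the bound depends on the rail and the transformer, not on which output tube is plugged in.

```python
# Rough upper bound on push-pull output power, limited by the B+ rail.
# Idealized: ignores transformer losses, bias class details, etc.

def max_output_power(b_plus_v, r_pp_ohms, v_sat=0.0):
    """Ideal class-AB push-pull: peak swing <= B+ minus the tube's
    saturation drop; the conducting tube sees R_pp / 4, so
    P_max = V_peak^2 / (2 * R_pp / 4) = 2 * V_peak^2 / R_pp."""
    v_peak = b_plus_v - v_sat
    return 2.0 * v_peak ** 2 / r_pp_ohms

# Hypothetical rail and transformer values (NOT actual D250 numbers):
print(max_output_power(450, 2000))            # perfect tubes
print(max_output_power(450, 2000, v_sat=60))  # with a saturation drop
```

Swapping 6550s for KT-120s changes only the saturation drop at best; with the same B+ and the same output transformer, the ceiling barely moves. Only raising the rail (and the current to back it up) raises the bound, which is the whole argument.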
BTW, ARC will continue to use 6550s as power supply regulators. I'm sure you know this.
So, please explain. This is important, because the 6550 owners of the world are going nuts over this new tube for, I think, the wrong reasons. There are no free watts in my world.
Thanks, Sparky