There is one maker of a well-regarded tube power amp (I’ll allow him and it to remain anonymous, so as not to appear to be "pushing" them) who offers the amp in both Class-A/B form and in pure Class-A. Both versions share the same circuit architecture, power supply, tube complement, etc., differing only in the biasing required to create each version.
The A/B version produces 100w into 4 and 8 ohms, the Class-A version 40w. So why would anyone go with the 40w version? The designer claims (and there’s no reason to doubt him, at least in my mind) that the Class-A version produces 1/10th the distortion of the A/B version. How audible is a tenfold decrease in measured distortion? I can’t answer that question.
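For what it's worth, a tenfold decrease in distortion works out to a 20 dB reduction, since amplitude ratios convert to decibels as 20·log10(ratio). A quick sketch (the function name and the 1% starting figure are just my illustration, not the designer's numbers):

```python
import math

def ratio_to_db(ratio):
    """Convert an amplitude ratio (new/old) to decibels."""
    return 20 * math.log10(ratio)

# A tenfold reduction in distortion:
print(ratio_to_db(0.1))  # -20.0 dB

# So if the A/B version measured, say, 1% THD,
# the Class-A version would measure 0.1% THD.
```

Whether a 20 dB drop in distortion is audible at typical listening levels is exactly the open question.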