Conventional measurement data does not give one ANY meaningful information about the sound of the amplifier (except, perhaps, output power). Any company reporting specifications will have an amp whose distortion and noise lie well below the supposed threshold of audibility. Does it matter if one amp has 0.02% harmonic distortion and another has half that? If, as an objectivist, you insist on evaluating amplifiers by measurements, you also have to accept the extensive research and testing showing that humans are unable to detect fairly high levels of harmonic distortion, particularly low-order harmonic distortion. Levels of 10% or more of 2nd-order harmonic distortion are undetectable at certain audible frequencies.
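For a sense of scale, here is a small back-of-the-envelope sketch (plain Python, using the standard 20·log10 voltage-ratio conversion; the specific percentages are just the ones quoted above) showing how far below the fundamental these distortion figures sit:

```python
import math

# Express distortion percentages as dB relative to the fundamental,
# using dB = 20 * log10(amplitude ratio).
for pct in (0.02, 0.01, 10.0):
    ratio = pct / 100.0
    db = 20 * math.log10(ratio)
    print(f"{pct:5.2f}% distortion = {db:6.1f} dB relative to the fundamental")
```

That puts 0.02% at roughly -74 dB and 0.01% at -80 dB, while the 10% figure cited for 2nd-order distortion sits at only -20 dB.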
It would be easy for the measurement crowd to win the argument by simply running an experiment that shows a statistical correlation between distortion at the levels measured in consumer amps and either listener preference or even listeners' ability to distinguish between two amplifiers. I have not seen that demonstrated.
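To be concrete about what "demonstrated" would mean, the distinguishability half of such an experiment is usually analyzed as a blind ABX test with a simple binomial significance check. A minimal sketch (plain Python; the trial counts are made up for illustration):

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """One-sided probability of getting at least `correct` hits out of
    `trials` ABX trials if the listener is purely guessing (p = 0.5)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# Hypothetical result: 14 correct identifications out of 20 trials.
print(abx_p_value(14, 20))  # ~0.058, not quite significant at the 5% level
```

A convincing result would pair scores like this, across many listeners, with the measured distortion differences between the amplifiers under test.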