Can the generally published spec for amplifiers' input impedance, e.g., 47K ohms for the Vincent, be simply measured at the input ports, presumably with the amplifier turned on, or is there a lot more involved in taking this measurement? Thanks again for all the education.

No, in general I don't think that measurement could or should be performed with a simple multimeter, Kalali, if that is what you are asking, for several reasons. First, the input impedance at the zero Hz (DC) frequency put out by the meter might be much different from the input impedance within the audio band. The input impedance might even be essentially infinite at 0 Hz if a coupling transformer or coupling capacitor is present, while being far lower at audio frequencies. Also, depending on the specific designs of the amp and the meter, the amp might be over-driven. And if a resistance scale were chosen that resulted in a suitably low test voltage being applied, I'd imagine that in many cases a meaningful reading would not be obtainable.
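To put a number on the coupling capacitor point: a capacitor's reactance is 1/(2πfC), which grows without bound as frequency approaches 0 Hz. As a purely illustrative example, a hypothetical 1 µF input coupling cap would present about 3.4K ohms at 47 Hz, but it is effectively an open circuit to a meter's DC test signal, so the meter would read something close to infinite regardless of what the audio-band input impedance actually is.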
A good way to perform such a measurement, which I suspect is how John Atkinson does it for Stereophile, is to use an audio frequency signal generator whose output voltage and output impedance can be varied in a controlled manner. Apply a suitable voltage with the generator's output impedance set close to zero, and note the amplitude at the amp's input. Then raise the generator's output impedance until the unit under test loads that amplitude down to half. At that point the generator's output impedance setting equals the input impedance of the device under test, at the frequency being generated.
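For those who like to see the arithmetic behind that, the basis is the simple voltage divider formed by the generator's output impedance and the amp's input impedance: Vin = Vgen × Zin/(Zout + Zin). Setting Vin equal to half of Vgen forces Zout = Zin. Here is a minimal Python sketch of the idea, assuming purely resistive impedances at the test frequency; the 47K figure is just the hypothetical value echoed from the Vincent example above:

```python
# Illustration of the half-voltage method for measuring input impedance.
# Assumes purely resistive impedances at the test frequency; the 47K value
# is hypothetical, echoing the figure mentioned above.

Z_IN = 47_000.0   # amplifier input impedance (ohms), unknown in practice
V_GEN = 0.5       # generator open-circuit output voltage (volts)

def v_at_input(z_out: float) -> float:
    """Voltage seen at the amp's input for a given generator output impedance.

    The generator's output impedance and the amp's input impedance form a
    voltage divider: V_in = V_gen * Z_in / (Z_out + Z_in).
    """
    return V_GEN * Z_IN / (z_out + Z_IN)

# Reference reading with the generator's output impedance near zero:
v_ref = v_at_input(0.0)

# Raise the generator's output impedance until the reading falls to half
# the reference; that impedance setting equals the amp's input impedance.
z_out = 0.0
while v_at_input(z_out) > v_ref / 2:
    z_out += 1.0  # 1-ohm steps, coarse but adequate for illustration

print(f"Half-voltage reached at Z_out of about {z_out:,.0f} ohms")  # ~47,000
```

The sketch simply mimics what the bench procedure does by hand: sweep the generator's output impedance upward until the divider halves the reading, at which point the dial setting is the measured input impedance.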
Best regards,
-- Al