Amplifier Input Impedance


Why are so many high-power solid-state amps designed with such low input impedances? Doesn't that really low input impedance limit the range of preamps that can be used? Are there technical reasons why designers make these impedances so low? Why not design your muscle amp with a really high input impedance so it will potentially work well with all preamps?
stickman451
George,

How does noise relate to your above description of input impedance, gain, etc., if at all?

Lynne
If you use unshielded interconnects, then yes, a high input impedance will be more susceptible to RF noise, but it's very unlikely you will hear anything unless you live right next to a taxi/cab rank, an analogue cell tower, an AM/FM radio tower, or similar.
As we are at the high-level, least interference-sensitive end of the system here, it would be a different story if we were looking at the input impedance of, say, a phono stage or an MC cartridge step-up device feeding a phono preamp.
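A rough back-of-envelope sketch of that point, with purely illustrative numbers (the stray capacitance, frequency and impedances below are assumptions, not measurements): interference reaching an unshielded lead can be modelled as coupling through a small stray capacitance into whatever impedance terminates the line, so the higher the terminating impedance, the more of the interfering voltage appears at the input.

```python
# Illustrative sketch only: capacitive pickup of an interferer into an
# unshielded input, for different terminating impedances.  All values assumed.
import math

def coupled_fraction(f_hz, c_stray_f, z_node_ohms):
    """Fraction of the interfering voltage appearing across the input node,
    modelled as a divider: C_stray in series, resistive Z_node to ground."""
    z_c = 1.0 / (2 * math.pi * f_hz * c_stray_f)
    return z_node_ohms / math.hypot(z_node_ohms, z_c)

f = 1e6           # 1 MHz AM-band interferer (assumed)
c_stray = 2e-12   # 2 pF of stray coupling capacitance (assumed)

for z_in in (10e3, 47e3, 1e6):
    frac = coupled_fraction(f, c_stray, z_in)
    print(f"Z_in = {z_in/1e3:6.0f}k: pickup = {frac*100:5.1f}% of the interfering voltage")
```

In practice a connected preamp's low output impedance shunts the amp's input node, which is a large part of why this rarely becomes audible.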

Cheers George

Many people here seem to be overlooking the fact that the input to a conventional solid-state amplifier is AC coupled, and while any "termination" resistor after the coupling capacitor does affect the AC input impedance, its primary purpose is to provide input bias current to the input transistor.
Also, input impedance per se has nothing to do with DC offset and cannot introduce it by itself; what matters is the balance of the DC source impedances seen by the two sides of the input stage.
Most textbooks on the subject of bipolar differential amplifiers discuss this thoroughly. Since the input bias current of a bipolar transistor varies with temperature, if the DC source impedances of each side (that is, the input side and the feedback side) are different, then the voltages developed as a result of the bias current are different, leading to an offset condition.
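To put rough numbers on that (a hedged, illustrative sketch only; the bias current, resistances and gain below are assumptions, not values from any particular amp), the input-referred offset is roughly the bias current times the difference in DC source resistances seen by the two sides:

```python
# Back-of-envelope offset estimate for a bipolar diff-pair input.  Assumed values.
i_bias  = 500e-9   # 0.5 uA input bias current (assumed typical for a BJT pair)
r_input = 47e3     # DC resistance seen by the + input (bias/termination resistor)
r_fb    = 1e3      # Thevenin resistance of the feedback divider seen by the - input
dc_gain = 28       # assumed DC closed-loop gain (feedback not decoupled by a cap)

v_offset_in  = i_bias * (r_input - r_fb)   # volts, referred to the input
v_offset_out = v_offset_in * dc_gain       # what would appear at the speaker terminals

print(f"input-referred offset: {v_offset_in*1e3:.1f} mV")
print(f"offset at the output : {v_offset_out*1e3:.0f} mV")
```

If the feedback network is decoupled to ground through a capacitor, the DC gain drops to about unity and only the input-referred figure appears at the output, which is one reason designers keep these resistances modest and matched.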

If a high value for the input bias resistor is desired, then the designer could raise the impedance of the feedback network to match, and this would reduce offset drift, but then the noise would increase as a result of the Johnson noise developed across the feedback resistor ladder. He/she could instead reduce the standing current in the input pair to reduce the bias current, but this would dramatically reduce the slew rate and input-stage transconductance. (BTW, insufficient input-stage current is the true source of TIM - not global feedback.)
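As an illustration of the noise side of that trade-off (values assumed for the sketch, not taken from the thread), here is the Johnson noise of a few possible feedback Thevenin resistances over a 20 kHz bandwidth, referred to the output by an assumed closed-loop gain:

```python
# Johnson (thermal) noise of the feedback network's Thevenin resistance,
# amplified by the closed-loop gain.  Illustrative values only.
import math

k_B  = 1.380649e-23   # Boltzmann constant, J/K
T    = 300.0          # kelvin
bw   = 20e3           # audio bandwidth, Hz
gain = 28             # assumed closed-loop voltage gain

def johnson_noise_v(r_ohms):
    """RMS thermal noise voltage of a resistance over the given bandwidth."""
    return math.sqrt(4 * k_B * T * r_ohms * bw)

for r_fb in (1e3, 10e3, 47e3):
    vn = johnson_noise_v(r_fb)
    print(f"R_fb(thevenin) = {r_fb/1e3:>4.0f}k: {vn*1e9:7.1f} nV in -> {vn*gain*1e6:6.1f} uV at output")
```

This ignores the input transistors' current noise flowing in those resistances, which in a bipolar input stage usually makes high resistances look even worse.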

One could use FETs for the input stage to raise input impedance, but they have a lower transconductance than bipolars, and most of the good ones have voltage ratings a bit on the low side. Their higher impedance is offset at higher frequencies by higher capacitance, which can be reduced by cascoding, but this in turn introduces another HF pole in the input stage's response.
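A quick sketch of why the FET's high resistance stops helping at higher frequencies (the 1 Mohm / 15 pF values are assumptions for illustration only): the input looks like the termination resistor in parallel with the device and wiring capacitance, and the magnitude of that combination falls with frequency.

```python
# |Z| of a resistor in parallel with a capacitor versus frequency.  Assumed values.
import math

def z_parallel_rc(f_hz, r_ohms, c_f):
    """Impedance magnitude of R in parallel with C at frequency f."""
    return r_ohms / math.sqrt(1 + (2 * math.pi * f_hz * r_ohms * c_f) ** 2)

r_in, c_in = 1e6, 15e-12   # 1 Mohm termination, 15 pF effective input capacitance
for f in (1e3, 20e3, 200e3, 1e6):
    z = z_parallel_rc(f, r_in, c_in) / 1e3
    print(f"{f/1e3:7.0f} kHz: |Z_in| ~ {z:8.1f} kohm")
```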

A designer could also add a servo to improve offset, but this is far from free, given that it almost always requires separate, lower-voltage supply rails derived from the rest of the amp. For that matter, he/she could also add an input buffer or a balanced input stage . . . but again, there are more tradeoffs.
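For a sense of scale on the servo idea (a crude first-order sketch with assumed component values, not a design): the servo is essentially an op-amp integrator watching the output for DC, and its RC product has to place the correction well below the audio band.

```python
# Crude sketch of a DC-servo integrator's time constant.  Assumed values.
import math

r_servo = 1e6    # integrator input resistor (assumed)
c_servo = 1e-6   # integrator capacitor (assumed)
f_unity = 1.0 / (2 * math.pi * r_servo * c_servo)   # integrator unity-gain frequency

print(f"servo integrator unity-gain frequency ~ {f_unity:.2f} Hz")
# ~0.16 Hz here, roughly two decades below 20 Hz, so the servo trims DC without
# audibly touching the bass.  The servo op-amp still needs its own low-voltage
# rails and level shifting, which is part of the cost referred to above.
```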

But the real question is, what kind of source impedance should an amplifier designer reasonably expect to see from the driving source? And in a world where only the wimpiest of preamplifiers have an issue with a 10K-50K load, how much extra cost and design effort is one willing to spend in order to satisfy a few oddball cases?
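To see why only very high output impedance ("wimpy") preamps care, here is a hedged, illustrative calculation (the output impedances, input impedances, and the 2.2 uF coupling cap are all assumptions): the preamp output impedance and the amp input impedance form a voltage divider, and any output coupling cap forms a high-pass filter with those same two resistances.

```python
# Preamp-into-amp loading: divider loss and coupling-cap bass corner.  Assumed values.
import math

def loading_loss_db(r_out, r_in):
    """Level lost to the output-impedance / input-impedance voltage divider."""
    return 20 * math.log10(r_in / (r_in + r_out))

def hp_corner_hz(c_out, r_out, r_in):
    """-3 dB point of a preamp output coupling cap working into the amp input."""
    return 1.0 / (2 * math.pi * c_out * (r_out + r_in))

for r_out in (100.0, 600.0, 3000.0):        # assumed preamp output impedances, low to high
    for r_in in (10e3, 47e3):               # typical power-amp input impedances
        print(f"Zout = {r_out:5.0f} ohm, Zin = {r_in/1e3:3.0f}k: "
              f"loss {loading_loss_db(r_out, r_in):6.2f} dB, "
              f"bass corner {hp_corner_hz(2.2e-6, r_out, r_in):5.2f} Hz (2.2 uF cap)")
```

Only the highest output impedance into the lowest input impedance loses a noticeable couple of dB, and the bass corner stays below 10 Hz with a cap this size; a much smaller coupling cap driving the 10k load is where the low end would genuinely start to suffer.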
John (Jmcgrogan2) & Bombaywalla, see what I meant when I said that "I don't have a good feel for what the inevitable tradeoffs would be." :-)

Thanks, Kirk. Good to see you here again.

Best regards,
-- Al
But the real question is, what kind of source impedance should an amplifier designer reasonably expect to see from the driving source? And in a world where only the wimpiest of preamplifiers have an issue with a 10K-50K load, how much extra cost and design effort is one willing to spend in order to satisfy a few oddball cases?

^^ This.