Amplifier Input Impedance


Why are so many high-power solid state amps designed with such low input impedances? Doesn't that really low input impedance limit the range of preamps that can be used? Are there technical reasons why designers make these impedances so low? Why not design your muscle amp with a really high input impedance so it will potentially work well with all preamps?
stickman451
Thank you, Atmasphere. A crash course in amplifier design is clearly not going to work for me.
The input loading resistor to ground cannot affect gain.
But in a solid-state amp with a bipolar-input, DC-coupled front end, raising it too much means that with an open-circuit input it will affect the amount of DC offset seen at the speaker terminals, which could be beyond the reach of any DC servo to correct. But once something is plugged into the input (e.g. a preamp or DAC), the low output impedance of that device sets the impedance seen at the amp's input, and everything goes back to how it was with a smaller input loading resistor.

(As for what varies gain in an amp, it's the feedback loop and the values of the resistors that raise and lower the amount of feedback.)
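
To make that concrete, here's a quick sketch with made-up resistor values for a generic non-inverting feedback stage (not any particular amp):

```python
import math

# Generic non-inverting feedback stage: closed-loop gain = 1 + Rf/Rg.
# Resistor values are illustrative only, not from any particular amp.
R_F = 22_000   # feedback resistor, ohms
R_G = 1_000    # shunt leg of the feedback divider, ohms

gain = 1 + R_F / R_G
print(f"closed-loop gain: {gain:.0f}x = {20 * math.log10(gain):.1f} dB")
# The input loading resistor appears nowhere in this formula: swapping
# 10k for 100k at the input leaves the gain unchanged.
```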

So in a tube amp, or a solid-state amp with a FET input, you can raise the input loading resistor.

But in a solid-state amp with bipolar input transistors you can also raise it within reason, to 68k or even 100k, but you should never turn the amp on without anything plugged into the input. Otherwise you run the risk of a lot of DC going to the speakers.
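
Here's a rough back-of-envelope sketch of that risk, with assumed values for bias current, feedback resistance and DC gain (real amps vary a lot):

```python
# Assumed, illustrative values for a bipolar-input power amp; real designs vary.
I_BIAS = 1e-6       # input bias current, ~1 uA
R_FB_DC = 22e3      # DC resistance seen by the inverting input, ohms
DC_GAIN = 1         # unity if a cap rolls off DC gain in the feedback leg;
                    # a fully DC-coupled amp multiplies this by 20-30x

def output_offset(r_source, r_load):
    """Output DC offset from bias current flowing in mismatched DC paths."""
    r_plus = r_source * r_load / (r_source + r_load)  # parallel combination
    return I_BIAS * (r_plus - R_FB_DC) * DC_GAIN

# (With the stock 22k loading resistor the two DC paths match and offset is ~0.)
OPEN = 1e12  # approximately open-circuit input
for r_in in (68e3, 100e3):
    print(f"open input, {r_in/1e3:.0f}k loading resistor: {1e3*output_offset(OPEN, r_in):+4.0f} mV")

# A plugged-in preamp's low output impedance shunts the loading resistor,
# and the offset falls back to roughly the stock value:
print(f"100-ohm preamp, 100k loading resistor: {1e3*output_offset(100, 100e3):+4.0f} mV")
```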

Cheers George
George,

How does noise relate to your above description of input impedance, gain, etc., if at all?

Lynne
If you use unshielded interconnects, then yes, a high input impedance will be more susceptible to RF noise, but it's very unlikely you will hear anything unless you live next to a taxi/cab rank, an analogue cell tower, an AM/FM radio tower, or similar.
As we are at the low-interference end of the system, it would be a different story if we were looking at the input impedance of, say, a phono stage or an MC cartridge step-up device.

Cheers George

Many people here seem to be overlooking the fact that the input to a conventional solid-state amplifier is AC coupled, and while any "termination" resistor after the coupling cap does affect the AC input impedance, its primary purpose is to provide a DC path for the input transistor's bias current.
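
To illustrate with assumed values, that resistor together with the input coupling capacitor also sets the low-frequency corner, which is one reason it can't be made arbitrarily small:

```python
import math

# Assumed example: a 2.2 uF input coupling cap into the termination resistor.
C_IN = 2.2e-6  # farads

for r_term in (10e3, 47e3, 100e3):
    f3 = 1 / (2 * math.pi * r_term * C_IN)
    print(f"{r_term/1e3:>4.0f}k termination: -3 dB corner at {f3:.2f} Hz")
# A lower termination resistance demands a bigger cap for the same bass extension.
```
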
And input impedance in itself has nothing at all to do with DC offset, and cannot introduce it.
Most textbooks on the subject of bipolar differential amplifiers discuss this thoroughly. Since the input bias current of a bipolar transistor varies with temperature, if the DC source impedances of each side (that is, the input side and the feedback side) are different, then the voltages developed as a result of the bias current are different, leading to an offset condition.
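
A quick sketch with assumed bias and offset currents shows why matching the two DC source impedances matters:

```python
# Assumed illustrative currents for a bipolar diff pair.
I_BIAS = 1e-6      # bias current into each input, amps
I_OFFSET = 0.1e-6  # residual mismatch between the two bias currents, amps

def input_offset(r_plus, r_minus):
    """First-order input-referred offset from bias and offset currents."""
    return I_BIAS * (r_plus - r_minus) + I_OFFSET * r_minus

print(f"mismatched, 100k vs 22k:  {1e3*input_offset(100e3, 22e3):.1f} mV input-referred")
print(f"matched,    100k vs 100k: {1e3*input_offset(100e3, 100e3):.1f} mV input-referred")
# Matching cancels the bias-current term; what remains scales with the
# offset current, which is why high impedances still drift more.
```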

If a high value for the input bias resistor is desired, then the designer could raise the impedance of the feedback network to match, and this would reduce offset drift, but then the noise would increase as a result of the Johnson noise developed across the feedback resistor ladder. He/she could instead reduce the standing current in the input pair to reduce the bias current, but this would dramatically reduce the slew rate and input-stage transconductance. (BTW, insufficient input-stage current is the true source of TIM - not global feedback.)
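
Rough numbers on that tradeoff, again with assumed values: resistor Johnson noise scales as the square root of resistance, and the slew rate of a simple Miller-compensated stage scales with the input-pair current:

```python
import math

K_B, T = 1.38e-23, 300  # Boltzmann constant (J/K), room temperature (K)

def johnson_nv(r_ohms):
    """Thermal (Johnson) noise density of a resistor, nV per root-hertz."""
    return math.sqrt(4 * K_B * T * r_ohms) * 1e9

print(f"1k feedback network:  {johnson_nv(1e3):.1f} nV/rtHz")
print(f"22k feedback network: {johnson_nv(22e3):.1f} nV/rtHz")

# Slew rate of a simple Miller-compensated stage: SR ~ I_tail / C_comp,
# so halving the input-pair current halves the available slew rate.
C_COMP = 100e-12  # 100 pF compensation cap (assumed)
for i_tail in (2e-3, 1e-3):  # amps (assumed)
    print(f"I_tail = {i_tail*1e3:.0f} mA: SR ~ {i_tail / C_COMP / 1e6:.0f} V/us")
```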

One could use FETs for the input stage to raise input impedance, but they have a lower transconductance than bipolars, and most of the good ones have voltage ratings a bit on the low side. Their higher impedance is offset at higher frequencies by higher capacitance, which can be reduced by cascoding, but this in turn introduces another HF pole in the input stage's response.
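
With assumed figures, the pole formed by the source impedance against the FET's input capacitance looks like this:

```python
import math

# Assumed: ~20 pF effective (Miller-multiplied) FET input capacitance.
C_IN = 20e-12  # farads

for r_source in (100, 10e3, 100e3):
    f_pole = 1 / (2 * math.pi * r_source * C_IN)
    print(f"{r_source:>6.0f} ohm source: pole at {f_pole/1e3:,.0f} kHz")
# Harmless from a 100-ohm preamp, but from a 100k volume pot the pole
# starts crowding the audio band; cascoding helps, at the cost of
# another HF pole as noted above.
```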

A designer could also add a servo to improve offset, but this is far from free, given that it almost always requires lower-voltage supply rails than the rest of the amp. For that matter, he/she could also add an input buffer or balanced input stage . . . but again, there are more tradeoffs.

But the real question is, what kind of source impedance should an amplifier designer reasonably expect to see from the driving source? And in a world where only the wimpiest of preamplifiers have an issue with a 10K-50K load, how much extra cost and design effort is one willing to spend in order to satisfy a few oddball cases?