Many people here seem to be overlooking the fact that the input to a conventional solid-state amplifier is AC coupled, and while any "termination" resistor after the coupling capacitor does affect the AC input impedance, its primary purpose is to provide a DC path for the input transistor's bias current.
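To put rough numbers on that (all component values below are assumed, typical figures, not taken from any particular design), here's a quick Python sketch of what that resistor is actually doing:

```python
# Minimal sketch with assumed values: the coupling cap plus the
# "termination"/bias-return resistor at a BJT input.
import math

C_in = 2.2e-6      # input coupling capacitor, farads (assumed)
R_bias = 47e3      # bias-return resistor to ground, ohms (assumed)
I_bias = 0.5e-6    # input transistor base (bias) current, amps (assumed)

# The resistor sets the AC input impedance in the passband and, with the cap,
# the low-frequency -3 dB corner of the high-pass formed at the input.
f_corner = 1.0 / (2 * math.pi * R_bias * C_in)

# Its other job: it is the DC path for the base current, so that current
# develops a DC voltage across it at the transistor's base.
V_base_dc = I_bias * R_bias

print(f"-3 dB corner: {f_corner:.2f} Hz")                        # ~1.5 Hz
print(f"DC drop across bias resistor: {V_base_dc*1e3:.1f} mV")   # ~23.5 mV
```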
If a high value for the input bias resistor is desired, then the designer could raise the impedance of the feedback network to match, which would reduce offset drift, but the noise would then increase as a result of the Johnson noise developed across the feedback resistor ladder. He/she could instead reduce the standing current in the input pair to reduce the bias current, but this would dramatically reduce the slew rate and input-stage transconductance. (BTW, insufficient input-stage current is the true source of TIM - not global feedback.)
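As a back-of-the-envelope illustration of those two trade-offs (the resistor values, tail currents, and compensation cap below are assumed, and a conventional two-stage Miller-compensated topology is taken for granted):

```python
# Hedged sketch of the trade-offs described above, with assumed example values.
# (a) the Johnson noise penalty of scaling up the feedback network to match a
#     high bias resistor, and
# (b) how cutting the input-pair tail current cuts slew rate and gm.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # temperature, K
bandwidth = 20e3     # audio bandwidth, Hz

def johnson_noise_vrms(R, bw=bandwidth, temp=T):
    """RMS thermal (Johnson) noise voltage of a resistance R over bandwidth bw."""
    return math.sqrt(4 * k_B * temp * R * bw)

# DC resistance seen at the inverting input is roughly Rf || Rg.
for Rf, Rg in [(10e3, 500.0), (200e3, 10e3)]:   # low-Z vs scaled-up ladder
    R_par = Rf * Rg / (Rf + Rg)
    print(f"Rf={Rf/1e3:.0f}k, Rg={Rg/1e3:.1f}k -> "
          f"{johnson_noise_vrms(R_par)*1e9:.0f} nV rms input-referred")

# Reducing tail current to lower bias current also lowers slew rate and gm
# (two-stage, Miller-compensated topology assumed: SR ~ I_tail / Cc).
V_T = 0.02585        # thermal voltage at ~300 K, volts
C_comp = 100e-12     # Miller compensation cap, farads (assumed)
for I_tail in (2e-3, 0.2e-3):
    gm = (I_tail / 2) / V_T       # per-transistor transconductance, gm = Ic/Vt
    slew_rate = I_tail / C_comp   # V/s
    print(f"I_tail={I_tail*1e3:.1f} mA: gm={gm*1e3:.1f} mS, "
          f"SR={slew_rate/1e6:.1f} V/us")
```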
One could use FETs for the input stage to raise input impedance, but they have a lower transconductance than bipolars, and most of the good ones have voltage ratings a bit on the low side. Their higher impedance is offset at higher frequencies by higher capacitance, which can be reduced by cascoding, but this in turn introduces another HF pole in the input stage's response.
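Rough numbers for the gm comparison and the extra cascode pole, using assumed "typical" device figures rather than anything from a specific part:

```python
# Numerical sketch of the FET-vs-BJT points above; all device figures assumed.
import math

V_T = 0.02585                      # thermal voltage, ~26 mV at room temp
I_bias_per_device = 1e-3           # 1 mA per input device (assumed)

gm_bjt = I_bias_per_device / V_T   # BJT: gm = Ic/Vt, ~38.7 mS at 1 mA
gm_jfet = 5e-3                     # typical audio JFET at 1 mA: a few mS (assumed)
print(f"gm BJT  ~ {gm_bjt*1e3:.1f} mS")
print(f"gm JFET ~ {gm_jfet*1e3:.1f} mS")

# The FET's higher impedance is eroded at HF by input capacitance; cascoding
# shrinks the effective (Miller-multiplied) capacitance but adds a pole at the
# cascode node, roughly 1/(2*pi*R_node*C_node).
R_cascode_node = 200.0             # ohms, ~1/gm of the cascode device (assumed)
C_cascode_node = 20e-12            # stray + device capacitance, farads (assumed)
f_pole = 1.0 / (2 * math.pi * R_cascode_node * C_cascode_node)
print(f"extra cascode-node pole ~ {f_pole/1e6:.0f} MHz")
```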
A designer could also add a DC servo to improve offset, but this is far from free, given that it almost always requires lower-voltage supply rails than the rest of the amp uses. For that matter, he/she could also add an input buffer or a balanced input stage . . . but again, there are more tradeoffs.
But the real question is, what kind of source impedance should an amplifier designer reasonably expect to see from the driving source? And in a world where only the wimpiest of preamplifiers have an issue with a 10K-50K load, how much extra cost and design effort is one willing to spend in order to satisfy a few oddball cases?
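For what it's worth, a quick divider calculation (assuming a 1K output impedance, which is already on the high side for most preamplifiers) shows how small the loss into a 10K-50K load really is:

```python
# Quick sanity check of the loading claim, with an assumed source impedance.
import math

R_load_values = (10e3, 50e3)   # amplifier input impedances under discussion
R_source = 1e3                 # preamp output impedance (assumed; many are far lower)

for R_load in R_load_values:
    gain = R_load / (R_source + R_load)   # simple voltage divider
    loss_db = 20 * math.log10(gain)
    print(f"{R_source/1e3:.0f}k source into {R_load/1e3:.0f}k load: {loss_db:.2f} dB")
```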
Finally, input impedance per se has nothing to do with DC offsets and cannot introduce them; what matters is the DC source impedance seen by each input. Most textbooks on bipolar differential amplifiers discuss this thoroughly. Since the input bias current of a bipolar transistor varies with temperature, if the DC source impedances of each side (that is, the input side and the feedback side) are different, then the voltages developed as a result of the bias current are different, leading to an offset condition.
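A small worked example of that mechanism, with assumed values for the bias current and the two DC source impedances:

```python
# Hedged numeric example of the offset mechanism described above, with assumed
# values: bias current flowing through unequal DC source impedances on the two
# inputs produces an input-referred offset, which drifts as the bias current
# moves with temperature.
I_bias_cold = 1.0e-6    # base bias current at turn-on, amps (assumed)
I_bias_warm = 0.6e-6    # bias current after warm-up, amps (assumed)

R_plus = 47e3           # DC resistance seen by the non-inverting input (bias resistor)
R_minus = 1e3           # DC resistance seen by the inverting input (feedback network)

for label, I_b in (("cold", I_bias_cold), ("warm", I_bias_warm)):
    V_os = I_b * (R_plus - R_minus)   # differential voltage from the mismatch
    print(f"{label}: input-referred offset ~ {V_os*1e3:.1f} mV")

# Matching R_minus to R_plus makes the two voltage drops track each other,
# which is exactly the "raise the feedback impedance to match" option above.
```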