I should add, you probably should listen to the combination.
The worst that happens is a loss of highs at the extremes of the volume knob.
You may not care, or you may even like it. :)
Best,
E
|
Almarg, your assistance is needed... |
I was going to guess -- "noise". It's not how I design, but it's legitimate. It may also be detectable or easily changed. Often it's just a resistor.
|
Thanks for the mention, John (Roxy54). Following are some excerpts from this thread, in which the same question was discussed several years ago:
Almarg 2-11-2014
... having never designed an audio power amplifier, I can't speak knowledgeably about what the tradeoffs would be if a solid state one were designed with a high input impedance. Certainly it's readily doable, but I don't have a good feel for what the inevitable tradeoffs would be.
In addition to those tradeoffs ... I don't doubt that in many cases a significant factor is a lack of motivation to provide compatibility with tube preamps.
Kirkus 2-13-2014
... Since the input bias current of a bipolar transistor varies with temperature, if the DC source impedances of each side (that is, the input side and the feedback side) are different, then the voltages developed as a result of the bias current are different, leading to an offset condition.
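To put rough numbers on the offset mechanism described in the excerpt above, here is a minimal Python sketch. The bias current, resistor values, and gain are assumptions chosen only to show the order of magnitude; they aren't taken from any particular amplifier.

```python
# Back-of-the-envelope offset estimate (all values are illustrative assumptions).
I_BIAS = 1e-6          # 1 uA input bias current, plausible for a bipolar input pair
R_INPUT = 47e3         # 47k input bias resistor, chosen high to suit a tube preamp
R_FEEDBACK_DC = 1e3    # 1k effective DC impedance seen at the feedback side
CLOSED_LOOP_GAIN = 28  # typical power-amp voltage gain (~29 dB)

# Each input develops a DC voltage equal to bias current times its DC source impedance.
v_input_side = I_BIAS * R_INPUT
v_feedback_side = I_BIAS * R_FEEDBACK_DC

# The difference between the two is amplified by the closed-loop gain
# and appears as DC offset at the output.
input_offset = v_input_side - v_feedback_side
output_offset = input_offset * CLOSED_LOOP_GAIN

print(f"Input-referred offset: {input_offset * 1e3:.1f} mV")
print(f"Output offset:         {output_offset * 1e3:.0f} mV")
```

With these assumed numbers the mismatch alone produces on the order of a volt of DC at the output, which is why the designer is pushed either toward a low input resistor or toward raising the feedback impedance to match it.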
If a high value for the input bias resistor is desired, then the designer could raise the impedance of the feedback to match, and this would reduce offset drift, but then the noise would increase as a result of the Johnson noise developed on the feedback resistor ladder. He/she could reduce the standing current in the input pair to reduce the bias current, but this would dramatically reduce the slew rate and input-stage transconductance. (BTW, insufficient input-stage current is the true source of TIM - not global feedback.)
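And a similarly rough sketch of the Johnson-noise side of that tradeoff, using the standard thermal-noise formula sqrt(4kTRB). The two resistor values and the 20 kHz bandwidth are again just assumptions for illustration.

```python
import math

K_BOLTZMANN = 1.380649e-23  # J/K
TEMP_K = 300.0              # approximate room temperature
BANDWIDTH_HZ = 20e3         # assumed audio bandwidth

def johnson_noise_vrms(resistance_ohms):
    """RMS thermal noise voltage of a resistor: sqrt(4 * k * T * R * B)."""
    return math.sqrt(4 * K_BOLTZMANN * TEMP_K * resistance_ohms * BANDWIDTH_HZ)

# Compare a low-impedance feedback network with one raised to match
# a high (47k) input bias resistor.
for r in (1e3, 47e3):
    print(f"{r / 1e3:>4.0f} kohm -> {johnson_noise_vrms(r) * 1e6:.2f} uV rms")
```

Roughly a 7x noise penalty for matching the 47k resistor instead of the 1k network, and that noise is then multiplied by the amplifier's gain.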
One could use FETs for the input stage to raise input impedance, but they have a lower transconductance than bipolars, and most of the good ones have voltage ratings a bit on the low side. Their higher impedance is offset at higher frequencies by higher capacitance, which can be reduced by cascoding, but this in turn introduces another HF pole in the input stage's response.
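As a rough illustration of the capacitance point, here is a sketch of the reactance 1/(2*pi*f*C) of an assumed effective FET input capacitance across the audio band. The 100 pF figure is a placeholder (device capacitance plus Miller effect), not a measured value for any device.

```python
import math

C_INPUT_F = 100e-12  # hypothetical effective input capacitance

def reactance_ohms(freq_hz, cap_f=C_INPUT_F):
    """Magnitude of capacitive reactance: 1 / (2 * pi * f * C)."""
    return 1.0 / (2 * math.pi * freq_hz * cap_f)

for f in (1e3, 10e3, 20e3, 100e3):
    print(f"{f / 1e3:>5.0f} kHz -> {reactance_ohms(f) / 1e3:,.0f} kohm")
```

Even a modest capacitance starts to shunt a very high input resistance toward the top of the band and above, which is the sense in which the FET's impedance advantage erodes at higher frequencies; cascoding reduces the effective capacitance, at the cost of the extra HF pole mentioned above.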
A designer could also add a servo to improve offset, but this is far from free, given the fact that it almost always requires lower-voltage supply rails from the rest of the amp. For that matter, he/she could also add an input buffer or balanced input stage ... but again, there are more tradeoffs.
But the real question is, what kind of source impedance should an amplifier designer reasonably expect to see from the driving source? And in a world where only the wimpiest of preamplifiers have an issue with a 10K-50K load, how much extra cost and design effort is one willing to spend in order to satisfy a few oddball cases?
Almarg 2-13-2014
... see what I meant when I said that "I don't have a good feel for what the inevitable tradeoffs would be." :-)
Regards,
-- Al
|
I noticed that at least some SMcAudio-revised McCormack amps have much lower input impedance compared to the stock models. I don’t know what specifically Steve does in the revisions that produces this, but given the results he clearly feels it’s a worthwhile tradeoff for improved sonics. And given Steve’s practical nature and Kirkus’s point about not limiting the sonic benefit for 95% of audiophiles just to accommodate a few outliers, the choice of a lower input impedance seems like a pretty reasonable one in the scheme of things. No?
As a stupid follow-up question, does increasing the bias in an SS amp have an effect on input impedance? Or, put another way, could you increase the operating bias in an amp without raising the input impedance, and if you did that, what would be the sonic tradeoffs? Of course, there are Class-A amps that seem to have reasonably low input impedances, so maybe I answered my own question. Yes, I know, I’m a moron. |