Impedance match


My understanding of impedance matching a preamp or upstream source that has a volume control (e.g., a DAC with volume control) to a downstream amplifier is that an appropriate “impedance match” means the amplifier's input impedance should be at least 20- to 50-fold greater than the output impedance of the upstream preamp or source. One dealer told me that the appropriate “impedance match” between two such components is exactly that: identical impedances. 

So which is it?
celander
 
noble100

    I always read and was taught by members here that the guideline was an input impedance at least 10x the output impedance, and that anything greater will work well together, too. An identical match is definitely not correct.

This applies source to pre, and pre to amp.
Once you have an "Output to Input" impedance ratio of say 1:10 or more, you're fine.
 
We had a large participant demo at our audio society meeting; about 35-40 "golden ear'ed" audiophiles were present.
I designed a switchable, on-the-fly input impedance changer on an excellent amp that was in a very good system.
 
This impedance changer changed the "O/I" impedance ratio in 20 increments from 1:100 down to 1:5. It was at 1:5 that only two "super golden ear'ed" audiophiles "thought" they could hear a difference for the worse, but they also both said they probably couldn't pick it in a blind A/B.
All said they heard a difference at 1:3.
AC levels were checked to the millivolt to confirm they were the same at every ratio.
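For reference, the level drop at each of those ratios follows from the simple voltage divider formed by the source's output impedance driving the amp's input impedance. The impedance values below are purely illustrative (the actual values used in the demo aren't stated); only the ratios matter for the result:

```python
import math

def divider_loss_db(z_out, z_in):
    """Level drop (dB) across the divider formed by a source's output
    impedance z_out driving an amp's input impedance z_in."""
    return 20 * math.log10(z_in / (z_in + z_out))

# Illustrative: a 1 kohm source output into various input impedances.
for ratio in (100, 10, 5, 3):
    print(f"1:{ratio} -> {divider_loss_db(1_000, 1_000 * ratio):.2f} dB")
    # 1:100 -> -0.09 dB, 1:10 -> -0.83 dB, 1:5 -> -1.58 dB, 1:3 -> -2.50 dB
```

Note these are flat, frequency-independent losses, i.e. just a small volume change, which is why equalizing the AC levels across ratios was important for the comparison.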

Cheers George      
georgehifi,

     Interesting experiment. You used slightly different phrasing, but thanks for validating the general impedance matching guideline of the amp input impedance being at least 10x the output impedance of the preamp or source component (if the source is connected directly to the amp), or, in your phrasing, an output/input ratio of 1:10.

Tim

Yes, when the output impedance to input impedance (O/I) ratio got down to 1:5, a couple of listeners said they could hear the dynamics getting a little affected, but they said it was very small and didn't think they could have blind A/B'd it.
But most of the 40-odd listeners said they could just detect a 1:3 ratio. So a safe bet would be say 1:6 or higher.

Cheers George 
Thanks for the excellent input, George.

An additional point everyone should be aware of is that how objectionable a given low ratio is likely to be depends not only on the ratio itself but also on how much that ratio varies over the frequency range.

For example, if the ratio is say 3:1 at the worst case frequency (i.e., at the frequency at which the ratio is lowest), but the impedances that are involved don’t vary much over the frequency range, and hence the ratio doesn’t vary much over the frequency range, the consequences would be a slight and inconsequential reduction in gain; an increased sensitivity to cable effects (especially if the low ratio is due mainly to a high output impedance of the component providing the signal); and perhaps a small degradation in the distortion performance of the component providing the signal.

However, if the low ratio involves the kind of output impedance characteristic that occurs in many tube-based components, where the output impedance may be a few hundred ohms over most of the spectrum but may rise in the deep bass region to a few thousand ohms at 20 Hz, the consequences of that same 3:1 ratio (at 20 Hz in this case) will include significant deep bass rolloff as well as frequency-dependent phase shifts in the bass region. Those effects are likely to be much more noticeable and objectionable than the ones described in the preceding paragraph.

Or putting it all another way, 3:1 may be fine in some cases, while 8:1 may be unacceptable in others, depending on how the impedances, and consequently those ratios, vary over the frequency range.
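To put rough numbers on the tube-output scenario above, here is a sketch using hypothetical but typical values: a 300 ohm midband output impedance rising to 3 kohms at 20 Hz (e.g., due to an output coupling capacitor), driving an assumed 10 kohm amp input:

```python
import math

def divider_loss_db(z_out, z_in):
    """Level drop (dB) of the output-impedance / input-impedance divider."""
    return 20 * math.log10(z_in / (z_in + z_out))

Z_IN = 10_000        # assumed amp input impedance, ohms
Z_OUT_MID = 300      # assumed preamp output impedance in the midband
Z_OUT_20HZ = 3_000   # assumed output impedance at 20 Hz

mid = divider_loss_db(Z_OUT_MID, Z_IN)    # about -0.26 dB
bass = divider_loss_db(Z_OUT_20HZ, Z_IN)  # about -2.28 dB
print(f"relative rolloff at 20 Hz: {bass - mid:.2f} dB")
```

That is roughly 2 dB of frequency-dependent shelving in the deep bass, versus essentially no audible consequence when the same worst-case ratio holds flat across the whole band.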

Regards,
-- Al