1.2k Output Impedance to 10k Input Impedance


I have a tube preamp with single-ended outputs only; it has no output buffer and therefore a fairly high output impedance of 1.2k. My SS power amp has input impedances of 10k on RCA or 20k on XLR.

I have been advised by the designer of the preamp that anything lower than 50k or so will start loading down the line stage, so it would seem an imperfect pairing. My assumption is that at the very least I’d want to use the XLR inputs on the amp, as they present a 20k load to the preamp instead of the 10k via the RCA inputs.

I have an RCA-XLR interconnect, but I’ve been told that simply using this will not result in a 20k load on the preamp and that the only way to accomplish this would be to use a passive transformer in line, such as the Jensen ISO-Max DM2-2RX.

For those who have experimented with these devices, do they result in any sonic degradation?

I'm wondering which might be the lesser evil – introducing an additional circuit and set of connections into the signal path via the DM2-2RX transformer, or running with a less-than-ideal impedance match between preamp and power amp. The answer is, of course, to try both and see; I'm just wondering if anyone has gone down this path before and what they found (heard).

Any inputs would be greatly appreciated.
srosenberg
I thought the general rule of thumb on impedance matching was that the amp's input impedance should be at least 10 times the output impedance of the pre. You are just within that envelope, so why not just do it and see (hear)? It might not be perfect, but probably no harm done. Anything patched between them is likely to have a negative audible effect anyway. Others, please correct me if I am wrong here.
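As a rough sanity check, the loading can be sketched as a simple resistive voltage divider. This is a simplification – it ignores any frequency dependence in the preamp's output impedance – but it shows what the nominal numbers imply:

```python
import math

def divider_loss_db(z_out, z_in):
    """Signal loss from the divider formed by the preamp's output
    impedance (z_out) and the amp's input impedance (z_in), in dB.
    0 dB would mean no loss."""
    return 20 * math.log10(z_in / (z_out + z_in))

# Nominal 1.2k source into the loads discussed in this thread.
for z_in in (10_000, 20_000, 50_000):
    print(f"{z_in // 1000}k load: {divider_loss_db(1_200, z_in):.2f} dB")
```

At the nominal 1.2k figure the losses are fractions of a dB even into 10k, which suggests the designer's 50k recommendation is probably about something more than simple level loss.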
recommendations seem to vary here - i've read anywhere from 5:1 to 50:1 as a minimum, though most often 10:1 seems to be cited.

while i am just within the 10:1 guideline going RCA into my amp, the preamp manufacturer specifically stated that anything less than 50:1 would compromise the performance of the line stage, the impact increasing with the load on the preamp. hence my thought to convert to XLR.

If it matters, the preamp is a Counterpoint SA-5.1 and the power amp is a McIntosh MC252.
sorry for the typo,
"anything less than 50:1 would compromise the performance of the line stage..."

should have read,
"anything less than 50k would compromise the performance of the line stage..."
Hi Scott,

IMO, the impedance matching guideline is properly stated as follows (I'm quoting from myself in another thread):
Ideally the input impedance of the power amp should be at least ten times greater than the output impedance of the preamp, at the frequency within the audible range for which the output impedance of the preamp is highest. That frequency will usually be 20 Hz, as a result of the impedance rise caused at low frequencies by the output coupling capacitor most tube preamps use.

A factor somewhat less than 10x may or may not be acceptable, depending mainly on how the preamp's output impedance varies as a function of frequency, and also on the deep bass extension of the speakers.
The fact that the designer suggests a 50K minimum tells me that the preamp's output impedance probably varies considerably as a function of frequency, and that it is much higher at some frequencies, probably including 20 Hz, than the 1.2K nominal figure (which I suspect is specified at 1 kHz).
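To illustrate: the output coupling cap and the total resistance it works into form a high-pass filter, and a heavier (lower-impedance) load pushes the filter's corner upward toward the audible range. A minimal sketch, assuming a hypothetical 1 µF cap purely for illustration (I don't know the SA-5.1's actual value):

```python
import math

def hp_corner_hz(c_farads, r_source, r_load):
    """-3 dB corner of the high-pass filter formed by an output
    coupling capacitor of c_farads working into the series total
    of the source and load resistances."""
    return 1.0 / (2 * math.pi * c_farads * (r_source + r_load))

# Hypothetical 1 uF cap, 1.2k nominal source resistance:
for r_load in (10_000, 50_000):
    f = hp_corner_hz(1e-6, 1_200, r_load)
    print(f"{r_load // 1000}k load: corner at {f:.1f} Hz")
```

With these assumed values the 10k load puts the corner well up into the deep bass, while a 50k load keeps it far below 20 Hz – consistent with the designer's recommendation.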

The 20K balanced input impedance of the amp probably represents the sum of 10K input impedances on each of the two legs (i.e., on each of the two signal inputs that comprise the balanced signal pair). So using an RCA-to-XLR cable will not help, as you were advised.
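In sketch form, assuming the 10K-per-leg figure above, the arithmetic behind why the adapter cable doesn't help:

```python
# Assumption: the amp's 20k balanced input is 10k per leg, and an
# RCA-to-XLR adapter ties the cold leg to ground.
R_LEG = 10_000  # ohms per input leg (assumed)

# True balanced drive: the source works into both legs in series.
balanced_load = 2 * R_LEG   # 20k

# RCA-to-XLR adapter: only the hot leg carries signal, so the
# single-ended source still sees just one 10k leg.
adapter_load = R_LEG        # 10k -- same as the RCA input

print(balanced_load, adapter_load)
```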

Concerning the Jensen transformers, member Mitch2 has an excellent system and has spoken very enthusiastically in past threads about the results he has obtained with them. Atmasphere, on the other hand, has commented that in his experience they do have at least some slight artifacts.

Best regards,
-- Al