Are passive preamps better?


Does a passive preamp with transformers, so that its impedance can be matched to an amplifier, have the potential to provide better sonics than an active line preamp? I have a Simaudio Celeste preamp and a Harman Kardon Citation 7.1 amplifier. Lynne
arnettpartners
Lynne:
Pubul57, why does an amp with a low input impedance affect the frequency response by increasing the mids? Why doesn't it simply drive the pre into distortion?
Pubul57 didn't say that -- he pointed to losses at low frequencies and very high frequencies.
Low input impedance means a lot of energy is required from the source component (the component before the amp) to drive the amp correctly. A "passive" pre doesn't provide energy -- it just attenuates it.
So the task falls on the preceding component -- say, the CD player.

Overall, if the system runs out of steam trying to drive the amp... it's distorting.
(BTW, are you interested in/do you understand, how impedance works?)
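To put rough numbers on that: here is a minimal sketch (in Python; the component values are illustrative assumptions, not measurements of anyone's gear) of what a low amp input impedance does to a capacitor-coupled source feeding a passive:

```python
import math

# Illustrative assumptions -- not measurements of any particular components:
R_SOURCE = 200.0                   # source (e.g. CDP) output impedance, ohms
C_COUPLE = 0.47e-6                 # source output coupling capacitor, farads
AMP_INPUTS = [100e3, 47e3, 10e3]   # amp input impedances to compare, ohms

for r_in in AMP_INPUTS:
    # Resistive divider: the fraction of the source voltage the amp sees
    divider = r_in / (R_SOURCE + r_in)
    # The coupling cap and the input impedance form a high-pass filter;
    # bass below this corner rolls off at 6 dB/octave
    f_hp = 1.0 / (2 * math.pi * (R_SOURCE + r_in) * C_COUPLE)
    print(f"{r_in/1e3:5.0f}k input: level {20*math.log10(divider):6.2f} dB, "
          f"bass corner {f_hp:5.1f} Hz")
```

Into 100k the bass corner of this assumed 0.47 uF cap sits near 3 Hz; into 10k it climbs above 30 Hz. That is the kind of low-frequency loss being described, and it happens without anything audibly "distorting."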
If I can speak to some issues not previously addressed:

A line stage has four functions:

1) supply any missing gain
2) provide volume control and input selection
3) provide buffering of the volume control from external impedances
4) (and least understood) control the interconnect cable.

Passive systems provide only #2. Most line stages provide 1-3. If that is all they can do, there will likely be tradeoffs versus a passive, perhaps in the passive's favor. If, OTOH, the preamp is capable of all four functions, then it is likely that the active line stage will be superior.

This is because the interconnect cable plays a serious role in the system. As any passive owner can tell you, the passive sounds better turned up rather than turned down. This is because the passive cannot control the interconnect. At the extreme opposite, a preamp that *does* control the interconnect will be found to be immune to the type of cable and its length.
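A back-of-the-envelope sketch of why, with an assumed 100k pot and roughly 500 pF of interconnect capacitance (both just illustrative figures): the output impedance of a volume pot peaks near the middle of its rotation, and that impedance forms a low-pass filter with the cable.

```python
import math

# Illustrative assumptions, not any particular product:
R_POT = 100e3      # total resistance of the passive's volume pot, ohms
C_CABLE = 500e-12  # interconnect capacitance (e.g. ~5 m at 100 pF/m), farads

for p in (0.1, 0.25, 0.5, 0.75, 0.9):  # wiper position, 0 = fully attenuated
    # Thevenin output impedance at the wiper (ideal source assumed):
    # lower leg p*R in parallel with upper leg (1-p)*R, maximum R/4 at p = 0.5
    z_out = p * (1 - p) * R_POT
    # That impedance plus the cable capacitance form a low-pass filter
    f_lp = 1.0 / (2 * math.pi * z_out * C_CABLE)
    print(f"wiper at {p:4.0%}: Z_out {z_out/1e3:5.1f} kohm, "
          f"treble corner {f_lp/1e3:6.1f} kHz")
```

Mid-rotation the corner lands just above 12 kHz with these values; turned well up it moves past 35 kHz, which matches the "sounds better turned up" observation. A lower-value pot helps, but then the source has to drive a heavier load, which circles back to the energy problem above.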

The unfortunate thing is that you can count the number of such preamps in the high end audio community on your hands with fingers left over, because most preamp designers do not acknowledge the 4th function.

Of course, different line stages exhibit different levels of competence. This definitely muddies the waters somewhat!

This is a good part of why there is a divergence of opinion. There would be none if everyone could hear a competent line stage that can control the interconnect cable, but even that is not likely, so this debate will continue.
Atmasphere, can you please elaborate on the linestage "controlling" the interconnect cables. I have not yet heard a linestage that is "immune" to cable choice.
Tbg, thus my final comment on my post!

Several decades back, this problem was addressed by the recording/broadcast industry, for pretty much the same reasons that audiophiles deal with today. The result was the balanced line system, which is actually a standard.

The standard requires that the source (preamp) be able to drive a 600 ohm load. There are several practical reasons for this, not the least of which is that the low output impedance of the source can 'swamp out' the effects of the cable caused by capacitance and other construction issues. The result is twofold: a preamp that supports the 600 ohm balanced line standard controls the interconnect so well that the quality of the cable has little bearing on the sound, and the length of the cable becomes all but irrelevant.
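Here is a capacitance-only sketch of that 'swamping' effect (it ignores the termination resistor and transformer behavior, and the per-metre capacitance is an assumed ballpark figure): compare the treble corner of a 600 ohm source against a high-impedance passive as the cable grows.

```python
import math

C_PER_M = 100e-12  # assumed cable capacitance per metre, farads
SOURCES = {
    "600 ohm balanced drive": 600.0,        # per the balanced line standard
    "passive pot near mid-rotation": 25e3,  # illustrative high-Z source
}

for name, z_src in SOURCES.items():
    print(name)
    for length_m in (1, 5, 15, 30):
        c = C_PER_M * length_m
        f_lp = 1.0 / (2 * math.pi * z_src * c)  # RC low-pass corner
        print(f"  {length_m:3d} m: treble corner {f_lp/1e3:8.1f} kHz")
```

Even 30 m of this assumed cable leaves the 600 ohm drive's corner near 90 kHz, far outside the audio band, while the high-impedance source is rolling off in the midrange at the same length. That is why cable quality and length stop mattering much once the source impedance is low.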

Very few preamps in the high end audio world support the 600 ohm standard without sonic artifacts (loss of bass and/or dynamics). Some use output transformers, which is why the termination standard is 600 ohms: it lets the transformer drive a reasonable load without ringing. If there is no transformer, the termination is less important.

If you have ever wondered why 'audio engineers' say that interconnect cables make no difference in the sound, this is why: in their world it is true, because they use low impedance balanced lines. Audiophiles can take advantage of this too, as low impedance balanced lines offer them the same advantages -- the standard was in fact created to solve the sonic artifact issues that audiophiles routinely experience.
Atmasphere, I do recall many saying that in professional applications using balanced cable, the wire is unimportant. I once had the Cello system using Cello Strings, all balanced. We found to our consternation that Siltech balanced cables sounded better than the Cello Strings. I have no idea whether the Cello Suite was able to drive a 600 ohm load, but I suspect that was a design goal. Does this not suggest that there were still sonic differences? I remember thinking that I could not afford to use all Siltech.