What am I missing? Pre/power gain and input sensitivity.


I’ve read a few posts about power amps with lower gain needing a preamp with additional gain, but no one seems to mention input sensitivity in those conversations. If my source outputs 2V and my power amp’s input sensitivity (the input voltage that drives it to full rated power) is only 1.2V, then whether the power amp gain is 16 dB or 26 dB, my preamp is attenuating the signal, and the amount of gain on the preamp doesn’t matter at all. With a given set of speakers, to get the same SPL from two different amps they should just need to output the same voltage, regardless of how they get there.
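
Here’s the arithmetic I mean as a quick Python sketch (the 2 V source and the 16/26 dB gains are my hypothetical numbers above; the 8 V target output is an arbitrary stand-in for "same SPL"):

```python
def db_to_ratio(db):
    """Voltage ratio for a gain in dB: 10 ** (dB / 20)."""
    return 10 ** (db / 20)

source_v = 2.0   # source output, volts RMS (hypothetical figure from the post)
target_v = 8.0   # amp output voltage for some target SPL (arbitrary)

for gain_db in (16, 26):
    needed_in = target_v / db_to_ratio(gain_db)   # amp input for that output
    setting = needed_in / source_v                # preamp gain/attenuation needed
    print(f"{gain_db} dB amp: {needed_in:.2f} V in, preamp at {setting:.2f}x "
          f"({'attenuating' if setting < 1 else 'amplifying'})")
```

Both cases come out attenuating, which is my point.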

Am I missing something?

cat_doorman
See, here's the thing about the internet: zero editorial standards. Anyone can post anything. On this site your odds of being read are actually higher the greater the BS. Our #1 has 21k posts, and when you ask if anyone can find even one of value, the question is removed and the 21k grows to 22k.

So yes, you are missing something: the necessity of DYODD (doing your own due diligence). And no, you are not missing something. They are.

@millercarbon - You probably need to add more fish to your diet, buddy. You’re starting to blather again.
@cat_doorman, in the case of the example you cited, you are not missing anything. The main effect of different preamp gains would simply be that the volume control would be set to different positions for a given SPL.

Preamp gain can often be a significant consideration, though, in the case of phono sources (or tuners, tape decks, and other sources which typically have much lower output voltages than CD players and DACs).

For example, using some typical numbers: if a phono stage provides 40 dB of gain for MM cartridges and 60 dB of gain for LOMC cartridges, and the cartridges are rated to produce 5 mV (MM) or 0.5 mV (LOMC) under the standard test conditions, then in each of those cases the phono stage will output only 0.5 volts when a recording is causing the cartridge to provide its rated output (which corresponds to a volume that is quite high). If substantial additional gain is not provided by a preamp in that situation, much of the power capability of many power amps will not be able to be utilized on many recordings, even with the volume control at max.
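
As a quick sketch of that math in Python (the gains and cartridge outputs are just the example figures above, not fixed standards):

```python
def db_to_ratio(db):
    """Voltage ratio for a gain in dB: 10 ** (dB / 20)."""
    return 10 ** (db / 20)

# (rated cartridge output in volts, phono stage gain in dB)
cases = {"MM": (5e-3, 40), "LOMC": (0.5e-3, 60)}

for name, (cart_v, gain_db) in cases.items():
    out_v = cart_v * db_to_ratio(gain_db)
    print(f"{name}: {cart_v * 1e3:.1f} mV x {db_to_ratio(gain_db):.0f} = {out_v:.2f} V")
```

Either way the phono stage tops out around 0.5 V, well below what a typical power amp needs for full output.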

Regards,
-- Al

The funny thing is, most digital sources, due to the Redbook standard (roughly 2 V RMS at full scale), put out too much voltage for most amps. IOW, connecting them directly to a power amp will cause the amp to run at or near overload!


So the signal has to be knocked down, which is dumb IMO/IME. This requires a volume control, and it's why passive volume controls have become common. The problem is that most passive controls have an artifact: they decrease impact at all frequencies when set below full volume. This problem is eliminated by buffering the control, so you'll need active circuitry in any event; it would have made more sense if the Redbook standard had been set to 1 volt, like tuners and consumer tape machines had been prior to digital.
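
A rough sketch of the overload math (the 26 dB gain and the 100 W / 8 Ω rating are assumed figures for illustration; 2 V RMS at full scale is the conventional CD-player output):

```python
import math

def db_to_ratio(db):
    """Voltage ratio for a gain in dB: 10 ** (dB / 20)."""
    return 10 ** (db / 20)

source_v = 2.0                   # CD/DAC full-scale output, volts RMS
amp_gain_db = 26                 # assumed power amp gain
power_w, load_ohms = 100, 8      # assumed amp rating

demanded_v = source_v * db_to_ratio(amp_gain_db)   # ~40 V the amp is asked for
available_v = math.sqrt(power_w * load_ohms)       # ~28 V at clipping
print(f"demanded {demanded_v:.0f} V vs {available_v:.0f} V available -> "
      f"{'overload' if demanded_v > available_v else 'ok'}")
```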


I've noticed some confusion around this topic over the years, so here are two things related to it (there's a short sketch after this list):

1. The volume control **does not** say how loud you are playing the system. The signal level does.
2. Amplifier gain is different from amplifier power. Two amps can put out the same amount of power, but one can have 25 dB of gain and the other 35 dB; obviously the one with the higher gain will play louder with the same signal level, but in the end both amps will only get as loud as their power allows.
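
Here is the sketch for point 2, assuming two 100 W / 8 Ω amps with the 25 dB and 35 dB gains mentioned:

```python
import math

def db_to_ratio(db):
    """Voltage ratio for a gain in dB: 10 ** (dB / 20)."""
    return 10 ** (db / 20)

power_w, load_ohms = 100, 8                    # same rated power for both amps
full_power_v = math.sqrt(power_w * load_ohms)  # ~28.3 V output at clipping

for gain_db in (25, 35):
    sensitivity_v = full_power_v / db_to_ratio(gain_db)
    print(f"{gain_db} dB gain: full power with {sensitivity_v:.2f} V input")
# Both amps top out at the same 28.3 V output; the higher-gain amp
# simply reaches it with a smaller input signal.
```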
> See, here’s the thing about the internet: zero editorial standards.

Some people need to prove this every day.

Think of gain as a voltage multiplier. 26 dB of amp gain ~ 20x the input voltage.

16 dB ~ 6.3x the input voltage.

Of course, every device has a maximum output voltage, set by the voltage rails of its power supply. This is one of the limits on an amp's wattage.

To drive an amplifier with 16 dB of gain to the same level, you must provide 20/6.3 more volts, or roughly 3.2x the voltage (a 10 dB difference).

This will require your pre to put out more voltage than normal, and running the preamp at higher gain typically brings more noise along with it.

Another way to think about this is that 1 Watt into 8 Ohms is 2.83 Volts.

At 26 dB, the input required is 0.141 Volts to create a 1 watt signal.

At 16 dB, the input required is about 0.45 Volts to create a 1 watt signal.
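
A quick sketch of that arithmetic (1 W into 8 Ω, with the 26 dB and 16 dB gains above):

```python
import math

def db_to_ratio(db):
    """Voltage multiplier for a gain in dB: 10 ** (dB / 20)."""
    return 10 ** (db / 20)

one_watt_v = math.sqrt(1 * 8)   # 1 W into 8 ohms -> 2.83 V

for gain_db in (26, 16):
    ratio = db_to_ratio(gain_db)
    print(f"{gain_db} dB = {ratio:.1f}x -> {one_watt_v / ratio:.3f} V in for 1 W out")
```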

Best,
E