Gain and sensitivity are really two ways of talking about the same thing; one can be calculated from the other.
Sensitivity tells you how much input voltage it takes to drive the amp to full power. 600 mV is actually on the low side, as many amps take 1.5 V (1500 mV) or so to drive them to full power.
Gain tells you how much a device multiplies the input voltage. It is expressed either as a ratio of output to input (Vout divided by Vin) or on a logarithmic (dB) scale.
gain in dB = 20 log (Vout/Vin)
20dB of voltage gain is actually a factor of 10, not 100.
A preamp with 20dB of voltage gain will take a 0.2 V input signal and multiply it by 10 to make it 2 V if the volume is all the way up.
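If it helps to see the arithmetic spelled out, here is a small Python sketch of the dB conversion, using the 0.2 V and 2 V figures from the example above (just an illustration, not anything specific to your gear):

import math

def ratio_to_db(vout, vin):
    # voltage gain in dB from output and input voltages
    return 20 * math.log10(vout / vin)

def db_to_ratio(db):
    # voltage ratio corresponding to a gain in dB
    return 10 ** (db / 20)

print(ratio_to_db(2.0, 0.2))  # 20.0 dB, i.e. a factor of 10
print(db_to_ratio(20))        # 10.0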
Your amp takes 600 mV to produce 16 watts.
It takes about 11.3 V out of an amp to produce 16 W into 8 ohms (power = voltage squared divided by ohms, so voltage = square root of power times ohms = sqrt(16 x 8) = about 11.3 V).
Your amp therefore has a gain of 11.3 divided by 0.6 = 18.8, or expressed in dB: 20 log 18.8 = 25.5 dB.
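Here is the same calculation for your amp's numbers as a short Python sketch (16 W, 8 ohms, and 600 mV sensitivity are from the example; the printed values differ from the hand arithmetic only by rounding):

import math

power_w = 16.0        # rated output power
load_ohms = 8.0       # speaker impedance
sensitivity_v = 0.6   # input voltage needed for full power

vout = math.sqrt(power_w * load_ohms)   # P = V^2 / R  ->  V = sqrt(P * R)
ratio = vout / sensitivity_v            # voltage gain as a ratio
gain_db = 20 * math.log10(ratio)        # voltage gain in dB

print(round(vout, 1), round(ratio, 1), round(gain_db, 1))  # ~11.3 V, ~18.9, ~25.5 dB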
The confusion with the 2 V output ratings comes from sources and preamps being rated differently. The 2 V from a source is the maximum level it can produce. The 2 V spec from the preamp was measured with a specified input level and the volume all the way up; it is not the maximum the preamp can produce, which will usually be at least 10 V and often more.
If the source did hit a peak of 2 V and the volume was all the way up on a 20dB (times 10) preamp, it would try to put out 20 V. Since your amp only needs 0.6 V for full output, it would clip. For this reason most line-level preamps are actually used with their volume controls well below maximum and are actually attenuating (reducing) the level of the signal. It's not as bad as it seems, since the average output level of the source is much, much less than the maximum it is capable of.
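To put a rough number on that headroom, here is a Python sketch of the clipping scenario above (all figures are from the example: 2 V source peak, 20 dB preamp gain, 0.6 V amp sensitivity):

import math

source_peak_v = 2.0      # maximum output of the source
preamp_gain_db = 20.0    # preamp gain with volume all the way up
amp_sensitivity_v = 0.6  # input that drives the amp to full power

preamp_out_v = source_peak_v * 10 ** (preamp_gain_db / 20)     # 20 V at full volume
excess_db = 20 * math.log10(preamp_out_v / amp_sensitivity_v)  # how far past full power

print(round(preamp_out_v, 1))  # 20.0 V, far more than the amp needs
print(round(excess_db, 1))     # ~30.5 dB of attenuation needed at the volume control

In other words, on peaks the volume control has to sit roughly 30 dB below maximum just to keep the amp out of clipping, which is why the control spends most of its life attenuating.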