10-24-14: Czarivey
Manufacturers, instead of gain, will often provide you with input sensitivity and max output voltage. This can be worked around in 2 ways:
1. Contacting the manufacturer for the gain figure
2. Using the following formula to calculate gain:
a) Max_Voltage * 0.707 = RMS Voltage
b) 20Log(RMS Voltage / Input Voltage)

I'm not sure this is correct. For preamps and line stages, when input sensitivity is specified it usually represents (with the volume control set at max) the input voltage required to drive the output to an indicated level, such as 1 volt, that is representative of a realistic operating condition. It is not the input voltage required to drive the output to its specified maximum. The specified maximum is the highest output voltage the preamp or line stage is capable of providing without clipping or gross distortion, and would usually (and hopefully) be a far higher voltage than would ever occur under normal operating conditions.
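To put some rough numbers on the difference, here is a quick sketch of the arithmetic in Python. The specs used (v_sens, v_ref, v_max) are hypothetical values chosen purely for illustration, not taken from any particular preamp:

from math import log10

v_sens = 0.2   # hypothetical input sensitivity, volts
v_ref = 1.0    # reference output level the sensitivity spec is presumably tied to, volts
v_max = 8.0    # hypothetical specified maximum output voltage, volts

# Gain per the quoted method (sensitivity taken as the input that drives the
# output to its specified maximum, with that maximum assumed to be a peak value):
gain_quoted = 20 * log10((v_max * 0.707) / v_sens)   # about 29.0 dB

# Gain if the sensitivity is instead the input that drives the output to the
# 1 volt reference level, volume control at max:
gain_reference = 20 * log10(v_ref / v_sens)          # about 14.0 dB

print(round(gain_quoted, 1), round(gain_reference, 1))

With the same published numbers the two readings of the spec differ by roughly 15 dB in this example, which is why getting the gain figure directly from the manufacturer is the safer route.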
You may be thinking of power amps, where the specified input sensitivity usually represents the input voltage required to drive the amp to its rated maximum output power.
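In that case the gain can be estimated from the rated power, the load impedance the rating is specified into, and the sensitivity. A rough sketch, again using made-up numbers rather than any real amp's specs:

from math import log10, sqrt

p_rated = 100.0   # hypothetical rated output power, watts
r_load = 8.0      # load impedance the power rating is specified into, ohms
v_sens = 1.5      # hypothetical input sensitivity for rated power, volts

v_out = sqrt(p_rated * r_load)         # rms output voltage at rated power, about 28.3 V
gain_db = 20 * log10(v_out / v_sens)   # voltage gain, about 25.5 dB
print(round(gain_db, 1))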
Also, I believe that maximum output voltage specs for preamps and line stages are usually expressed in rms terms, not peak, so the 0.707 factor would usually not be applicable.
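If the spec is already in rms terms, multiplying it by 0.707 would understate the calculated gain by about 3 dB, since:

from math import log10
print(20 * log10(0.707))   # about -3.0 dB, the error introduced by applying 0.707 to a figure that is already rms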
Regards,
-- Al