OK, let me try to clarify a few things in my mind.
I assume every amplifier manufacturer gives the input sensitivity per watt, not at full power.
Let's say there are two amps.
Amp A:
Input sensitivity: 500 mV
Input impedance: 100 kOhm
Power output: 40 watts into 8 ohms
Amp B:
Input sensitivity: 100 mV
Input impedance: 30 kOhm
Power output: 100 watts into 8 ohms
So Amp A should be 500 mV x 40 watts = 20000 mV = 20 V.
Amp B should be 100 mV x 100 watts = 10000 mV = 10 V.
Some amp manufacturers write 1.5 V for input sensitivity in their specs. Does this still mean per watt? Because if so, the full-power figure gets crazy high. Say they mean 1.5 V per watt and the output is 80 WPC; then the input sensitivity at full power works out to 120 V!!!
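To make this concrete, here's a throwaway Python sketch of the multiplication I'm doing above. The per-watt reading of "input sensitivity" is my assumption, not anything stated on these spec sheets:

```python
# Sketch of my per-watt assumption: if sensitivity were quoted in mV per watt,
# scale it by the rated power to get the implied input voltage at full power.
def full_power_input_mv(sensitivity_mv_per_watt: float, rated_watts: float) -> float:
    """Input voltage (in mV) implied at full power under the per-watt reading."""
    return sensitivity_mv_per_watt * rated_watts

print(full_power_input_mv(500, 40))    # Amp A: 20000 mV = 20 V
print(full_power_input_mv(100, 100))   # Amp B: 10000 mV = 10 V
print(full_power_input_mv(1500, 80))   # the 1.5 V amp at 80 WPC: 120000 mV = 120 V
```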
One last question:
I have a DAC with 3 V output, and my music streamer, which is connected to the DAC via coaxial, has 2 V output. I don't know if the streamer's numbers are important here or not??
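Putting that last question in numbers, a quick sketch. Two assumptions of mine here: the streamer-to-DAC coaxial link is digital (so the streamer's 2 V analog spec may not apply on that leg), and "sensitivity" is simply the input voltage the amp wants to see:

```python
# Quick sanity check on levels, under the assumptions stated above.
def source_covers_sensitivity(source_v: float, sensitivity_v: float) -> bool:
    """True if the source's output swing reaches the amp's sensitivity voltage."""
    return source_v >= sensitivity_v

print(source_covers_sensitivity(3.0, 0.5))  # DAC (3 V) vs Amp A (500 mV) -> True
print(source_covers_sensitivity(3.0, 1.5))  # DAC (3 V) vs the 1.5 V amp -> True
```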