The smallest step-change in amplitude that can be detected by ear is about 0.3 dB for a pure tone; in more realistic situations it is 0.5 to 1.0 dB, which is roughly a 10% change in amplitude (Harris, J.D.). At medium volume the voltage amplitude at the output of the amplifier is approximately 10 volts, so the smallest audible difference corresponds to the output voltage changing by about 1 volt. An error of that size would be hard to miss even with an ordinary voltmeter, yet Self and his colleagues made far more accurate measurements, including ones taken directly on the music signal using the Baxandall subtraction technique, and they found no error even at this level.
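As a sanity check of the figures above, the standard decibel-to-ratio relations can be worked through in a few lines. This is only a sketch: the 10 V reference and the 0.3 to 1.0 dB thresholds are the numbers quoted above, and everything else is just the usual dB definitions.

```python
def db_to_amplitude_ratio(db):
    """Voltage (amplitude) ratio for a level change of `db` decibels."""
    return 10 ** (db / 20)

def db_to_power_ratio(db):
    """Power ratio for a level change of `db` decibels."""
    return 10 ** (db / 10)

reference_volts = 10.0  # output amplitude at "medium volume" quoted above

for db in (0.3, 0.5, 1.0):
    v_ratio = db_to_amplitude_ratio(db)
    p_ratio = db_to_power_ratio(db)
    delta_v = reference_volts * (v_ratio - 1)
    print(f"{db:.1f} dB: amplitude +{(v_ratio - 1) * 100:.1f}% "
          f"(about {delta_v:.2f} V on a {reference_volts:.0f} V signal), "
          f"power +{(p_ratio - 1) * 100:.1f}%")
```

On these definitions a 1.0 dB step is closer to a 12% amplitude change (about 1.2 V on a 10 V signal), so the 10% and 1 V figures in the passage are round approximations.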
The math is flawed, so why would I believe anything else in the post?
A 1 volt change from a 10 volt average represents a change of 21% in power, not 10% as stated.
10 V x 10 V = 100 V²; 100 V² / 10 ohms = 10 watts
11 V x 11 V = 121 V²; 121 V² / 10 ohms = 12.1 watts
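For completeness, here is the same 1 V step expressed both as an amplitude change and as a power change. This is a minimal sketch; the 10 ohm load and the 10 V / 11 V figures are the ones used in the calculation above.

```python
import math

load_ohms = 10.0                     # load assumed in the calculation above
v_before, v_after = 10.0, 11.0       # the 1 V step on a 10 V signal

p_before = v_before ** 2 / load_ohms   # 10 W
p_after = v_after ** 2 / load_ohms     # 12.1 W

amp_change = (v_after / v_before - 1) * 100   # amplitude change in percent
pow_change = (p_after / p_before - 1) * 100   # power change in percent
level_db = 20 * math.log10(v_after / v_before)

print(f"amplitude change: +{amp_change:.0f}%")
print(f"power change:     +{pow_change:.0f}%")
print(f"level change:     {level_db:.2f} dB "
      f"(same as 10*log10({p_after / p_before:.2f}) = "
      f"{10 * math.log10(p_after / p_before):.2f} dB)")
```

Both percentages describe the same step: 20·log10(11/10) and 10·log10(12.1/10) each come to about 0.83 dB, with roughly 10% being the amplitude change and roughly 21% the power change.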