Signal-to-Noise Ratio and Amplifier Distortion


Hi,

Another dumb question.

My understanding is that SNR is defined as the ratio of signal power to noise power, often expressed in decibels. A ratio higher than 1:1 (greater than 0 dB) indicates more signal than noise.
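In symbols, that is SNR(dB) = 10 * log10(P_signal / P_noise). A minimal Python sketch of the arithmetic, with made-up power figures purely for illustration:

import math

p_signal = 2.0    # hypothetical signal power, in watts
p_noise = 0.02    # hypothetical noise power, 100x smaller

snr_db = 10 * math.log10(p_signal / p_noise)
print(f"SNR = {snr_db:.1f} dB")   # prints: SNR = 20.0 dB

Every factor of 10 in the power ratio adds another 10 dB, so 100:1 is 20 dB, 1000:1 is 30 dB, and so on.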

My understanding from many posters on this site is that a lower-powered amplifier has less distortion and should sound better, all other things being equal.

Assuming the above two comments are true, how does amplifier distortion affect the SNR?

Is it safe to say that the better the SNR of the preamp and source, the more you will hear the distortion from a higher-powered amp?

Or, asked another way, will a lower-SNR component mask a high-powered amp's distortion?

Or is it that, because the amp is the last item in the chain before the speakers, it will distort the signal it receives regardless of that signal's quality?

Now that I have tried to state my confusion in writing, it seems to me that my last thought is probably the correct answer.

But I hope someone who knows more than I do can straighten out my thinking.

Thanks for listening,

Dsper


dsper
Yes, noise and distortion are two different things.

The question, in audio applications, is at what level noise actually becomes audible to anyone other than a few golden ears.
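A rough Python sketch of the distinction (every number here is made up for illustration): noise is random and sits at a roughly constant level whether or not music is playing, while distortion is created from the signal itself and shows up as harmonics of it.

import numpy as np

fs = 48_000                                # assumed sample rate, Hz
t = np.arange(fs) / fs                     # one second of time samples
tone = np.sin(2 * np.pi * 1000 * t)        # clean 1 kHz test tone

noise = 1e-3 * np.random.randn(fs)         # broadband hiss, uncorrelated with the tone
noisy = tone + noise                       # tone riding above a constant noise floor

clipped = np.clip(1.2 * tone, -1.0, 1.0)   # mild clipping: distortion made *from* the tone

snr_db = 10 * np.log10(np.mean(tone**2) / np.mean(noise**2))
print(f"SNR of the noisy tone: {snr_db:.0f} dB")   # roughly 57 dB with these numbers

# An FFT of `clipped` would show energy at 3 kHz, 5 kHz, ... (odd harmonics
# of 1 kHz) and essentially no random floor -- that residue is distortion.
# The `noisy` spectrum shows the opposite: a flat floor and no harmonics.

That is also why the two behave differently in practice: the noise floor stays put and can hide under the music, while distortion grows with the signal, so it is worst exactly when the amp is working hardest.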
Hi All,

Thanks for the discussion.

Sounds like I have a lot of listening to do with my new speakers.

Dsper
Typically, with noise, you don't know what you've got 'til it's gone.

Isn't that the truth +1