Signal to Noise Ratio and Amplifier Distortion


Hi,

Another dumb question.

My understanding is that SNR is defined as the ratio of signal power to the noise power, often expressed in decibels. A ratio higher than 1:1 (greater than 0 dB) indicates more signal than noise.
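That definition is straightforward to put in code. A minimal sketch (the function name `snr_db` is mine, not from any post here): SNR in decibels is ten times the base-10 log of the power ratio.

```python
import math

def snr_db(signal_power, noise_power):
    """Signal-to-noise ratio in decibels, given two power values in the same units."""
    return 10 * math.log10(signal_power / noise_power)

# A signal 1000x more powerful than the noise gives 30 dB SNR.
# Equal signal and noise power (1:1) gives 0 dB, the break-even point mentioned above.
print(snr_db(1000.0, 1.0))
print(snr_db(1.0, 1.0))
```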

My understanding from many posters on this site is that a lower powered amplifier has less distortion and should sound better, all other things being equal.

Assuming the above two comments are true, how does amplifier distortion affect the SNR?

Is it safe to say that the better the SNR of a preamp and source, the more you will hear the distortion from a higher powered amp?

Or asked another way, will a lower SNR component mask high power amp distortion?

Or is it, because the amp is the last item in the chain before the speakers, it will distort the signal it receives regardless of the quality of that signal?

As I have tried to state my confusion in writing, it seems to me that my last thought is probably the correct answer.

But I hope someone who knows more than me can straighten out my thinking.

Thanks for listening,

Dsper


dsper
Here’s a site that should help you find your answers/definitions: http://www.digitizationguidelines.gov/results.php?gltext=Distortion&x=0&y=0 Just search for the likes of IMD, THD, SNR, etc. (you have to spell them out, in caps, or search alphabetically). They didn’t cover crossover/notch distortion on that site, but this one does: https://www.aikenamps.com/index.php/what-is-crossover-distortion The potential/rated output power of an amplifier has nothing to do with its distortion levels, unless it’s exceeded. Factors like circuit design, the parts chosen by the designer, impedance mismatches, and maladjusted or malfunctioning parts (among other variables) have everything to do with it.
You’re right about SNR, that’s what it is all right, signal to noise. Try and keep in mind that a lot of these measurements go back to the early days when these things were still real problems people were trying to sort out. Back in the day, static and hiss were a large part of what you heard from the old-timey radio. There was also a time, and it was as recently as the 1970s, when some people would claim 10, 20, or 30 watts from what was really, by today’s standards, a 2 or 5 watt amp. If that. If they could spike a 20 watt peak for a millisecond, that was "peak power" - and they weren’t exactly concerned with how clean and undistorted it was, either.

So here we are today, 50 years later, and it’s hard to find anything as bad as what passed for quite good back then. These measurements you’re talking about are the most gross, simplistic, and by now meaningless of measurements. So it’s safe to say you can disregard them. Sorry. There’s a lot of measurebators here. They will disagree. Oh well.
Is it safe to say that the better the SNR of a preamp and source, the more you will hear the distortion from a higher powered amp?

No, not at all. Think of SNR as background noise. Think of distortion as the signal being messed up. Imagine you are talking on the phone and you can tell it’s a very clear phone call. Then you roll down the window and there’s a lot of wind noise. The SNR just went down. But you can still easily hear that the phone call, the signal, is low distortion. Two different things.

Or asked another way, will a lower SNR component mask high power amp distortion?

No. If anything it will make it easier to hear. The window example again: less noise, easier to hear. These are very, very gross examples. In audio we are typically looking at very low noise, meaning very high SNR. Especially with digital, amplifiers, and other electronics, where it’s very common for the SNR to be way up in the high 90s or more. Compare that to a lot of analog, where 65dB is pretty good. But keep in mind that dB, decibels, are logarithmic: 3dB is double the power, 10dB is ten times. Same as with speaker sensitivity, where a speaker 3dB more efficient needs half the power, only with SNR the halving works the other way, pushing the noise down. So between 65dB and 95dB SNR, yes, there’s a huge difference, but even at 65dB the noise is still very, very low relative to the signal. At 95dB the noise is so low you don’t hear anything without your ear to the tweeter, while at 65dB you can hear it from a few feet away. But in either case, once you start playing music, the noise level relative to even the quietest passages is inaudible. This is why it’s safe to ignore SNR.
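The dB arithmetic in that paragraph is easy to check. A quick sketch (the helper name `db_to_power_ratio` is my own): converting a dB figure back to a linear power ratio shows why both 65dB and 95dB put the noise far below the signal.

```python
def db_to_power_ratio(db):
    # Decibels are logarithmic: power ratio = 10 ** (dB / 10).
    return 10 ** (db / 10)

print(db_to_power_ratio(3))   # roughly 2: 3 dB is about double the power
print(db_to_power_ratio(10))  # exactly 10x
print(db_to_power_ratio(65))  # signal ~3.2 million times the noise power
print(db_to_power_ratio(95))  # signal ~3.2 billion times the noise power
```

Either way the noise is millions of times weaker than the signal, which is the poster’s point about both figures being very quiet in absolute terms.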

Or is it, because the amp is the last item in the chain before the speakers, it will distort the signal it receives regardless of the quality of that signal?

Well, it is true the amp will distort the signal. But then everything distorts the signal. Not just the last thing but the first, and I mean the first, going all the way back to the wall. Everything distorts, adds noise, interferes with, degrades, and harms the signal. Everything. That’s why none of these measurements matters. You can measure everything to the nth degree and put it all on a spreadsheet, but how are you going to make sense of it? You can’t. Impossible. So what you do instead is listen. You can tell in an instant.

Go and listen. You will see.
NOTE: It’s much better to use the search feature, in the top left of that first site I mentioned.
What do you consider a low-watt amp? One of the most transparent amps, the Benchmark AHB2, running at full power with both channels driven, has an SNR of about 130dB. So you can play at 120dB SPL and your noise floor will be at -10dB SPL, in other words silence. In bridged mode it can put out 500 watts into 4 ohms with 0.00026% distortion. At least with this amp, any distortion you can hear comes from other sources.
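The noise-floor claim above is just a subtraction, worth showing explicitly. A minimal sketch under the assumption stated in the post (playback level and SNR both in dB; the function name is mine):

```python
def noise_floor_spl(playback_spl, snr_db):
    """Noise floor in dB SPL: playback level minus the amplifier's SNR, both in dB."""
    return playback_spl - snr_db

# 120 dB SPL playback with a 130 dB SNR amp puts the noise at -10 dB SPL,
# below the commonly cited 0 dB SPL threshold of hearing.
print(noise_floor_spl(120, 130))
```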

will a lower SNR component mask high power amp distortion?
No.

Is it safe to say that the better the SNR of a preamp and source, the more you will hear the distortion from a higher powered amp?
Yes.