Determining the exact power being sent to your speakers


How does one go about determining the exact amount of power being delivered by the amplifier to the speakers? Let's say the amp delivers 160 W/channel at 4 ohms (or so the reading materials state, anyway), yet the speaker specs show 120 W maximum. My current integrated amp shows the volume attenuation (in dB) in 0.5 dB steps, ranging from -100 (min) up to 0.0 (max).

All that said, when should I be concerned about pushing too much power to the speakers, and how can I determine the "don't turn it up past this number on the display or you will damage the speakers" point?
vineman55

Use a 12 V car brake lamp in series with the tweeter to protect it; parallel the lamps if you want to pass more power.

What rrog said. I started a thread about that exact same thing with an old Realistic power meter and was so surprised I was only using a watt or two listening at a decent volume.

The first thing I'd suggest you do is buy a sound level meter. Ones for home use are relatively inexpensive ($20 to $30) or you can download an app for your smartphone for even less, or free. This will give you a ballpark idea of just how loud you listen. (One's person's "loud" may be someone else's "medium", etc.) For example, my "loud" these days is around 85 dB average. 

Then, look up the sensitivity of your speakers. A lot of modern speakers have ratings in the upper 80s -- for example, my Ohms are 88 dB -- meaning it takes one watt of power to produce that sound level at one meter. Small bookshelf speakers tend to be a bit less sensitive, while many larger speakers, like horn systems, tend to be much more sensitive -- they will play much louder on that one watt of power.

Even if one watt of power gives more volume than your average listening level, you still need more power than that from your amp. Music is dynamic, with loud, short-duration peaks above the average playback level -- such as a drum strike or an orchestra crescendo.
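The headroom point above can be sketched as a quick calculation. This is my own illustration (the function name and the +15 dB peak figure are assumptions, not from the posts): given a speaker's sensitivity and a target SPL at 1 meter, the required power grows by 10× for every 10 dB.

```python
import math

def watts_needed(target_spl_db, sensitivity_db, ref_watts=1.0):
    """Power (W) needed to reach target SPL at 1 m,
    given speaker sensitivity in dB @ 1 W / 1 m."""
    return ref_watts * 10 ** ((target_spl_db - sensitivity_db) / 10)

# Example: an 88 dB-sensitive speaker at an 85 dB average level
avg_power = watts_needed(85, 88)        # ~0.5 W average
peak_power = watts_needed(85 + 15, 88)  # ~16 W for +15 dB transient peaks
print(round(avg_power, 2), round(peak_power, 1))
```

So even though the average level takes only half a watt, short peaks can demand a couple of dozen watts, which is why the extra amplifier power matters.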

Unless you are a head-banger who likes your music at an average volume of 100 dB or more, there usually isn't much concern about having too much power, though some do get carried away on this issue. I happen to be in the camp that is more concerned about the sound quality of an amp than whether it has a zillion watts.

An amplifier's output power is never constant. It is 0 W during silence and only reaches maximum wattage at the loudest peaks.

A speaker’s power rating is based on maximum output. If a speaker is rated for 100W maximum, you can safely use a 300W amplifier with it so long as you don’t play it too loud.

You can calculate "too loud" by using the power-to-dB conversion. That is, if a speaker is rated for 100 W and 90 dB sensitivity:

dB = 10 × log10(100) = 20.

That is, at 100 W input the speaker will play at its rated sensitivity + 20 dB. So, if your speaker is 90 dB at 1 watt, it will be at 110 dB at maximum. Just don't play your speaker at more than 110 dB (at 1 meter or yard) and you'll be fine with any size amp.

https://www.rapidtables.com/convert/electric/db-converter.html
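The conversion described above can be written out in a few lines of Python (the function names here are my own, not from the linked converter):

```python
import math

def power_to_db(watts, ref_watts=1.0):
    """Power ratio expressed in decibels relative to a reference power."""
    return 10 * math.log10(watts / ref_watts)

def max_spl(sensitivity_db, max_watts):
    """Rough maximum SPL at 1 m: sensitivity (dB @ 1 W) plus the power gain in dB."""
    return sensitivity_db + power_to_db(max_watts)

print(power_to_db(100))   # 100 W over 1 W is a 20 dB gain
print(max_spl(90, 100))   # 90 dB speaker at 100 W -> 110 dB
```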

Vrms × Irms = Pavg. For a sine wave, this average power equals 0.5 of the peak power. Pavg represents the dissipated heat, whereas the RMS value of the power waveform itself would be about 0.61 of the peak power for a sine wave, and it does not represent anything physically meaningful.
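Those two constants can be checked numerically. This is a quick sketch of my own: sampling one period of a unit-peak sine, the instantaneous power is sin², its mean comes out to 0.5 of peak, and the RMS of that power waveform comes out to sqrt(3/8) ≈ 0.612 of peak.

```python
import math

N = 100_000  # samples over one full period
# Instantaneous power of a unit-amplitude sine into a 1-ohm load: sin^2(t), peak = 1
p = [math.sin(2 * math.pi * i / N) ** 2 for i in range(N)]

avg_power = sum(p) / N                            # mean of sin^2 -> 0.5
rms_of_power = math.sqrt(sum(x * x for x in p) / N)  # sqrt(mean of sin^4) -> sqrt(3/8)
print(round(avg_power, 3), round(rms_of_power, 3))
```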