cable break in


Had a friend ask me whether, after you break in/burn in your cables, they are more or less conductive. I would have to say less conductive, but I'm not sure why. Does anyone have a good answer?
hemidakota
Nsgarch, I don't know how you calculate current to a speaker for a known power consumption, but your example using your Levinson is incorrect.
Your example;
"An example would be my Levinson amp which will provide 400W/ch into my 4 ohm (nominal) electrostats, but at the loudest listening levels I can stand, it's only drawing 400W from the wall (or 3.3A) and it's only putting out around 150W rms of audio power, which at its 67V (26dB) gain, is only around 2.2A to the speakers (vs. 3.3A from the wall.)"

If 150 watts are being fed to a 4 ohm speaker, then I² = 150/4 = 37.5. Therefore I (amperage) = √37.5 ≈ 6.12 amps.
Clearly that is higher than the 3.3A pulled from the wall.

At any rate, the amperage to the speaker will always be higher than the amperage from the wall to the amp: for the same wattage, the voltage at the speaker is lower than the wall voltage, so the amperage must be higher to deliver that wattage.
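To make the arithmetic explicit, here's a quick Python check of the numbers above (150 W into a 4 ohm speaker, and the 400 W / 120 V wall draw quoted from the example) — just a sketch, not anything from a spec sheet:

import math

# current into the speaker from power and impedance: P = I^2 * R, so I = sqrt(P/R)
P_speaker = 150.0                             # watts delivered to the speaker
R_speaker = 4.0                               # nominal impedance in ohms
I_speaker = math.sqrt(P_speaker / R_speaker)  # ~6.12 A

# current drawn from the wall: P = V * I, so I = P/V
V_wall = 120.0                                # wall voltage
P_wall = 400.0                                # wall draw quoted in the example
I_wall = P_wall / V_wall                      # ~3.33 A

print(round(I_speaker, 2), round(I_wall, 2))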
inpep: You are using the formula A = √(W/R).
I am using the formula A = W/V. There is also another formula (Ohm's law): A = V/R.

They should all yield the same result, so perhaps we're just plugging in the wrong numbers? Additionally, there are extra factors to consider with AC (phase angle/power factor), although I'm pretty sure the output of an amp has no phase angle.
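If it helps, here's a small sketch showing that the three formulas do agree, as long as V, R, and W all describe the same load; the numbers are just the 150 W / 4 ohm case from above, not measurements:

import math

R = 4.0                 # load impedance in ohms
W = 150.0               # watts delivered to that load
V = math.sqrt(W * R)    # voltage across the load, ~24.5 V

A1 = math.sqrt(W / R)   # A = sqrt(W/R)
A2 = W / V              # A = W/V
A3 = V / R              # A = V/R (Ohm's law)

print(A1, A2, A3)       # all ~6.12 A; they diverge only if V isn't the load voltage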

There's a neat formula wheel at:

http://www.sengpielaudio.com/calculator-ohm.htm

and on the following page.
Nsgarch, I am plugging in the correct numbers. I don't think the V you are using is correct (I don't know where you get 40 volts). Why use V at all, when we have R and P and a formula to calculate I?

About 24.5 V (√(150 × 4)) is the correct voltage at the amp's output to give 150 watts into 4 ohms.

As I have stated before, the amperage from the amp to the speaker will always be higher than the amperage from the wall to the amp, since for the same wattage the voltage applied to the speaker will always be lower than the wall voltage (120 V). I don't know of any speaker that requires more than 120 volts to work!
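Here's a rough check of that point across a few power levels: for the same output power, a lower voltage means a higher current (I = W/V), and the speaker-side voltage is always far below the 120 V wall voltage. The wall figures here assume a perfectly efficient amp, so the real draw would be somewhat higher:

R_speaker = 4.0
V_wall = 120.0
for watts in (10, 50, 150, 400):
    V_spk = (watts * R_speaker) ** 0.5   # speaker voltage for that power
    I_spk = V_spk / R_speaker            # current into the speaker
    I_wall = watts / V_wall              # wall current, ignoring amp losses
    print(watts, round(V_spk, 1), round(I_spk, 2), round(I_wall, 2))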

With respect, Bob P.
Bob, I think what you're forgetting is that the voltage output of a given amp is a constant, just like the voltage in a wall socket. The exact amount of voltage is a function of the gain multiplier the amp is designed for, which for most amps (regardless of output capacity in watts) is about 25 dB +/-, which translates into about 60 V or more.
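For reference, converting gain in dB to a voltage multiplier is straightforward. A rough sketch follows; the 25 and 26 dB figures are the ones quoted in this thread, but the input voltages are assumed preamp levels, not specs, so the output voltage depends on both the gain and the input level:

# voltage gain in dB -> linear multiplier: gain = 10 ** (dB / 20)
for gain_db, v_in in ((25.0, 3.4), (26.0, 2.0)):      # assumed input levels, for illustration
    gain = 10 ** (gain_db / 20.0)                     # ~17.8x and ~20x respectively
    print(gain_db, round(gain, 1), round(gain * v_in, 1))   # ~60 V and ~40 V out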

Amp output in watts is determined by the strength of the input signal as you turn the volume on the preamp up or down. And the current (which varies with the amount of watts the amp is putting out at different volume levels) is a function of the impedance of the load being driven.

So what I'm trying to say is that it's the watts an amp puts out that change with the volume. And since the load (usually) and the voltage are constant, the only other variable is the current. That's why little amps run out of gas (clip) when trying to drive current-hungry speakers (like big multiple-driver boxes or stats with low impedance): they can't deliver the watts/current needed to produce decent sound pressure levels.

So thinking of an amp as a great big "equal sign" with Ohm's law on each side of a balanced equation is not how things actually work, and there's also amplifier inefficiency to take into account.


Nsgarch, a speaker's volume output varies with voltage, and the power consumed is a function of the voltage at the speaker and the current it draws at that voltage.
The amplifier takes in a signal of a certain voltage and increases that voltage to a level the speaker can respond to. Of course the voltage at the output of the amplifier varies and isn't constant (otherwise the speaker would not get louder if the voltage did not increase), and the speaker draws whatever amps it needs at its impedance to produce sound at the given level. So a speaker with a 2.83 V / 8 ohm / 90 dB sensitivity figure will need 10 watts (about 8.9 V) for a 100 dB output. The amp will be supplying about 1.1 amps to the speaker while itself pulling only around 0.12 amps from the wall, still lower than the current going to the speaker.
Your figure of 40 V is what the amp puts out at full power, at whatever wattage (or current) it is capable of supplying into the speakers at the speakers' impedance. If that power is 400 watts into 4 ohms, then the current is 10 amps. The current drawn from the wall at 400 watts output is, of course, somewhat higher than 3.33 amps, but still lower than the current going to the speakers.
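As a rough check of both examples above (the sensitivity case and the full-power case), here is the arithmetic in Python. The wall currents assume 120 V and ignore the amp's own losses, so the real draw would be a bit higher:

import math

# sensitivity case: 2.83 V / 8 ohm / 90 dB speaker driven to 100 dB
watts_100db = 10 ** ((100 - 90) / 10.0)   # +10 dB SPL -> 10x power -> 10 W
V_spk = math.sqrt(watts_100db * 8.0)      # ~8.9 V at the speaker
I_spk = V_spk / 8.0                       # ~1.1 A into the speaker
I_wall = watts_100db / 120.0              # ~0.08 A from the wall, before losses

# full-power case: 400 W into 4 ohms
V_full = math.sqrt(400.0 * 4.0)           # 40 V
I_full = V_full / 4.0                     # 10 A into the speaker
I_wall_full = 400.0 / 120.0               # ~3.33 A from the wall, before losses

print(round(I_spk, 2), round(I_wall, 2), round(I_full, 1), round(I_wall_full, 2))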

With respect, Bob P.