Are integrated amps technically better than separates?


I'm assuming we're talking about the same class of amplifier and that the integrated has the features you want. I'm thinking the integrated could actually be an improvement over separates due to being a more "direct" connection. Setting aside the flexibility factor of separates, is my line of thinking correct?
aberyclark
Hi @atmasphere ,

Each amplification stage, capacitor, and cable adds distortion and noise.
All those gain stages are in an integrated amp too. Usually, though, you have a better chance of lower noise with separates, since crosstalk and intermodulation from the other channel won't exist.
But the preamp has to have an output stage that drives the interconnect and the relatively low input impedance (compared to a tube input stage inside the amp) of the power amplifier. The interstage driver in an integrated amplifier is much easier to design and doesn't need feedback, an output transformer, a big coupling capacitor, or a sophisticated DC-cancelling circuit.
The first statement is true. The second needs more explanation; as far as I can see it's false. Regarding the first statement, over the years I've found that a lot of the improvements I've been able to make in our gear have related to power supplies. Arguably that's one of the trickier things that goes on in an integrated amp; IME it outweighs many of the other circuit topology issues! I totally get the connectivity issue; that is an advantage. But in most cases, it's not *enough* of an advantage.
Agree with "erniesch"; his statement is 100% correct: "Never mind that just inside the wall is the cheapest copper lines the contractor could buy." Whether the last two feet of power cable are 10 AWG or 18 AWG is a drop in the ocean if the power lines inside the house are bad.
@audition__audio: if your power delivery is bad, AC outlet and the wiring before and after it included, then comparing separates to an integrated is chasing one of the smaller items on the improvement Pareto chart.
OK- if that is true, you have to explain how I could measure a voltage drop across a power cord and hear the difference in my system, when my house at the time still had knob-and-tube wiring.


All things electrical have to obey Ohm's Law, and it's a simple fact that solid-core copper does better than multistranded wire, which is why it's used in building wiring. Keep in mind also that in most houses there is a wiring code, and while there is cheap wire, it still has to meet that code.


Now, the measurements I made point to the idea that the greater the current draw, the greater the voltage drop across the power cord; this suggests that it's more of a problem with equipment that has a higher current draw and far less of an issue with equipment that does not. Ohm's Law again.
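To put a rough sketch on that Ohm's Law point: a quick calculation in Python. The wire resistances below are approximate published copper values, and the currents and cord length are made-up illustrative figures, not measurements from this thread.

```python
# Voltage drop across a power cord: V = I * R (Ohm's Law).
# Approximate copper resistance: ~1 ohm/1000 ft for 10 AWG,
# ~6.4 ohms/1000 ft for 18 AWG. Currents are illustrative only.

OHMS_PER_FOOT = {10: 1.0 / 1000, 18: 6.4 / 1000}  # AWG -> ohms per foot

def cord_drop(awg, cord_feet, amps):
    """Round-trip voltage drop across a cord of the given gauge."""
    resistance = OHMS_PER_FOOT[awg] * cord_feet * 2  # hot + neutral conductors
    return amps * resistance

# A 6 ft 18 AWG cord feeding a power amp drawing 5 A drops roughly ten
# times the voltage it would feeding a 0.5 A source component:
print(cord_drop(18, 6, 5.0))   # ~0.38 V
print(cord_drop(18, 6, 0.5))   # ~0.04 V
print(cord_drop(10, 6, 5.0))   # ~0.06 V with heavier 10 AWG wire
```

The drop scales linearly with current, which is why the same cord can be inaudible on a source component and measurable on a power amp.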

But one other area that should be mentioned is the high-frequency response of the power cord, another thing that solid core does really well. The issue here is that in most equipment there is a power transformer, rectifiers and filter caps. The rectifiers can only turn on (commutate) when the cap voltage is less than that from the transformer; most of the time this means conduction only occurs at the very top of the AC waveform. So the current draw has to happen over a fairly short period of time, even less than a millisecond. If the bandwidth does not exist in the cable, the power supply will not charge properly; it will round off the charging pulse. Again, Ohm's Law.
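That narrow conduction window can be sketched numerically. This is a toy model of an idealized capacitor-input supply (full-wave rectified, zero source impedance); the component values are assumptions for illustration, not from any particular amplifier.

```python
import math

def conduction_fraction(C=10e-3, R=100.0, f=60.0, cycles=10, steps_per_cycle=2000):
    """Fraction of time the rectifiers conduct in an idealized
    capacitor-input supply (full-wave, zero source impedance)."""
    dt = 1.0 / (f * steps_per_cycle)
    vcap = 0.0
    conducting = 0
    n = cycles * steps_per_cycle
    for i in range(n):
        vin = abs(math.sin(2 * math.pi * f * i * dt))  # rectified AC, 1 V peak
        if vin >= vcap:
            vcap = vin                       # diodes on: cap tracks the source
            conducting += 1
        else:
            vcap *= math.exp(-dt / (R * C))  # diodes off: cap feeds the load
    return conducting / n

# With a big filter cap the diodes conduct only in brief pulses at the
# crest of each half-cycle; shrink the cap and the pulses widen:
print(conduction_fraction(C=10e-3))  # small fraction of the cycle
print(conduction_fraction(C=1e-3))   # noticeably larger
```

With the default values the conduction window works out to well under a millisecond per half-cycle, which is the sub-millisecond current pulse described above.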


None of this says the cable has to be expensive. It does say that the cable has to have the bandwidth and the current capacity, and good enough connections at either end that those ends don't warm up over time; if they do, you know the connections are robbing power from the system.
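The warm-connector check is also just Ohm's Law: the power dissipated in a contact is P = I²R. The contact resistances below are hypothetical round numbers, chosen only to show the scale of the effect.

```python
def contact_heat_watts(amps, contact_ohms):
    """Power burned in a cord-end connection: P = I^2 * R."""
    return amps ** 2 * contact_ohms

# A clean, tight contact (a few milliohms) stays cold; a loose or
# oxidized one (tenths of an ohm) becomes a small heater at
# amplifier-level current draw:
print(contact_heat_watts(5.0, 0.005))  # 0.125 W -- negligible
print(contact_heat_watts(5.0, 0.2))    # 5.0 W -- enough to warm the plug
```

Because the dissipation goes as the square of the current, a marginal connection that is harmless on a source component can get noticeably warm on a power amp.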


This is all measurable and audible. In the case of an amplifier, simply measure the full power output, the output impedance and the distortion. You'll find they vary with the AC input voltage. This should be totally non-controversial; I think the only reason it is controversial is a knee-jerk reaction to the idea that a power cord can make a difference, combined with a failure to move one's hand and actually make the measurement!