Agree with "erniesch", his statement is 100% correct: "Never mind that just inside the wall is the cheapest copper lines the contractor could buy." The last two feet of power cable, whether 10 AWG or 18 AWG, is a drop in the ocean if the power lines inside the house are bad.
OK- if that is true, you have to explain how I could measure a voltage drop across a power cord and hear the difference in my system, when my house at the time still had knob and tube wiring.
All things electrical have to obey Ohm's Law, and it's a simple fact that solid core copper does better than multi-stranded wire, which is why it's used in building wiring. Also keep in mind that in most areas there is a wiring code, and while there is cheap wire, it still has to meet that code.
Now the measurements I made point to the idea that the greater the current draw, the greater the voltage drop across the power cord; this suggests that it's more of a problem with equipment that has a higher current draw and far less of an issue with equipment that does not. Ohm's Law again.
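To put rough numbers on that, here is a minimal Ohm's Law sketch in Python; the cord resistances and current draws are assumptions for illustration, not measurements of any particular cord:

    # Voltage drop across a power cord: V = I * R.
    # Resistance and current values below are illustrative assumptions.
    def cord_drop(current_a, resistance_ohm):
        return current_a * resistance_ohm

    thin_cord = 0.10    # ohms round trip, assumed for a long 18 AWG cord
    heavy_cord = 0.01   # ohms round trip, assumed for a short 10 AWG cord

    for amps in (0.5, 2.0, 10.0):   # source gear, mid draw, power amp on peaks
        print(f"{amps:4.1f} A: {cord_drop(amps, thin_cord):.2f} V lost (thin cord), "
              f"{cord_drop(amps, heavy_cord):.3f} V lost (heavy cord)")

At half an amp the difference between the two cords is millivolts; at ten amps of peak draw the thin cord is giving up a full volt, which is why the effect shows up on high-current equipment first.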
But one other area that should be mentioned is the high frequency response of the power cord- another thing that solid core does really well. The issue here is that in most equipment there is a power transformer, rectifiers and filter caps. The rectifiers can only turn on (commutate) when the cap voltage is less than that from the transformer; most of the time this means that conduction only occurs at the very top of the AC waveform. So the current draw has to happen over a fairly short period of time, even less than a millisecond. If the bandwidth does not exist in the cable, the power supply will not charge properly- it will round off the charging pulse. Again, Ohm's Law.
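As a rough sanity check on that "less than a millisecond" figure, here is a small sketch assuming 60 Hz mains and 5% ripple on the filter caps (both numbers are assumptions, not measurements of any real supply):

    # Width of the charging pulse in a capacitor-input supply.
    # Assumes 60 Hz mains and 5% ripple; both are illustrative numbers.
    import math

    f_mains = 60.0      # Hz
    ripple = 0.05       # cap sags 5% below the peak between pulses

    # The rectifier turns on when the rectified sine climbs back above the
    # cap voltage (sin(theta) = 1 - ripple) and cuts off near the crest,
    # once the cap has been pulled back up to the peak.
    theta_on = math.asin(1.0 - ripple)
    pulse_width_s = (math.pi / 2 - theta_on) / (2 * math.pi * f_mains)

    half_cycle_ms = 1e3 / (2 * f_mains)
    print(f"Charging pulse: about {pulse_width_s * 1e3:.2f} ms "
          f"of each {half_cycle_ms:.2f} ms half-cycle")

That works out to under a millisecond per half-cycle, and a current pulse that narrow carries energy at many multiples of the mains frequency, which is exactly where the cord's bandwidth and the quality of its terminations come into play.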
None of this says the cable has to be expensive. It does say that the cable has to have the bandwidth and the current capacity, and good enough connections at either end that they don't warm up over time- if they do, you know the connections are robbing power from the system.
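To see the scale of what a poor connection costs, the power burned at a contact is P = I squared times R; the contact resistances below are hypothetical, picked only to show the range:

    # Heat generated at a connection: P = I^2 * R.
    # Contact resistance values are hypothetical, for scale only.
    for r_contact in (0.001, 0.01, 0.1):     # ohms: good, mediocre, poor
        for amps in (2.0, 10.0):
            watts = amps ** 2 * r_contact
            print(f"{amps:4.1f} A through {r_contact:5.3f} ohm contact: "
                  f"{watts:5.2f} W of heat at the plug")

Ten amps through a tenth of an ohm is ten watts of heat at the plug blades, easily felt by hand, and that is the test described above.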
This is all measurable and audible. In the case of an amplifier, simply measure the full power output, the output impedance and the distortion. You'll find that they vary with the AC line voltage arriving at the amplifier. This should be totally non-controversial; I think the only reason it remains controversial is a knee-jerk reaction to the idea that a power cord can make a difference, combined with a failure to cause one's hand to move and actually make the measurement!
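A back-of-the-envelope version of that measurement, assuming (purely for illustration) an amplifier whose clipping point scales directly with the line voltage reaching it:

    # Hypothetical amplifier whose maximum output swing tracks line voltage.
    # All numbers are assumptions to illustrate the measurement, not data
    # from any real amplifier.
    import math

    nominal_line_v = 120.0
    nominal_peak_out_v = 40.0    # peak output swing at nominal line voltage
    load_ohms = 8.0

    for line_v in (120.0, 118.0, 115.0):   # e.g. sag across a thin power cord
        peak_out_v = nominal_peak_out_v * (line_v / nominal_line_v)
        full_power_w = (peak_out_v / math.sqrt(2)) ** 2 / load_ohms
        print(f"Line {line_v:5.1f} V -> full power roughly {full_power_w:5.1f} W into 8 ohms")

The specific numbers don't matter; the point is that the clipping point, and with it the distortion measured near full power, tracks the voltage that actually arrives at the power transformer.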