@atmasphere
Looking at @dep14 's calculator, it looks like for a typical home-use power cord (16/3 cable, 6' long) you're only losing about 0.6% of your voltage on a typical 120 V / 15 A home circuit. That seems well within the margin of error of typical home outlets, which can range from 110 V to 130 V in actual output.
Given that amplifiers typically run at a small fraction of their maximum output (only a few watts, except during very short dynamic peaks), how would such a small reduction in voltage actually be audible? If it were, wouldn't you see an even bigger difference between two homes where one's AC voltage averages 117 volts and the other's averages 121 volts?
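For what it's worth, that 0.6% figure can be sanity-checked from the standard wire table: 16 AWG copper is roughly 4.016 ohms per 1000 ft, and the current travels down the hot and back on the neutral, so the resistive path is twice the cord length. A quick sketch (the resistance value is the usual table figure at room temperature, not taken from the calculator itself):

```python
# Sanity check of the ~0.6% voltage-drop figure for a 16 AWG, 6-ft cord
# at the full 15 A branch-circuit load on a 120 V supply.

AWG16_OHMS_PER_1000FT = 4.016  # standard table value for 16 AWG copper, ~20 C
cord_length_ft = 6             # one-way cord length
current_a = 15                 # worst case: full branch-circuit current
supply_v = 120

# Hot + neutral round trip doubles the conductor length.
loop_resistance_ohms = AWG16_OHMS_PER_1000FT * (2 * cord_length_ft) / 1000
drop_v = current_a * loop_resistance_ohms
drop_pct = 100 * drop_v / supply_v

print(f"drop: {drop_v:.2f} V ({drop_pct:.2f}% of {supply_v} V)")
# prints "drop: 0.72 V (0.60% of 120 V)"
```

And that's at a continuous 15 A, which no home amplifier draws for more than an instant.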