I am curious where people found fault with the video that Paul77e posted. It seemed to me to be a reasoned argument, based on science, amplifier design, and hands-on experience.
I am not an audio designer, electrician, or physicist, but I did work for some years in electrical manufacturing of power regulating equipment, and I am hard pressed to imagine how a power cable upgrade could make a difference to sound, so long as it is capable of carrying sufficient current. And even if you were pushing enough current through it for its resistance to matter, isn't the worst result just a (very) slight voltage drop? A rough sketch of the math is below.
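Just to put a number on it, here's a minimal back-of-the-envelope calculation. The cord length, gauge, and current draw are assumptions I picked for illustration, not measurements of any real setup:

```python
# Rough voltage-drop estimate for a power cord (assumed values, not measurements).
# 14 AWG copper is about 2.525 ohms per 1000 ft; the round trip doubles the length.

RESISTANCE_PER_FT_14AWG = 2.525 / 1000  # ohms per foot, copper at ~20 C

def cord_voltage_drop(current_a: float, length_ft: float) -> float:
    """Voltage dropped across the cord: V = I * R over the round-trip length."""
    round_trip_resistance = 2 * length_ft * RESISTANCE_PER_FT_14AWG
    return current_a * round_trip_resistance

# Hypothetical 6 ft cord feeding an amp drawing 5 A
drop = cord_voltage_drop(current_a=5.0, length_ft=6.0)
print(f"Drop: {drop:.3f} V ({drop / 120 * 100:.2f}% of 120 V)")
```

Under those assumptions the cord eats about 0.15 V, or roughly 0.13% of the line voltage, which any decent power supply regulates right out.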
The whole thing about DIY cables using silver wire and hospital-grade plugs seems silly. Hospital-grade plugs are simply more robust, built to handle greater physical force from pulling, with a little more insulation to prevent contact shock; none of that changes the electrical path. And what possible difference could four feet of silver wire between the outlet and the component make when everything else from the generating station to the outlet is copper? It's not as though the electricity magically starts at the wall. (A quick resistivity comparison is below.)
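For scale, silver is only about 6% more conductive than copper. A small sketch using textbook resistivities; the 14 AWG gauge is a made-up but plausible choice:

```python
import math

# Resistivities at ~20 C, in ohm-meters (standard textbook values)
RHO_COPPER = 1.68e-8
RHO_SILVER = 1.59e-8

def wire_resistance(rho: float, length_m: float, diameter_m: float) -> float:
    """R = rho * L / A for a round solid conductor."""
    area = math.pi * (diameter_m / 2) ** 2
    return rho * length_m / area

length = 4 * 0.3048    # 4 ft converted to meters
diameter = 1.628e-3    # 14 AWG, assumed gauge for the example

r_cu = wire_resistance(RHO_COPPER, length, diameter)
r_ag = wire_resistance(RHO_SILVER, length, diameter)
print(f"Copper: {r_cu * 1000:.2f} mOhm, Silver: {r_ag * 1000:.2f} mOhm, "
      f"difference: {(r_cu - r_ag) * 1000:.2f} mOhm")
```

Over four feet that works out to about half a milliohm of difference, which is tiny next to everything else in the circuit.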
If folks hear a difference, great. That's all that really matters. But honestly, how many who heard a difference did a real double-blind test?
I would suggest that power quality is a much bigger issue for all electrical equipment, including audio. Most PQ problems are voltage-related, and most of those are sags. Even a very short sag can upset electrical equipment, especially digital switches and power supplies. Installing a real voltage regulator (very expensive) or a UPS with true sine-wave output would be a good investment.