agisthos,
Amir is the high priest of the ASR cult.
^^^ Not a very promising start.
Years ago, they were militantly claiming that changes to the power supply of DVD/Blu-ray players could not possibly have any effect on the video output, because bits are bits and the power supply does not affect the bits.
Given that HDMI has been the default transmission method for many years, that claim is correct. Luminance, like color and the other picture values, is encoded in the digital signal.
It is NOT determined by the power supply of a Blu-ray player. A pixel on the display either receives the information or it doesn’t. (And when it doesn’t, artifacts can occur, but they are not of the type you are describing - a rise in brightness level or color saturation across the displayed image.)
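The "either receives it or it doesn’t" point can be sketched with a toy model. This is a hypothetical illustration of how a digital link fails, not HDMI's actual TMDS encoding: each bit of a pixel value arrives as sent or gets flipped outright, so a degraded link produces discrete, conspicuous errors (sparkles, dropouts), never a gentle analog-style shift in brightness:

```python
import random

def transmit(pixel: int, bit_error_rate: float = 0.0) -> int:
    """Send an 8-bit luminance value over a simplified 'digital link'.

    Each bit either arrives exactly as sent or is flipped by noise;
    there is no mechanism for the value to drift up or down slightly."""
    out = 0
    for i in range(8):
        bit = (pixel >> i) & 1
        if random.random() < bit_error_rate:
            bit ^= 1  # a corrupted bit flips outright
        out |= bit << i
    return out

# A clean link delivers the value exactly -- "bits are bits".
assert transmit(200, bit_error_rate=0.0) == 200

# A hopelessly noisy link flips every bit: 200 (0b11001000)
# becomes 55 (0b00110111), a glaring error, not a subtle one.
assert transmit(200, bit_error_rate=1.0) == 55
```

Note that even a single flipped bit changes the value by a power of two (1, 2, 4, ... 128), which is why real-world HDMI faults show up as visible sparkle/dropout artifacts rather than a calibrated-looking change in levels.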
It was a theoretical argument,
LOL, no it wasn’t. It’s literally how digital video information works - which is shown in practice millions of times a day (at least). If it didn’t, we would be having a heck of a lot of image problems that we just aren’t having.
which was easily disproven by swapping an SMPS for an LPS (linear power supply), which on the meter would show visible changes in brightness levels, let alone color measurement differences and detail improvements.
I call B.S.
Show me the evidence you are speaking of where (I presume using HDMI) changing the power supply to a blu ray player measurably altered the brightness, color etc of the image.
I’ve been a member over at the ASR forum for 20 years, where professionals and dedicated enthusiasts have been exchanging notes on calibration - using all sorts of sensitive equipment - and NO ONE has reported anything like that, nor any measurable changes between two properly functioning HDMI cables. Because... that’s not how it works.
And it’s exactly this type of nonsense that makes us thankful there are people like Amirm around to test these claims.