@decooney Sure, but also note that the NASA email (granting that it's authentic) concerns a laser-diode application that bears little resemblance to audio-scale electrical properties.
So, the test is very simple:

1. Take a high-resolution audio stream of music (single tones/sweeps can come later).
2. Connect the streamer to an excellent DAC with a modern USB cable.
3. Connect the DAC's analog output to an excellent ADC using Test Cable A, and record.
4. Swap in Cable B (try a range from budget through expensive) and record again; repeat for each cable.
5. Align the digital captures and subtract them from one another using standard software tools.

Any residual signal represents differences between the cables. If the residual spectrum is bass-rich, the subjectivists might call that "tubby"; more high-frequency content might be called "more resolving." Fair enough! A sketch of the align-and-subtract step follows below.
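For anyone who wants to try the subtraction step themselves, here is a minimal Python sketch, assuming numpy, scipy, and soundfile are installed and both captures share a sample rate. The file names are placeholders, and level matching uses a simple least-squares gain; dedicated tools like DeltaWave do this far more thoroughly.

```python
import numpy as np
import soundfile as sf
from scipy.signal import correlate

# Load the two captures (hypothetical file names).
a, fs_a = sf.read("cable_a.wav")
b, fs_b = sf.read("cable_b.wav")
assert fs_a == fs_b, "captures must share a sample rate"

# Mono for simplicity in this sketch.
a = a[:, 0] if a.ndim > 1 else a
b = b[:, 0] if b.ndim > 1 else b

# Align via cross-correlation: find the lag that best matches b to a.
lag = np.argmax(correlate(a, b, mode="full")) - (len(b) - 1)
if lag > 0:
    b = np.concatenate([np.zeros(lag), b])
else:
    a = np.concatenate([np.zeros(-lag), a])
n = min(len(a), len(b))
a, b = a[:n], b[:n]

# Match levels (tiny gain differences would otherwise dominate the null),
# then subtract. The least-squares gain is the dot-product ratio.
gain = np.dot(a, b) / np.dot(b, b)
residual = a - gain * b

rms = lambda x: np.sqrt(np.mean(x**2))
print(f"Residual: {20 * np.log10(rms(residual) / rms(a)):.1f} dB relative to signal")
sf.write("residual.wav", residual, fs_a)  # listen to / analyze the difference
```

A deep null (residual many tens of dB down) means the cables are, for practical purposes, passing the same signal; a shallow null is where the interesting analysis starts.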
OK, so the above scenario covers analog XLR interconnects. We can also test speaker cables by getting a calibrated microphone like the UMIK-1 from miniDSP and recording the track again through the streamer > DAC > amp > speaker chain. Swap cables and compare again to see whether the "obvious" difference folks report actually shows up.
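Acoustic captures won't null as cleanly as electrical ones (room noise, mic position), so a frequency-response comparison is the more realistic check. Here's a rough sketch using Welch PSDs, again with numpy/scipy/soundfile and placeholder file names; REW will do the same thing with a GUI.

```python
import numpy as np
import soundfile as sf
from scipy.signal import welch

def psd_db(path, nperseg=16384):
    x, fs = sf.read(path)
    x = x[:, 0] if x.ndim > 1 else x
    f, pxx = welch(x, fs=fs, nperseg=nperseg)
    return f, 10 * np.log10(pxx + 1e-20)

f, spk_a = psd_db("speaker_cable_a.wav")
_, spk_b = psd_db("speaker_cable_b.wav")
delta = spk_a - spk_b

# If one cable really is "tubby" and another "resolving", the delta
# curve should show level differences in the bass or treble bands.
for lo, hi, name in [(20, 200, "bass"), (200, 2000, "mids"), (2000, 20000, "treble")]:
    band = (f >= lo) & (f < hi)
    print(f"{name}: mean delta {delta[band].mean():+.2f} dB")
```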
Do the same kind of thing for power cables, power conditioners, Ethernet cables, etc. There may be differences! The SPDIF cables from 2013/2017 mentioned in posts above did show flaws, but SPDIF is notorious for jitter issues, especially in the old days.
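Jitter in particular is easy to screen for: record a pure tone through the SPDIF path and look for sideband energy around the fundamental, loosely in the spirit of the J-test. A rough sketch (numpy/soundfile assumed; file name and tone are placeholders):

```python
import numpy as np
import soundfile as sf

# Capture of a pure test tone (e.g. 11.025 kHz at 44.1 kHz sample rate).
x, fs = sf.read("spdif_tone_capture.wav")
x = x[:, 0] if x.ndim > 1 else x

n = len(x)
spec = np.abs(np.fft.rfft(x * np.hanning(n))) / n
freqs = np.fft.rfftfreq(n, d=1 / fs)
spec_db = 20 * np.log10(spec + 1e-20)

tone = freqs[np.argmax(spec_db)]      # the recorded test tone
exclude = np.abs(freqs - tone) < 50   # ignore the fundamental's skirt
worst = np.argmax(np.where(exclude, -np.inf, spec_db))
print(f"Tone at {tone:.0f} Hz; worst spur {spec_db[worst] - spec_db.max():.1f} dBc "
      f"at {freqs[worst]:.0f} Hz")
```

Jitter-induced sidebands show up as spurs symmetric around the tone; a clean path keeps them well over 100 dB down.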
We need some exciting results for the other categories of cables!