If MQA were actually an ADVANCEMENT for the digital media distribution industry, it would have provided meaningful compression of digital files WITHOUT ANY PERCEPTIBLE LOSS OF CONTENT INTEGRITY/FIDELITY. We all know that 16-bit and, even more demonstrably, 24-bit precision is largely wasted on real-world signals. The full dynamic range of a 24-bit system works out to roughly 144 dB!!! That's enough to make your ears bleed!

So clearly there is room in the marketplace for a coding scheme that scans a digital file, records the dynamic peaks in the content, and adjusts bit precision accordingly to fit that individual file's actual needs before encryption or compression takes place, making transfer to the intended target more efficient. A new industry-wide digital standard could place dynamic-range information in a predetermined location in the media content, signaling to adaptive encoding/decoding equipment which algorithm to use to fully accommodate the file without padding it with a whole bunch of 1's and 0's that never change before it is transmitted or stored on media.

MQA could have done this and dispensed with the entire fraudulent "time correction" BS. That would have given them some factual justification for existing, being able to legitimately say: you now have lossless transmission that is more efficient than the current standard. Unfortunately, they lost all credibility when the actual measured response data showed degradation of the signal (no longer lossless) while they were claiming WITHOUT ANY PROOF WHATSOEVER that time-domain errors allegedly inherent to PCM were being fixed.
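For what it's worth, the per-file bit audit described above is not exotic; FLAC already does something similar per block with its "wasted bits" mechanism. Here's a minimal sketch of the idea in Python. The function name `effective_bits` and the block interface are mine, not any existing API; it just measures how many bits of a fixed 24-bit container a batch of signed PCM samples actually uses, accounting for both unused headroom at the top and padded zeros at the bottom.

```python
def effective_bits(samples: list[int], container_bits: int = 24) -> int:
    """Estimate the bit depth actually needed for a block of signed PCM samples.

    Two kinds of waste in a fixed container (e.g. 24-bit frames):
      * headroom -- the peak never reaches full scale, so the top bits never toggle
      * padding  -- e.g. a 16-bit master stored in 24-bit frames leaves the bottom
                    8 bits permanently zero (what FLAC calls "wasted bits")
    """
    if not samples or all(s == 0 for s in samples):
        return 1  # pure silence: one bit is plenty

    # Bits needed to reach the peak, plus one sign bit.
    peak = max(abs(s) for s in samples)
    top_bits = peak.bit_length() + 1

    # Trailing zero bits shared by EVERY sample can be shifted away losslessly.
    or_of_all = 0
    for s in samples:
        or_of_all |= abs(s)
    wasted = (or_of_all & -or_of_all).bit_length() - 1

    return min(top_bits - wasted, container_bits)
```

For example, a quiet 16-bit signal padded into 24-bit frames (samples left-shifted by 8) would report far fewer than 24 effective bits, which is exactly the information a smarter encoder could store up front instead of shipping the dead bits.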