A great resource to help you answer your question is Mark Waldrep's blog:
https://www.realhd-audio.com/

While Mark is a source of controversy for some in the industry, on the whole he is not wrong in what he discusses. I don't fully agree with him regarding high-performance cables; however, he is astute in observing that many cable manufacturers are simply making ludicrous claims.
In a nutshell, it comes down to the provenance of the file: the closer one is to the "actual" original master copy of the recording (whether on tape, direct-to-disc, or digital), the truer the sound will be to the original recording.
The album you mention was recorded in 1986. I'm guessing here without doing much research, but it was likely an analog tape recording, or possibly an early digital recording. As there was no 24-bit/96kHz digital recording process at the time, any 24-bit/96kHz file of this recording would be either:
A) converted from analog, using the original master or an auxiliary copy (which could be several generations/copies removed from the original), or
B) padded from the original bit depth (14 or 16 bits) to 24 bits, and re-sampled from the original clock frequency to 96kHz.
A) is not a bad option, assuming the master tape or an immediate copy (a secondary/backup master) was the source of the 24-bit/96kHz file. However, the A/D conversion can be performed by a myriad of different products with widely varying performance, so there is that factor to consider. If the tape used during the A/D conversion was a few copies removed from the original, you can immediately see the potential for loss of information with each successive generation of tape. Seeing as this release has something like 30 different vinyl versions alone, each pressing plant would have had either a copy of the tape or a master disc. Since there were so many different standards for cutting lathes at pressing plants at the time, who knows what happened there.
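To make the generational-loss point concrete, here is a toy sketch. The model is my own simplifying assumption (not measured tape data): suppose each analog copy adds an independent layer of noise equal in power to the first copy's noise floor. Noise powers then add, so the signal-to-noise ratio falls by 10·log10(N) dB after N copy generations.

```python
import math

# Toy model (an assumption for illustration, not measured data):
# each copy generation adds an independent, equal-power noise layer,
# so total noise power grows linearly with the number of generations.

def snr_after_copies(first_copy_snr_db: float, generations: int) -> float:
    """SNR (dB) remaining after `generations` equal-noise copy steps."""
    return first_copy_snr_db - 10 * math.log10(generations)

print(round(snr_after_copies(70.0, 1), 2))  # 70.0  (first-generation copy)
print(round(snr_after_copies(70.0, 4), 2))  # 63.98 (four generations down)
```

Real tape chains degrade in messier ways (frequency response, wow/flutter, dropouts), but the direction is the same: every generation only loses information.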
B) looks like a worse option to me because I don't believe there were many digital recording systems in 1986 running at even a 48kHz sample rate; it's certainly possible, but I'd need to do some digging to confirm, as it's my understanding that 48kHz sampling came to the fore in TV and cinema before it made its way to music.
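And here is why option B adds nothing by itself: converting a 16-bit sample into a 24-bit container is just padding with zero bits. A minimal sketch (the sample value is arbitrary):

```python
# 16-bit -> 24-bit PCM padding: the sample is shifted into the top of a
# 24-bit word, so the 8 new least-significant bits are all zero.
# No musical information is created; the round trip back is lossless.

def pad_16_to_24(sample: int) -> int:
    return sample << 8  # append 8 zero bits

def truncate_24_to_16(sample: int) -> int:
    return sample >> 8  # drop the padding again

original = 12345                       # an arbitrary 16-bit sample value
padded = pad_16_to_24(original)
assert padded & 0xFF == 0              # the low byte is all zeros
assert truncate_24_to_16(padded) == original  # nothing gained, nothing lost
```

The sample-rate half of the conversion is similar: an interpolation filter can create new sample points, but not new content that the original capture never held.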
At any rate, my guess is that the files Qobuz are using are sourced from a master which differs from the one your CD was made from, and that's why you are hearing differences. Of course, all of this becomes more complicated when you get into things like re-mastering and subsequent releases of the same title (many, many remasters sound a LOT worse than the original, but then again there are others that sound better).
High-res audio is a marketing term, and while I know and have experienced how much better a 24/192 captured digital file can sound, ultimately the source file is limited by the resolution of the equipment used at the time, and there is no way to improve upon that. You can make things sound "different" or "better" using re-mastering techniques, but only in the hands of a competent and judicious sound engineer.
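To put rough numbers on that ceiling, two standard textbook formulas apply: the theoretical dynamic range of linear PCM is about 6.02 dB per bit plus 1.76 dB, and the highest frequency a digital recording can contain is half its sample rate (the Nyquist limit). A 24/192 container is bigger, but a transfer can never hold more than the source did:

```python
# Theoretical limits of linear PCM (standard textbook formulas):
#   dynamic range ~= 6.02 * bits + 1.76 dB
#   highest representable frequency = sample_rate / 2 (Nyquist)

def dynamic_range_db(bits: int) -> float:
    return 6.02 * bits + 1.76

def nyquist_hz(sample_rate_hz: int) -> float:
    return sample_rate_hz / 2

print(round(dynamic_range_db(16), 2))  # 98.08 dB  - a 16-bit master's ceiling
print(round(dynamic_range_db(24), 2))  # 146.24 dB - the 24-bit container's ceiling
print(nyquist_hz(44_100))              # 22050.0 Hz - nothing above this survives
# Re-sampling a 44.1kHz source to 96kHz cannot restore content above 22.05kHz.
```

So a 24/96 transfer of a 1986 master inherits whatever limits the original capture had; the bigger container just carries them more comfortably.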
Think about a movie originally captured on film and released on Blu-ray. Can the Blu-ray ever improve on the original film in its pristine state? I don't think so. However, enhancements can make the viewing experience subjectively "cleaner" or "better" by using digital processes to reduce film grain, artifacts, and noise, for example. Someone who is good at this will do a great job and likely deliver a better-looking movie for most consumers, and it's certainly more convenient. But people who don't care, or aren't particularly adept at their craft, might royally screw things up in the process, and you end up with a funky, cartoonish mess. It's very similar in the audio world when it comes to this topic.