I haven't used many audio rippers, but EAC reports how many errors occurred in the rip and whether any error correction was applied. That said, 99% of my audio ripping has never required error correction to be applied, even with high-speed ripping.
My friend, who is a tech support engineer at Sanyo supporting CD-ROM products (for customers who design equipment), has said that bit-perfect reads are commonplace now.
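You can check the bit-perfect claim yourself without trusting the drive's error reporting: rip the same track twice (ideally on two different drives) and compare the resulting files byte for byte. A minimal sketch in Python (the file paths are placeholders, not real rips):

```python
import hashlib

def file_sha256(path, chunk_size=1 << 20):
    """Stream a file in 1 MiB chunks and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def rips_match(path_a, path_b):
    """Two rips of the same track are bit-identical iff their digests match."""
    return file_sha256(path_a) == file_sha256(path_b)
```

If two independent rips hash identically, the audio data was read bit-perfectly (or, vanishingly unlikely, corrupted identically both times) — which is essentially what tools like AccurateRip automate by comparing your rip against other people's.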
Imagine: if reading a CD is hard, how much error correction would be needed for SACD or Blu-ray? It just doesn't make sense to assume that a CD player at 1x speed can't accurately read a disc when a CD-ROM drive at 2x or higher can, without correction.
Some things in that 6moons article may be right, such as the power supply affecting the servo and diodes (which can be mitigated with proper design), but I'd have to argue that CD is digital. It is, after all, a representation of a decimated signal that came out of an ADC.
Having said all that, I don't believe anyone knows why digital components — transports, digital ICs — can sound different, if anyone is interested. Once you accept the bit-perfect argument, jitter is the only parameter we know of that could explain the differences. Which is why hi-fi manufacturers don't want us to think that way.