In defense of ABX testing


We audiophiles need to get ourselves out of the Stone Age, reject mythology, and say goodbye to superstition. Especially the reviewers, who do us a disservice by endlessly writing articles claiming that the latest tweak or gadget revolutionized the sound of their system. Likewise, any reviewer who claims that ABX testing is not applicable to high-end audio needs to find a new career path. As with anything, there is a right way and many wrong ways. Hail Science!

Here's an interesting thread on the hydrogenaudio website:

http://www.hydrogenaud.io/forums/index.php?showtopic=108062

This caught my eye in particular:

"The problem with sighted evaluations is very visible in consumer high-end audio, where all sorts of very poorly trained listeners claim that they have heard differences that, in technical terms, are impossibly small or nonexistent.

The corresponding problem is that blind tests deal with this problem of false positives very effectively, but can easily produce false negatives."
psag
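The false-negative point in that quote is easy to make concrete. As an illustration of my own (nothing from the thread — the 70% figure is made up), model each ABX trial as a forced choice that a genuinely discriminating listener calls correctly with probability 0.7, and ask how often such a listener would fail a short test that guards against false positives at the usual 5% level:

```python
from math import comb

def p_at_least(k, n, p):
    """Probability of getting k or more correct in n independent trials,
    when each trial is answered correctly with probability p."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

def passing_score(n, alpha=0.05):
    """Smallest number correct such that a pure guesser (p = 0.5)
    scores that high with probability at most alpha."""
    for k in range(n + 1):
        if p_at_least(k, n, 0.5) <= alpha:
            return k

# A hypothetical listener who genuinely hears the difference,
# but only calls it correctly 70% of the time:
for n in (10, 16, 40):
    k = passing_score(n)
    power = p_at_least(k, n, 0.7)
    print(f"{n} trials: pass needs {k}+ correct; "
          f"a real 70% listener passes about {power:.0%} of the time")
```

With 10 trials this listener fails roughly 85% of the time — a false negative — while 40 trials catch the same real difference far more reliably. Short blind tests are stingy with false positives but generous with false negatives, which is exactly the quoted caveat.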
After my last post, something tells me I won't get anywhere, but here goes.

"If tests are not done scientifically and are based only on 'opinions,' they really aren't real to me. How does one measure whether the equipment accurately demonstrated the soundstage depth? Dimensionality? Etc. I hear many opinions from the reviewers, but based on what? What criteria? Are you going by memory in your opinions and comparisons? Or did you listen intently and then switch out that amp for another (without changing anything else) and listen again?

I have read some reviews that do exactly that. And the equipment they are reviewing is compared to similar equipment within the price point. That is all right by me. But, I still prefer an A/B comparison test that is blind to really identify the sonic differences in an unbiased way."

In the first paragraph, you're talking about subjective qualities that the reviewers are discussing. We all know that those qualities can't be measured, so what would you have the reviewer do? We're supposed to be adults here. When I read a review, it's not too difficult to pick out the things that are purely subjective in nature. Yes, they are listening to the component and writing their subjective opinion as to what they heard. Here's the one detail that many people miss: most of the people who read the reviews, the magazine's customers, know this is how it's done, and that it's not a perfect process, but they still want the review anyway. And why not? Why do you think they bought the magazine to begin with?

This caught my eye in particular.

"But, I still prefer an A/B comparison test that is blind to really identify the sonic differences in an unbiased way."

You say that you prefer this type of blind testing as if there are reviewers actually doing it. I've never seen any reviewer do this. Where are you finding them? I'm more than willing to give them a chance. If they can show me some testing that helps me make a better decision, I'm all for it.
Post removed 
"Your last post leaves me with the impression that you do not think there are differences in cables and therefore they cannot be heard, especially since you got defensive when I asked for the results of your controlled listening tests. Beats me why you wouldn't want to disclose the results."

You're allowed to have any impression you like. It has nothing to do with me. It's your choice, not mine. As for the reason I don't want to disclose the results, I already gave it; it was clearly stated in my last post. Here it is again.

"And as to the results of the tests, they're not relevant to this discussion. You only want me to list the results so you can comb through them to find the slightest detail, just so you can claim the whole thing is null and void and you get to be right."
Zd542, I get what you are saying in response to my post. The closest I have seen is what I mentioned: reviewers listening to one piece with some music, then swapping it out for another piece, without changing anything else, and listening again. My earlier point, using the wine example, was that blind A/B testing would show that most reviewers (not all) have no clothes, and they can't have that. So the best I can hope for these days is what I mentioned earlier.

However, companies respond to letters, and not so much to phone calls and posts on chat boards. So maybe more letters to the magazines requesting blind A/B testing would help.

enjoy
I agree that only a limited number of switches are needed, if the test conditions are good. What are good test conditions? A treated room with good acoustics, high-quality electronics, well-recorded music, the ability to do rapid switching (having a second person manipulate the hardware helps), and familiarity with the musical selections. That's all you need to eliminate subjectivity and get to the truth.
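Since the thread keeps circling the mechanics of a blind A/B (really ABX) trial, here is a minimal sketch of how a session could be run and scored. This is purely my own illustration, not anything described in the thread: the `listener` function and its 70% accuracy are hypothetical, and the p-value is the standard one-sided binomial probability of doing that well by guessing.

```python
import random
from math import comb

def binomial_p_value(correct, n_trials):
    """One-sided p-value: probability of scoring `correct` or better
    in n_trials forced-choice trials by pure guessing (p = 0.5)."""
    return sum(comb(n_trials, k)
               for k in range(correct, n_trials + 1)) / 2 ** n_trials

def run_abx(n_trials, answer, seed=None):
    """One ABX session: each trial, X is secretly A or B; `answer(x)` is
    the listener's guess ('A' or 'B'). Returns the number correct."""
    rng = random.Random(seed)
    return sum(1 for _ in range(n_trials)
               if (x := rng.choice("AB")) == answer(x))

# Hypothetical listener who correctly identifies X 70% of the time
# (a made-up figure for illustration only):
rng = random.Random(0)
def listener(x):
    return x if rng.random() < 0.7 else ("B" if x == "A" else "A")

score = run_abx(16, listener, seed=1)
print(f"{score}/16 correct, p = {binomial_p_value(score, 16):.3f}")
```

The key design point is the one the thread is really arguing about: the hidden assignment of X is randomized per trial and never visible to the listener, so expectation and brand loyalty can't leak in; only the count of correct answers, judged against chance, decides the result.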