What should be mandatory in every professional published review-


When testing a company's newest amp, preamp, etc., and it is a refinement of a prior product that was on the market (i.e., a Mark II, an SE version, a .2, etc.), it should be mandatory that the review include a direct comparison with the immediate predecessor. IMHO, it's not enough to know if the product is good; it's also important to know whether there is a meaningful difference from the immediate predecessor.

I'm a fan of Pass Labs, and I just looked at a review of an XP22 preamp. I find it very disturbing that there was no direct comparison between the XP22 and the XP20. And this lack of direct comparison is ubiquitous in high-end published reviews, across all brands of gear tested. I don't blame the gear manufacturers, but rather the publications, as I view this as an abdication of journalistic integrity.


Opinions welcome- 

zavato

+1 @jl35 

At this point it is safe to say that most reviews are flawed and not a definitive reference for purchasing anything. Reviewers are constrained from mentioning anything remotely detracting about the sound and in fact barely describe the sound character at all. They simply cannot bite the hand that feeds them.


That was not my experience. In 15 years of reviewing I was never constrained from saying anything negative. Over that time I only wrote one negative review, not because I was constrained but because if a product reaches the level where it gets a review, it has likely gotten very good feedback or comes from an established company that knows what it's doing and doesn't produce bad-sounding products. Point is, reviewers almost without exception get gear that sounds good, which is the main reason you don't read many negative reviews. Reviewers don't want to review crap products, and magazines/sites don't want to review them either. Plus, in this day and age it's hard to even find something that just sounds bad. That said, a reviewer should absolutely point out areas where a product might be a bit compromised, which is why I always found product comparisons to be the most interesting part of any review, as they provide extremely useful context for the reader.

Most of the text in a review has nothing to do with how it sounds. It is mostly fluff: company history, the new technology and why it should sound better, room and system setup, and maybe a little about how it sounds.

This I absolutely agree with, and it has been a pet peeve of mine for a while. Too many reviews contain 80%-90% background info, specs, etc. and dedicate only a couple of paragraphs to how something sounds. Plus, in this day and age, when anyone can go to a company website and get a lot of this information, I'd rather a review just include a link for the component under review and refer the reader there for certain info, unless there's something notable about a product's design, specs, etc. that warrants further explanation. Anyway, that's my take FWIW.


Note that reviewers live in a world of allegedly quantifiable differences, as they have access to myriad items that the great unwashed do not. That said, your ears and mine hear different things, and I for one have proven to myself the value of my own tastes above all, as I tend to use my own earballs to make audio decisions. Example: the new Klipsch Heresy IV. I bought a pair and decided that although they are somewhat better built than the IIIs I already had (better speaker mounting screws and binding posts, ports lowering the bass frequencies by 10Hz or so, AQ wiring), the IVs' redesigned midrange horn (simplified poly drivers replacing titanium full-throated drivers) had a high-mid frequency bump that sort of yelled out. I couldn't stand it, so I sold 'em and am happy with my lovely IIIs. I also sent back some well-regarded Sonists and a pair of pretty ZU Omens that were sort of awful. How anybody likes Dirty Weekend speakers is a mystery, but I don't really care, and hey, what do I know anyway? All well regarded with great reviews, but until you have them in your listening space you simply don't know. For me, reviews should be well written and entertaining, with the expected grains of salt available in any reader's head.

Note that reviewers live in a world of allegedly quantifiable differences, as they have access to myriad items that the great unwashed do not. That said, your ears and mine hear different things, and I for one have proven to myself the value of my own tastes above all, as I tend to use my own earballs to make audio decisions.

@wolf_garcia Exactly. And that's why the product comparisons in reviews are so critical (and why I hate TAS reviews, since they don't bother to include them): they give you relative context that, at least to some degree, lets you calibrate your hearing against the reviewer's.

There is no way to "calibrate" your hearing... you simply have to hear things yourself because, as I thought I previously stated, generally you not only haven't heard the item being reviewed at length in your system, you likely haven't heard any of the other items it's compared to, again, at length in your system. What was that quote? "Writing about music is like dancing about architecture." Maybe from Martin Mull... in any case, again, if the writer is interesting I feel lucky, but there is no proof unless I can get my hands on the pudding.

Interesting thread.
Those folks who have the "immediate predecessor" product might be able to glean certain insights from a new product review. Those insights may come from a variety of sources, such as the reviewer's personal biases in reproduced sound (and whether they resonate with those of the reader) as well as prior reviews. For example, I like a lot of what PHD of TAS reviewed because I have biases similar to his. It's just a flavor for what might be of interest to me if I were in the market for that particular item covered by the reviewer. I suspect reviewers develop a faithful following based on the contents of their reviews, regardless of the product comparisons they might make. So if the OP demands certain requirements of the reviewers that he follows, that's his preference. But every reviewer has a different audience, and audiences differ in what they want from a given product review and reviewer. Mandatory requirements? Maybe for certain audiences, yes, but by no means all audiences.