Tuesday, 10 April 2012

Best In Test?

Gear: we all love it, don’t we? Let’s face it, after talking about where we’ve been, it’s probably the number one topic of conversation. And every month, page after page of magazine or blog space is dedicated to discussing the finer points of this versus that.

Increasingly bloggers are contributing a wealth of information about products and how they perform, mainly through the medium of the single item test. The advantage of this is that an in-depth review can be formulated based on extended use of the item in question, usually tested in real-life situations.

Very occasionally a blogger may get sent an item for testing from an obliging source. Normally this comes with the proviso that the retailer or manufacturer concerned receives a name check and, in some cases, that the product is returned afterwards (for example if it’s a prototype). However, in spite of the apparent generosity of the suppliers, most bloggers manage to remain impartial in their findings, and will offer balanced reviews as far as their experience will allow them.

However, despite the advantages outlined above, there is a downside to such testing: it is rarely able to be comparative. I don’t know about you, but I don’t have enough spare cash to go out and buy all the latest gear with the sole intention of seeing how good one product is against another! For this, we rely on the group test.

In the main, group tests are the preserve of the big circulation magazines: those organs of the press with sufficient penetration to reach thousands of prospective buyers, and whose pronouncements are capable of making – or breaking – a product. Rarely, if ever, will an individual blogger get the opportunity to test all the latest releases against one another in a comparative test.

Obviously, with the increased number of bloggers contributing on-line individual product reviews, the position of the professional reviewer has subtly altered (by “professional” I mean those who work for the magazines in question). However, the professional reviewer is still in a position of some responsibility given that their views will be widely read and may, ultimately, influence the development of future product.

The question is: do they wield this responsibility … er … responsibly?

Around a year ago, two magazines were reviewing lightweight waterproof jackets. As I was reading I noticed something odd – the same jacket was awarded a 100% score from one magazine (and given “Best In Test”) yet scored only 60% in the other, just about the lowest mark given. How could this be? The same jacket assessed for the same purpose. What were the circumstances that threw up two such diverse summations of the same product?

Of course, this happens all the time. We all have our own favourites, whether we are judging baked beans, washing powder, cars, TV shows or whatever. It’s only natural: we are all different. But it got me thinking: surely, part of the reviewer’s job in a group test is to put personal preference to one side and determine what might be a good purchase for ALL potential users/buyers?

Now I’m not suggesting any bias here, or any intention on their behalf to deliberately mislead, simply that sometimes these reviews may not be as impartial as we are given to understand and that, even with the best of intentions, personal preferences might cloud the tester’s objectivity.

Amongst this month’s crop of outdoor mags I came across another gear test that threw up a couple of anomalies. For the sake of the magazine and reviewer in question, I won’t make too detailed a reference. What I will say, though, is that both have respectable credentials amongst the outdoor fraternity, and both are eminently more qualified to discuss the merits of a product than I am myself.

However, as far as I could see, it became quite clear, quite early on, that a particular perspective was being applied to the test – and one that not every potential purchaser of the product might apply. As a result, it is my own personal opinion that one or two decent products were unfairly marked down and at least one product with an identified (and possibly major) flaw was scored second only to Best In Test. That can’t be right, can it?

Sadly, it’s quite usual to see decent products unfairly pilloried. Often, this is because a product fails to meet some criterion the reviewer has selected for themselves. Take weight, for example, one of the biggest and most arbitrary culprits, with one version being deemed too “heavy” and another “light” enough when, in reality, there are only a few grams between them. The stupid thing is that very few products are “heavy” at all these days – just compare them to those of 10 or 20 years ago!

Of course, it is much harder to slag something off if you have paid good money for it (unless it’s a complete turkey), as it suggests bad judgement on the part of the purchaser. Equally, though, it is easy to do so if you haven’t had the anguish of parting with your cash.

So what does all this tell us? Very little, really, except that human nature is what it is, and will often prevail no matter how hard we try to be impartial. But it does illustrate two things to be mindful of when assessing group tests: one, use any reviews as a guideline only and, especially on big-ticket items, try before you buy; and two, it is much better to express conclusions as “I don’t like this” (opinion) rather than “It is bad” (statement of fact), because it may well be good for someone, sometime.

This might seem perfectly obvious, but with so much more gear being bought on-line, and with an increasing preponderance of stores with only quite basic product knowledge, doing your own homework and understanding what YOU want out of a product seems more essential than ever.

As I said before, I'm sure there is no intent to mislead. It's just that there are so many more commentators out there that buyers and users are getting a bit more clued up these days. I may have no engineering qualifications or wide knowledge of the whole outdoors product market, but I have been walking up and down the UK and abroad for 35 years or more, so can claim to have some experience of what may or may not work! I'm sure there are thousands of others with similar experience too, all of whom can add to the cache of information we can call on.

So, are gear tests providing you with the information you need?

What do you think?

14 comments:

  1. Excellent blogpost Jules!! :)

    Funny I was just typing up something similar on my blog. I'll edit it to include yours if that's ok? It's all relevant.

    ReplyDelete
  2. Hi Terry

    Yes, of course - no problem! Feel free.

    I'll keep an eye out for your post, too (I've got a bit of catching up to do!).

    ReplyDelete
  3. Thanks Jules. It just ties up nicely you see. Cause I cover some of your points with some 'subtle' answers and observations I've made over the past couple of years with regards to the outdoors industry and media :)

    ReplyDelete
  4. I generally use the magazines as a guide to what's out there, but I'm human and can be swayed. Though I've noticed that most items are given fairly high marks, so they have to be seen as much of a muchness! I probably pay more attention to items given rubbish marks.

    However, after saying all that, I've seen some reviews of kit I already have and, whilst reading an indifferent review, think to myself: what the hell are they on about!

    ReplyDelete
    Replies
    1. I think we have probably all had an example of products we like being rubbished in the press. I'm not saying I'm right and they're wrong, just that I'm right for me!

      Delete
  5. I am always annoyed when a tester tells me an item is "too expensive".

    I'll be the judge of that. All I want to know is if it is any good or a turkey. I'll make my own value judgement.

    ReplyDelete
    Replies
    1. Quite. And the same goes for weight, fit, etc. How do you know what shape I am? Take Mountain Equipment jackets, for example - for me, they just seem to hang funny and feel uncomfortable on me, and I don't find their hoods cinch in the right way to get a good fit, either. So no matter what the mark, I'm not going to buy it. On that basis I could give their stuff bad marks, but it could be cracking gear for someone else - and is, by all accounts!

      Delete
  6. Some good points well made. I've thought this before. There are some brands that, in my opinion, get preferential opinions and awards. I'm not saying some of them aren't deserved but...

    ...you know what I mean??

    I will probably still continue to buy these outdoor publications. I agree with you and had a couple of discussions on twitter about the value of their opinions. I will only ever use them as guidance. Like you say experience and seeing them physically is the most important answer.

    I always go to bloggers first since I found this community. Thanks, Davy

    ReplyDelete
    Replies
    1. Hi Davy.

      Thanks for the comment!

      Like you, I will probably continue to keep an eye on the outdoor publications - for research purposes, if nothing else, as they have access to a lot of info reasonably hot off the presses, which we (the blogger community) might not.

      I probably don't read any of the mags often enough (with one exception) to know what their current favourites are, although I've certainly noticed trends in the past - or the glaring omission of some manufacturers in gear tests!

      Delete
  7. Good points, Jules. I think you have to go beyond the conclusions of the reviewer sometimes and judge the items being reviewed in the context that you would be using them. I find the comparative data given on various competing products very useful. However, the reviewers can't report very well on durability, so that's a matter for judgement, and that's also where our 'used and abused' reviews as bloggers can be useful to others.

    ReplyDelete
    Replies
    1. Yes, durability is one area where magazine gear tests always stand a chance of being bettered by blogger reviews, and the sort of feedback provided this way can prove very useful to prospective buyers.

      Comparative stats are good, and this should be a plus for group tests - bringing such info together in one place for easy assessment, and allowing the reviewer to distil and compare the essence of each product.

      What concerns me, though, are the conclusions drawn by the reviewers and whether they are suitably free from personal preferences to be of value to prospective buyers, especially those who may be a little less familiar with the product in question in the first place.

      Delete
  8. I hear what you are saying about weight not being the only criterion. But one of the major mags embraced the lightweight ethic a few years ago and the other major one didn't. Perhaps they did not want to upset their advertisers, who have only recently started to produce lightweight stuff. In fact I remember looking in disbelief at a rucksack best buy which weighed 2.5 kg at a time when the other mag was telling us about sacks of 1 kg and less.
    I have noticed recently that some mag reviews don't include all possibilities, like the trail shoe review that didn't mention inov-8's - surely one of the most popular brands. Presumably even they are relying on free test stuff.

    ReplyDelete
    Replies
    1. I think embracing the lightweight philosophy (or not) is entirely up to the individual publication - some go with it, some don't. And that is fine as long as it is clear to the reader.

      We have all spotted discrepancies like the one you mention about the rucksacks, and I guess they will continue to crop up. What worries me more is when (made up example coming up, but you'll get the point) a sack weighing 1.3kg is lauded as "light" and one at 1.4kg slaughtered for being "heavy". My contention is that at those minimal weight differences, it depends entirely on the comfort and balance of the sack on the individual as to which is the best, as well as performance and build characteristics. Weight alone is too blunt an instrument by which to judge a product, yet that prejudice surfaces again and again in reviews.

      Also, your comments about upsetting advertisers and omitting certain brands (or, more worryingly for the mags, brands choosing not to have their products reviewed that way) is perhaps getting right to the crux of the issue.

      Delete
  9. 'Best in test' is almost always a ridiculous concept in itself - obviously. It depends on personal requirements and preferences, yet it's used almost universally, they simply must name a 'winner' to appeal to simpletons.
    Durability will always be a big problem, even when testing is done over an extended period: I've seen the way some people treat their belongings, and product failure sometimes has nothing to do with its quality!

    ReplyDelete