September 10, 2013

Cricket Magazine Bat Tests


When it comes to buying your new bat for the season ahead, are the cricket magazine bat tests of any practical value? This question passed through my mind when reading the 2013 gear test in All Out Cricket (AOC). I had my doubts. So, with curiosity piqued, I looked at that test alongside previous years' tests from AOC and The Cricketer to try and answer my question on value.



In these magazine tests, mass-produced bats from big brands are compared equally alongside low-volume and 'custom' bats from small and niche companies. High-end G1 bats costing well over £400 are set against mid-range G2/G3 bats costing less than £200. There appear to be no criteria specified to bat makers for their submission, such as bat weight and grade. Testers can be a mix of pros and club cricketers (AOC), or club cricketers and magazine staff (The Cricketer). They are all asked for a subjective view on Performance, Pick-up/Feel, and Looks/First Impressions. Therefore, much like Tim Bresnan's batting, the results are going to be highly variable. So are the comparisons they make meaningful?

Firstly, I assumed that all bat makers would attempt to submit their best bats. These will not necessarily be representative of the quality through a complete range, or a batch of willow. Therefore, you could view the tests as an even field of unrepresentative bats, compared to what you might have on offer at your local retailer or buy 'pot-luck' online. However, a curious comment on a Custom Bat Forum discussion about the AOC 2011 test came from a bat maker who said they submitted a grade 3 'dog' of a bat to AOC, as that was all they had in stock. I have my suspicions that this was throwing in an excuse, since Mongoose (for it was they) didn't fare well in that test, but came top in The Cricketer's 2011 test with a CoR Premium (grade 2).

The Tests
AOC magazine tests 40-50 bats shortlisted from 100+ bats. It is staged as a blind test, with all branding hidden. It then publishes the test results of the top 10, although in 2013 it extended this to the top 20. In 2013 each bat was marked and given a score out of 120, based on Looks/First Impressions (30), Pick-up/Feel (30), and Performance (60). In previous years it marked out of 100, 400, and 500 in 2012, 2011, and 2010 respectively.
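Because the maximum mark has changed from year to year, raw AOC scores can only be compared as percentages. A minimal sketch of that conversion, using the maxima stated above (the example raw scores are made up for illustration):

```python
# Maximum possible AOC mark per year, as stated in the magazine tests.
max_marks = {2010: 500, 2011: 400, 2012: 100, 2013: 120}

def to_percentage(raw_score, year):
    """Express a raw mark as a percentage of that year's maximum."""
    return 100.0 * raw_score / max_marks[year]

# A hypothetical bat marked 96/120 in 2013 scores the same percentage
# as one marked 400/500 in 2010.
print(to_percentage(96, 2013))   # 80.0
print(to_percentage(400, 2010))  # 80.0
```

Only on this common percentage scale does a year-on-year comparison like the graph below make sense.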





The graph below presents the scores of the top 10 bats in the last four years. We see that year-on-year the percentage mark given is inconsistent, which must relate to variability in the chosen testers, the conditions, and the protocol of the test. The results also show that price bears no relation to performance, with a £395 bat outperformed by a £180 bat. In 2012 the top place went to the Kookaburra Kahuna Players at £500rrp, yet in 2013 the same bat could only muster 17th.


Top ten bat scores from AOC Magazine 2010-2013

The Cricketer magazine has a different approach. It tests 40 or so bats, but all branding is visible, unlike the blind test that AOC undertakes. Until 2013 it used a star rating, ranging from 1 to 5 in 0.5-star intervals (a nine-point scale). There was no information provided on the criteria and the relative weighting used to formulate a final assessment. It is also harder to establish a clear rank order with this method, as bats end up in groups of equal star ratings.


Top twenty bat scores from The Cricketer Magazine 2012
In 2012 they introduced an additional tester, a 'secret' current first-class cricketer writing under the pseudonym 'The Don'. Rather than combine The Don's opinion and rating with those of the other testers, they put it separately alongside the group result. While the outcomes are generally similar, there was one result sure to confuse a reader: a Kookaburra Recoil Players got a sub-par 2* from the regular test group, while The Don gave the same bat a near-perfect 4.5*.

There are some prime examples of why The Cricketer's 2010-2012 tests are a poor guide to which brand and bat range is good in any given year. In 2011, the Mongoose CoR3 Premium at £225rrp came out best, while in 2012 a Mongoose CoR3 Super Premium at £345rrp could only manage 20th place. In 2011 a M&H Amplus at £385rrp was placed 15th, while in 2012 a M&H Solution Type A at £150rrp was 6th.

However, in 2013 The Cricketer refined their presentation to include the criteria and a rating out of 10 (to 2 decimal places).  The criteria included a subjective assessment of build quality, pick-up/feel, performance, and value for money.


Top twenty bat scores from The Cricketer Magazine 2013



The Don was used again in 2013, but in a different way. He was asked to pick his top 10 off the shelves of the Lord's Shop, and the top 6 from the day's testing were then added. He paired up the bats and compared them in a 'match-play' format, where the better of the two went through to the next round. This meant a round of 16, quarter-finals, semi-finals, and a final. The pairings look a bit arbitrary though, with a M&H Amplus (top grade 1) put against a GM Octane 707 (second-level grade 2). So I wasn't convinced this 'Round of 16' by The Don was of any use or interest at all, as it was likely to create a biased result.
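The match-play format above is a single-elimination bracket: 16 bats are paired off and the winner of each head-to-head advances until one remains. A sketch of the mechanics, where `compare` is a stand-in for The Don's subjective head-to-head judgement (the bat names and the toy `compare` below are hypothetical, not from the magazine):

```python
def knockout(bats, compare):
    """Run single-elimination rounds over an even-sized field.

    compare(a, b) returns whichever of the two 'wins' the head-to-head.
    16 entrants give a round of 16, quarter-finals, semi-finals, final.
    """
    while len(bats) > 1:
        # Pair off neighbours; only the winner of each pair survives.
        bats = [compare(bats[i], bats[i + 1]) for i in range(0, len(bats), 2)]
    return bats[0]

# Toy example: 16 hypothetical bats, with a compare() that simply
# prefers the alphabetically-first name, reduced over four rounds.
field = [f"Bat{chr(65 + i)}" for i in range(16)]  # BatA .. BatP
print(knockout(field, min))  # BatA
```

The format's weakness, as noted, is that the result depends heavily on the initial pairings: with a subjective `compare`, a strong bat drawn against another strong bat goes out early, which is exactly the bias concern raised above.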

And so to Value
In the end I could only draw two points of practical value from these magazine tests.
Value #1 - You see what's on offer from a large range of brands - more than your average shop stocks, and often before the bats go on sale online. But many brands, particularly the smaller ones, are not included, and in The Cricketer's 2013 test the field narrows even further to bats stocked in the Lord's Shop.
Value #2 - You should never pay more than £200 to get a very good bat.

From discussion on the Custom Bat Forum about the 2013 Cricketer Good Gear Guide, it is clear that most are sceptical of any value in the tests for buying a new bat. Similar sentiments were expressed about the 2011 AOC Gear Test too.

Why do bat makers participate?
Featuring highly in a magazine test is no doubt a tasty marketing opportunity, especially for the winners. However, given the large amount of subjectivity in the test, especially when up to 25% of the score is given over to 'looks/first impressions', which is a matter of individual taste, there can be no sense in advance of whether a bat will do well or not. The probability of being described as a plank appears equal to that of being described as a gun. I suspect many bat makers won't enter as it's too risky to the business, and maybe some understand the flaws in the test. But there seem to be plenty who feel they have to put one in, even after poor results in a previous year's test.

So what next?
In presenting a quick critique of the magazine bat tests, it is only proper to offer some idea for an alternative.  However, I don’t think there is one that wouldn't become a huge technical undertaking that would border on being overblown, slightly pointless, and financially silly.  The variability of willow and grading bats on aesthetics rather than performance makes it very difficult to guarantee with any confidence a consistent differential performance through a subjective-only assessment.

Instead, just read the annual magazine bat tests with a practical health warning, don't take them too seriously, and enjoy the read.  Credit them with putting their heads above the parapet and making a big effort to offer an opinion to the public.

1 comment:

  1. Hi David,

To clear up: the 2011 'Goose was submitted at a time when the company was in the process of going under. It was a period when bats were hard to come by. The AOC test was at a later date (March, rather than January) than The Cricketer one. Bearing in mind MC Ltd officially entered administration in June of that year, you can see why it was a struggle to get anything special in, and a stock bat was entered.

Generally what 'Goose, and as I understand it almost all makers, would submit was either a special make-up or at the very least a bat handpicked from your stock. Perhaps even from the pro shelf.

    Tom
