My early educational background was in science, and the discipline instilled in me then has stuck with me, even if not much else has! To me, science is an iterative process: record measurements, evaluate them objectively, design some new tests (experiments), and repeat - all in order to substantiate (note I didn't say prove) a theory.
In business, quality control and process improvement techniques follow similar lines - Six Sigma is a good example. However, in my experience, in business today the scientific approach is the 'path least trod'. Data is frequently incomplete or inaccurate, or simply measures the wrong thing. Managers jump to conclusions, perhaps fearing that patience and diligence will be mistaken for inactivity or indecisiveness.
The result is often impulsive action that fails to achieve the desired improvements. This is despite quite rigid and clear rules of statistical analysis that specify the minimum number of data points needed to make accurate and valid predictions. I vividly remember sharing the early results of some analysis with a Manager, carefully explaining why we needed to collect a little more data before drawing a conclusion. Despite their apparent agreement and wise nods, I had barely left the room before the Manager was sending an email stating 'our conclusions'.
In public life, the problem is made much worse by the plethora of media channels, and by the media's appetite for 'sound-bites' and neat packages taking up just 2 or 3 minutes. An example from the last few weeks here in the UK was an interview on the BBC. Oxfordshire County Council has withdrawn funding for speed cameras in the county, and the interviewee tried to give the impression that there would be an '80% increase in speeding' as a result. Where was the science behind this? They had examined the results from 2 cameras (out of 72) over just 5 days since the removal of funding. I was pleased that in a later follow-up interview a County Councillor was able to make the point that this 'analysis', and therefore the conclusion drawn from it, was flawed.
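To make the sampling problem concrete, here is a minimal Python sketch. Every number in it (the true speeding rate per camera, and so on) is invented for illustration - none of it comes from the Oxfordshire data. It simulates repeating that same 2-camera, 5-day study many times over in a world where the underlying speeding rate has not changed at all, and shows how much the apparent 'percentage change' swings on chance alone:

```python
import numpy as np

# Toy model: speeding events at each camera site follow a Poisson process
# with a true rate that does NOT change. We then ask what a study based on
# only 2 sites and 5 days would report anyway. All figures are invented.

rng = np.random.default_rng(1)

TRUE_RATE = 6      # assumed: true speeding events per camera per day
SITES = 2          # cameras actually examined in the interview
DAYS = 5           # days of data actually examined
TRIALS = 100_000   # simulated repeats of the same small study

# Total events across 2 sites x 5 days, 'before' and 'after' the change
# in funding - drawn from identical distributions, so any difference is noise.
before = rng.poisson(TRUE_RATE, size=(TRIALS, SITES * DAYS)).sum(axis=1)
after = rng.poisson(TRUE_RATE, size=(TRIALS, SITES * DAYS)).sum(axis=1)

pct_change = 100 * (after - before) / before

lo, hi = np.percentile(pct_change, [2.5, 97.5])
print(f"95% of simulated studies report a change between {lo:+.0f}% and {hi:+.0f}%")
print(f"Share reporting a rise of 30% or more: {(pct_change >= 30).mean():.1%}")
```

With these invented numbers, the reported change routinely swings by tens of percentage points in either direction despite there being no real change at all - which is exactly why 2 cameras over 5 days is a shaky basis for a county-wide claim.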
This post is not about speed cameras, but by coincidence, another example in the same week happens to be on the same subject. Here a Police Officer publicly attributes a reduction in road deaths to the use of speed cameras. So consider this quote from the UK Office of National Statistics (click here for the full article):
'The total number of deaths in road accidents fell by 7 per cent to 2,946 in 2007 from 3,172 in 2006. However, the number of fatalities has remained fairly constant over the last ten years.'
Firstly, I would suggest that a 7% change from one year to the next is, on its own, statistically insignificant (note the ONS doesn't draw any conclusions on this point). Now, just have a think about that second line. This is in a decade when there have been big improvements in vehicle safety - both passive and active*. The role played by speed cameras is, at the very least, unclear - we need some more science, some proper analysis, to know and understand what's really going on.
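To illustrate the point about year-to-year variation, here is another small sketch in the same vein. The base figure of roughly 3,100 deaths reflects the ONS numbers above, but the 3% year-to-year 'wobble' in the underlying rate is purely my assumption, standing in for weather, traffic volumes, fuel prices and the like. The question it asks: in a decade where nothing is really improving, how often does at least one single-year change of 7% or more show up anyway?

```python
import numpy as np

# Toy model: annual road deaths are Poisson counts around a base rate of
# about 3,100 (roughly the UK figure over that decade), and the underlying
# rate itself wanders a few percent from year to year with NO real trend.
# The size of that wander (3%) is my assumption, not an ONS figure.

rng = np.random.default_rng(7)

BASE = 3100        # approximate annual UK road deaths over the decade
DRIFT_SD = 0.03    # assumed 3% year-to-year wobble in the true rate
YEARS = 10
DECADES = 100_000  # simulated trend-free decades

rates = BASE * (1 + rng.normal(0, DRIFT_SD, size=(DECADES, YEARS)))
deaths = rng.poisson(rates)

# Absolute year-on-year change as a fraction of the previous year's figure.
yoy = np.abs(np.diff(deaths, axis=1)) / deaths[:, :-1]
share = (yoy >= 0.07).any(axis=1).mean()

print(f"Trend-free decades containing a >=7% single-year change: {share:.0%}")
```

Under those assumptions the answer is 'most of the time' - which is why a decade of roughly flat numbers tells us more than any one year's change.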
Sadly, with a few exceptions, there seems to be little will (or time in the schedule) for today's journalists to challenge the sometimes outrageous claims made by interviewees - even when, as I hope I have demonstrated above, the raw data is in the public domain. I don't believe there is any excuse for this, and I fear that things will only get worse unless future generations get the grounding in basic scientific techniques that I had, and are thus equipped to challenge, disregard or validate what the media feeds them.
*Passive safety is designed to reduce injury in the event of a collision (e.g. air bags, seat belts); active safety helps you avoid having the collision in the first place (e.g. ABS, ESP).