Over the years, we’ve seen several research studies produce results that are unexpected to say the least, and in many cases outright incredible. As you’ll be well aware, the vast majority of research studies are undertaken with the express intention of validating a hypothesis that most of the parties concerned already assume to be correct. Sometimes, though, a study throws up a set of results that disproves the hypothesis completely, or suggests an answer that no one would ever have expected.


Unfortunately, the principle of Occam’s Razor applies as much to the world of data capture and research as it does anywhere else: when a result looks extraordinary, the simplest explanation, usually a flaw somewhere in the data, tends to be the right one. A useful rule of thumb for research studies is that the more surprising a result seems to be, the less likely it is to be correct.


Just this year, we’ve seen a high-profile study into the genomes of centenarians suggest a genetic propensity for long life, and the OPERA experiment, which timed neutrinos travelling from CERN in Switzerland to the Gran Sasso laboratory in Italy, suggest that neutrinos could travel faster than the speed of light. Both studies provoked widespread shock, disbelief and, of course, publicity, but closer examination revealed discrepancies in the data that undermined the validity of the findings; inevitably, some would say.


The problem is that studies producing unexpected and incredible results are likely to attract widespread public and media attention precisely because they are so shocking. Of course, once the findings have been made public, any evidence that the results may be invalid undermines the study on an equally large scale, causing embarrassment for the parties involved and provoking the ire of the scientific community as a whole.


Studies such as these, which claim incredible findings that cannot be sustained under prolonged scrutiny, only highlight the importance of accurate data capture, analysis and processing procedures. The OPERA experiment generated its inaccurate results thanks to ‘a loose fibre optic cable’ in the timing system used to measure the much-vaunted neutrinos, so faulty equipment was to blame for the aberrant data points. In the genome study, however, ‘technical errors’ and ‘inadequate quality control’ were blamed for the peculiar set of results. In both cases, more stringent control over the data capture and analysis protocols would have saved the institutions involved widespread embarrassment.


In studies of this magnitude, it is vital that data capture techniques are as accurate and reliable as possible, and these failures can serve as a lesson to the rest of us. Your studies needn’t fall victim to inaccurate data points; contact us for more information on how our products and services can help you achieve the data capture results you need, accurately and efficiently.