The recent error in the UK government’s Test and Trace system was a major example of what can happen when a reporting process has no quality assurance and a simple design flaw goes unnoticed. This case was about as high-profile as they come and the ramifications were potentially disastrous. It’s not the first time a simple data error has had significant consequences, and it certainly won’t be the last. It got us wondering about other prominent cases. What were the ramifications, and how can we learn from them to ensure, as far as possible, that data is reliable?
In 2010, a formatting error in a spreadsheet at MI5 caused bugs to be placed on the phone lines of over 1,000 entirely innocent members of the public. On closer inspection, all the numbers ended in ‘000’. Instead of the actual targets, the requests had been made against dummy numbers accidentally left in the final version of the spreadsheet.
The formatting error was fixed and the resulting material destroyed but we’ll never know what MI5 might have missed while they were listening in on hours of innocuous conversation. Since then, all figures have had to be checked manually before any data is requested from telecommunications companies.
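A simple automated check could have caught numbers like these before any request went out. The sketch below is purely illustrative and assumes the one detail given above, that the dummy numbers all ended in ‘000’; the function name and sample data are hypothetical.

```python
# Illustrative sketch: flag suspicious placeholder phone numbers before
# they are acted on. The "ends in 000" rule comes from the MI5 case
# described above; the data and function name here are made up.

def flag_dummy_numbers(numbers):
    """Return the subset of numbers that look like placeholder data."""
    return [n for n in numbers if n.endswith("000")]

requests = ["07700900123", "07700900000", "02079460000", "07911123456"]
suspicious = flag_dummy_numbers(requests)
print(suspicious)  # any numbers ending in '000' are held for manual review
```

A check like this doesn’t prove the remaining numbers are correct, of course, but it turns one known failure mode into a routine, automatic test rather than a manual inspection after the fact.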
In 2010, Harvard professor Carmen Reinhart and Kenneth Rogoff, former chief economist at the IMF, published ‘Growth in a Time of Debt’, a paper that was highly influential among politicians and economists at the time. Their findings were often used to support the austerity measures that were a common feature after the financial crash of 2007/2008. One of the most frequently cited stats from the paper was that when public borrowing reaches 90% of a country’s GDP, its economy will shrink.
It was later shown that an error in the spreadsheet formula had led to this result. When calculating the averages, the pair’s formula had omitted data from five of the countries in the study. Once it was corrected, the effect vanished: the negative growth actually turned into an increase of just over 2%. The mistake was first picked up by a graduate student working on a class assignment. It just goes to show, even if you are chief economist at the IMF, always check your formulas!
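It’s easy to see how a formula range that silently stops short can flip a conclusion. The figures below are invented for illustration; they are not the actual data from the paper.

```python
# Illustrative sketch of how an averaging formula that silently omits
# rows skews the result. The growth figures below are made up; they are
# not the actual Reinhart-Rogoff data.

growth_rates = {
    "Country A": 2.6,
    "Country B": 3.1,
    "Country C": 2.9,
    "Country D": -0.5,   # one weak data point
    "Country E": 2.4,
}

values = list(growth_rates.values())

# Correct: average over every country in the study.
full_average = sum(values) / len(values)

# Buggy: the range stops short and drops the last two rows -- the same
# class of mistake as writing =AVERAGE(B2:B4) when the data runs to B6.
partial_average = sum(values[:3]) / 3

print(round(full_average, 2))     # 2.1
print(round(partial_average, 2))  # 2.87
```

Both numbers look plausible in isolation, which is exactly why this kind of error survives review: nothing crashes, and the output is simply wrong.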
The London Olympics in 2012 were a great success and proved hugely popular with the public, who attended in record numbers. Tickets for certain events were hard to come by. That was, until a typo led to 10,000 additional seats for the qualifying rounds of the synchronised swimming being added to the automated ticketing software. The extra tickets were immediately snapped up by fans eager to get in on the action.
It wasn’t until it came to allocating the seats that the mistake was picked up. The organising committee, LOCOG, had to hastily contact the 10,000 additional ticket-holders, offering replacement tickets for other events. This disappointed some and possibly delighted others, who were instead treated to tickets for more high-profile events, at the organiser’s expense.
These examples all show the importance of checking for errors, whether in data entry, calculations or the data-collection process itself. Error detection and quality assurance should always be considered during the design phase of your process. No matter your level of experience or the simplicity of the calculations, you never know when a tiny oversight could lead to much bigger problems. The consequences can be significant and could mean the difference between success and failure.