In this world, nothing is certain except death and taxes – or so the old adage goes. So there’s plenty of uncertainty around us, and as statisticians, we’ve become pretty comfortable living with it. Many of the highest profile statistics that the GSS produces come from sample surveys, and over the years statisticians have developed many clever ways of measuring the inherent uncertainty in these. But we’re much more cautious when it comes to communicating this uncertainty to the people who use our statistics. All too often, it’s only once you delve into the background notes or the quality reports that you begin to get a sense of the uncertainty in the figures. It’s there that you might find the words of caution about interpreting the statistics, or the confidence intervals for the headline measures. And how many of the people who read our publications really make it that far?
From speaking to some fellow statisticians, it seems that part of the reluctance to be up-front about uncertainty comes from fear – fear that people will assume that it means we know almost nothing, that we’ve done little more than pluck a number out of the air. But, as this blog from the London School of Economics points out, “if you can quantify the uncertainty … you have gone a long way towards understanding what is going on.” The LSE blog also argues that communicating uncertainty is essential if we want to improve the way statistics are used and the decisions that are based on them.
The labour market challenge
One of the biggest challenges with communicating uncertainty is how to drag it out of the background notes and build it into the way we communicate statistics from the outset. How do we get it across that the figures aren’t precise counts, without undermining the value of the statistics that we have produced? And how do we effectively communicate when a change is something worth noticing, rather than a statistical blip? This is exactly the issue that ONS’s labour market team has been getting to grips with lately. When the Bank of England said it wouldn’t consider raising interest rates until the unemployment rate reached 7 per cent, suddenly there was even more attention than usual on the monthly publication and how much the figures had changed. ONS realised that they needed to help people better understand the uncertainty in labour market statistics and how to interpret any changes correctly. By looking at practice from around the world and drawing on the support of GSS colleagues, the labour market team began taking steps in the right direction to make the inherent uncertainty in the statistics a little clearer. If you have a look at their latest publication, you’ll notice the following:
- The key points help explain how the statistics fit in with longer term trends by starting with a description of what has happened over the last two years. The latest figure is compared with the previous year and the previous quarter. This helps demonstrate if a short term change is part of a wider pattern.
- There’s information on uncertainty and how to interpret the statistics quite early on in the publication (page 6). This explains that short term figures should be treated as indicative, with medium and long term figures (along with other data sources) helping to give a fuller picture.
- There’s also a more detailed section on uncertainty later on in the publication, which explains where to find the very detailed information on uncertainty.
This approach has started to pay off. When these changes were made for the first time in the March publication, Anthony Reuben, the BBC’s Head of Statistics, expressed his delight at seeing confidence intervals rescued from the background notes. The BBC even talked about uncertainty in its news story.
A common approach across the GSS?
The GSS is currently putting together some principles on how we communicate uncertainty. There’s never going to be a “one size fits all” solution to this, so the principles are likely to cover a range of approaches which can be tailored to suit the needs of the topic. At the heart of the principles is the idea of “progressive disclosure” – that you start by broadly introducing the idea of uncertainty and gradually disclose more layers of information, which will become increasingly detailed and technical. This means that everyone gets a sense of the uncertainty early on in a publication and the needs of more technical users are also fulfilled.
Some of the ways you can progressively highlight uncertainty in statistics are:
- Use the word “estimate” when describing the numbers (e.g. “latest estimates show…”)
- Include a plain English explanation about uncertainty early in the publication, like the labour market team did
- Make comparisons across different time periods (e.g. month-on-month, quarter-on-quarter, year-on-year) so that it’s clear if a recent change fits with a longer term pattern
- Use colour coding to highlight relative levels of uncertainty (like in this example from the Welsh Government)
- Give confidence intervals or other technical measures of uncertainty in reference tables
- You might also consider talking about statistical significance when looking at changes in the figures. ONS does this with crime survey statistics – there’s some great commentary around significance on page 21
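To make the last two ideas concrete, here’s a small sketch in Python of the kind of calculation that sits behind those reference tables. The figures are entirely hypothetical (not real ONS estimates), and it assumes the simple case of independent estimates with known standard errors, using a normal approximation:

```python
import math

def confidence_interval(estimate, standard_error, z=1.96):
    """Approximate 95% confidence interval for a survey estimate."""
    margin = z * standard_error
    return (estimate - margin, estimate + margin)

def change_is_significant(est_a, se_a, est_b, se_b, z=1.96):
    """True if the change between two independent estimates is larger
    than sampling variability alone could plausibly explain."""
    se_diff = math.sqrt(se_a**2 + se_b**2)  # SE of the difference
    return abs(est_b - est_a) > z * se_diff

# Hypothetical figures: an unemployment rate of 7.2% with a
# standard error of 0.15 percentage points.
low, high = confidence_interval(7.2, 0.15)
print(f"Rate: 7.2%, 95% CI: {low:.2f}% to {high:.2f}%")

# An apparent fall from 7.4% to 7.2% quarter-on-quarter:
print(change_is_significant(7.4, 0.15, 7.2, 0.15))
```

With these made-up numbers the quarterly fall of 0.2 percentage points is well within the range that sampling variability could produce, so it would be reported as not statistically significant – exactly the sort of “statistical blip” the publication changes are designed to flag.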
Ideas from elsewhere
This problem isn’t unique to government statisticians. Communicating uncertainty is a bit of a hot topic across many fields. For example, scientists have been scratching their heads to find the best way to communicate the uncertainty in climate change predictions. Economists want to know the best way to communicate forecasts. Professor David Spiegelhalter is a leading expert in this area. His website, Understanding Uncertainty, contains some great advice and a treasure trove of examples on how to communicate this tricky issue.
What do you think?
There are bound to be other great examples of communicating uncertainty out there, so please let me know if you’ve seen any. I’d also really like to know what you think should be in the GSS principles on this topic. And what about uncertainty in statistics that aren’t based on surveys – where do we start with that?
I look forward to your comments.
PS: If you’re still thirsty for more reading on this topic, the Copernicus Institute’s paper Uncertainty Communication: Issues and Good Practice is worth a read to find out more on the idea of progressive disclosure.