‘Don’t trust the data. If you’ve found something interesting, something has probably gone wrong!’ Maybe you’ve been there too? It was a key lesson I learnt as a junior researcher. It partly reflected my skills as an analyst at the time – the mistakes could well have been mine! But not entirely.
You see, I was working with cancer registration and deaths data, which could on occasion show odd patterns due to changes in disease classifications, developments in diagnosis, or reporting practices. Take a close look and you could spot the step changes when a classification change occurred. Harder to spot might be the impact of a new treatment or screening programme. But sometimes there were errors too – including the very human error of using the wrong population base for rates.
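As a simple illustration of the kind of routine check that can catch such patterns (a sketch only – the figures and the 15% threshold are invented for the example), a few lines of code can flag suspicious year-on-year jumps before anyone mistakes a coding change for a real trend:

```python
import pandas as pd

def flag_step_changes(series: pd.Series, threshold: float = 0.15) -> pd.Series:
    """Flag periods where the series jumps by more than `threshold`
    relative to the previous period - often the signature of a
    classification or reporting change rather than a real effect."""
    pct_change = series.pct_change().abs()
    return pct_change[pct_change > threshold]

# Invented annual registration counts, with an artificial step in 2011
# of the sort a classification change might produce.
registrations = pd.Series(
    [10100, 10250, 10180, 12400, 12550],
    index=[2008, 2009, 2010, 2011, 2012],
)
print(flag_step_changes(registrations))  # flags 2011
```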
I was reminded of this experience when Sir Ian Diamond, the National Statistician, spoke to the Health and Social Care Select Committee in May. He said (Q34):
“One of the things about good statisticians is that they are always just a little sceptical of the data. I was privileged to teach many great people in my life as an academic and I always said, ‘Do not trust the data. Look for errors.’”
Sage advice from an advisor to SAGE!
The thing with quality is that the analyst’s job is never done. It is a moving target. In our Quality Assurance of Administrative Data guidance, we emphasise the importance of understanding where the data come from, and how and why they were collected. But this information isn’t static – systems and policies may change, and data sources will change with them.
Being alert to this variation is an ongoing, everyday task. It includes building relationships with others in the data journey, to share insight and understanding about the data and to keep a current view of the data source. As Sir Ian went on to point out in his evidence, it should also involve triangulating against other sources of data.
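Triangulation can be as simple as routinely comparing your headline series against an independent source and investigating wherever they diverge. A minimal sketch of that idea (the figures, names and 5% tolerance are invented for illustration):

```python
import pandas as pd

def triangulate(source_a: pd.Series, source_b: pd.Series,
                tolerance: float = 0.05) -> pd.DataFrame:
    """Compare two independent measures of the same quantity and
    report the periods where they diverge by more than `tolerance`."""
    combined = pd.DataFrame({"a": source_a, "b": source_b}).dropna()
    combined["rel_diff"] = (combined["a"] - combined["b"]).abs() / combined["b"]
    return combined[combined["rel_diff"] > tolerance]

# Invented example: an administrative count vs. a survey-based estimate.
admin_count = pd.Series([500, 520, 700], index=[2019, 2020, 2021])
survey_est = pd.Series([505, 515, 540], index=[2019, 2020, 2021])
print(triangulate(admin_count, survey_est))  # 2021 diverges - investigate
```

Divergence doesn’t tell you which source is wrong, only where to start asking questions.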
OSR recently completed a review of quality assurance in HMRC, at the department’s invitation. It was a fascinating insight into the operation of the organisation and the challenges it faces. In meetings with analytical teams, we used a range of questions to help inform our understanding. The teams told us that they found the questions helpful and asked if we would share them to support their own quality assurance. So, we produced an annex in the report with those questions.
And we have now reproduced the questions in a guide, as prompts to help all statistics producers think about their data and about quality under these headings:
- Understanding the production process
- Tools used during the production process
- Receiving and understanding input data
- Quality assurance
- Version control and documentation
- Issues with the statistics
The guide also signposts a wealth of excellent guidance on quality on the Government Statistical Service (GSS) website. The GSS Best Practice and Impact Division (BPI) supports everyone in the GSS in meeting the quality requirements of the Code and improving government statistics. BPI provides a range of helpful guidance and training.
- Quality Statistics in Government guidance is primarily intended for producers of statistics who need to ensure that their products meet expectations for statistical quality. It is an introduction to quality, bringing together the principles of statistical quality with practical advice in one place. You will find helpful information about quality assurance of methods and data, and about designing processes that are efficient, transparent and reduce the risk of mistakes. It also discusses Reproducible Analytical Pipelines (RAP) and the benefits of making analysis reproducible. The guidance complements the Quality Statistics in Government training offered by the GSS Quality Centre.
- Communicating quality, uncertainty and change guidance is intended for producers of official statistics who need to communicate information about quality, uncertainty and change effectively. It can be applied to all sources of statistics, including surveys, censuses, and administrative and commercial data, as well as estimates derived from a combination of these. There is also an accompanying Communicating quality, uncertainty and change training course.
- The GSS Quality Centre has developed guidance with top tips to improve the QA of ad-hoc analysis across the GSS. The team also runs the Quality Assurance of Administrative Data (QAAD) workshop, which gives users an overview of the QAAD toolkit and how to apply it to administrative sources.
- There is also a GSS Quality Strategy, which aims to improve statistical quality across the GSS to produce statistics that serve the public good.
Check out our quality question guide and let us know how you get on by emailing me at penny.babb@statistics.gov.uk – we would welcome hearing about your experiences. We are always on the lookout for good examples of practice that we can feature on the Online Code.