Standard six of the Standards for Official Statistics in the Code of Practice for Statistics sets out the critical elements of good statistical practice for producing statistics that are fit for their intended uses and are not misleading.
The practices apply across the statistical business production cycle, considering the sources of data, the methods and processes, and assurance arrangements. Good practice relies on an open and inquisitive mindset, effective partnering and collaboration, and capable and confident professional judgement about quality.
The Standard
6. Producers must use suitable data sources and sound methods, and assure the quality of the statistics across the production and release processes – so that the public can have confidence that the statistics are produced in robust ways and fit for their intended purpose
6.1 Produce statistics to a suitable level of quality that means they meet their intended uses and are not misleading
6.2 Use the most suitable data for what needs to be measured. Monitor for changes in the data sources and potential bias in the data. Explain any issues and their implications for use of the data in producing statistics
6.3 Check the suitability and availability of existing data from governmental and non-governmental sources before collecting new data. Ensure that opportunities for data sharing, data linkage, cross-analysis of sources, and the reuse of data are taken wherever feasible
6.4 Maintain constructive relationships with those involved in the data provision, statistics preparation and quality assurance processes. Be clear about your data supply and quality requirements and understand how these will be met. Where possible provide feedback to data suppliers on your use of their data
6.5 Ensure the burden on those providing their information is proportionate to the anticipated benefits
6.6 Use data that are coherent when aggregated, consistent over time and comparable across and within geographies at regional, national and local levels, where possible. Seek to design new statistics in ways that achieve UK-comparability and improve consistency and coherence with related statistics
6.7 Base methods on national or international good practice, scientific principles or professional consensus. Identify potential bias and address limitations. Use recognised standards, classifications and definitions. Explain reasons for deviations from these standards and any related implications for use
6.8 Collaborate with experts, other analysts and statistics producers in the UK and internationally, where appropriate, and share best practice
6.9 Use a proportionate quality assurance approach across production and release processes. Validate statistics through comparison with other relevant statistics and data sources where possible
6.10 Verify that the statistics are representative and of suitable quality and monitor relevant quality dimensions for both input data and the statistics, such as completeness and validity, accuracy and reliability, coherence and comparability, and timeliness. Quantify statistical error, including bias, and produce measures of confidence, where possible
6.11 Regularly review strengths and limitations in the data and statistics, including the continued suitability of data sources and methods. Be open about your decisions and reasons for change
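Practices 6.9 and 6.10 call for monitoring quality dimensions such as completeness and validity of input data. As an illustrative sketch only — the records, field names and validity rules below are invented for the example and are not part of the Code — such routine checks can be automated so that they run on every delivery of input data:

```python
# Hypothetical input records, e.g. rows from an administrative data extract.
# All field names and values are assumptions made for this illustration.
ROWS = [
    {"region": "North East", "age": 34, "income": 21500},
    {"region": "North East", "age": None, "income": 19800},
    {"region": "Unknown", "age": 47, "income": None},
]

# Assumed classification of valid region codes for the validity check.
VALID_REGIONS = {"North East", "North West", "London"}


def completeness(rows, field):
    """Share of records with a non-missing value for `field`."""
    present = sum(1 for r in rows if r.get(field) is not None)
    return present / len(rows)


def validity(rows, field, allowed):
    """Share of non-missing values for `field` that fall within the allowed set."""
    values = [r[field] for r in rows if r.get(field) is not None]
    return sum(1 for v in values if v in allowed) / len(values)


# A simple quality report covering two of the dimensions named in 6.10.
report = {
    "age_completeness": completeness(ROWS, "age"),
    "income_completeness": completeness(ROWS, "income"),
    "region_validity": validity(ROWS, "region", VALID_REGIONS),
}
```

Tracking such a report over successive deliveries gives an early signal of the changes in data sources and potential bias that practice 6.2 asks producers to monitor.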
Questions to consider
1. Suitable data sources
What data sources are used to produce the statistics? How well does each data source match the concept being measured, in terms of definitions and coverage, for example? What data quality dimensions have been measured and what are the findings for each? What steps have you taken to manage the burden on respondents and data suppliers?
2. Sound methods and processes
How well do the statistics meet the needs of users? Are the methods and processes used appropriate for the intended uses of the statistics? What quality criteria were considered in choosing methods and processes? How do you triangulate the statistics against other sources? What feedback have users given about the choices of methods and processes?
3. Uncertainty
How accurate are the statistics? How well do they reflect the real world? Are there any biases in the data, methods or the statistics? What is the level of sampling variability? Are data records missing important information? Are you able to minimise the impact of missing data?
4. Quality issues
How are you monitoring the quality of the statistics over time? Are there any recent changes to data or methods that affect the statistics? Are the data internally coherent? Are harmonised questions and standard classifications used where possible? Are the statistics comparable with other related statistics, such as within local areas and across the UK, and do they show credible patterns?
5. Timeliness
What is the timeliness and frequency of the statistics? How were these determined? Which user needs does this timeliness and frequency meet and are there any user needs that are not met? Does the lag between the reference period and publication impact the value of the statistics? Could a lag, or a focus on only the most recent data, mislead users?
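The uncertainty questions above, on sampling variability and measures of confidence, can often be made concrete with a simple interval estimate. A minimal sketch, assuming the survey observations are available as a plain list of numeric values; the data are invented for illustration:

```python
import math
import statistics

# Invented sample values; in practice these would come from the survey microdata.
sample = [12.1, 9.8, 11.4, 10.9, 13.0, 10.2, 11.7, 9.5, 12.4, 10.8]

mean = statistics.mean(sample)
# Standard error of the mean: sample standard deviation over sqrt(n).
se = statistics.stdev(sample) / math.sqrt(len(sample))

# Approximate 95% confidence interval using a normal critical value (1.96);
# with a sample this small, a t-distribution critical value would be more appropriate.
low, high = mean - 1.96 * se, mean + 1.96 * se

print(f"mean = {mean:.2f}, 95% CI = ({low:.2f}, {high:.2f})")
```

Publishing an interval alongside the point estimate is one way of meeting practice 6.10's expectation to quantify statistical error and produce measures of confidence where possible.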
Related guidance
Office for Statistics Regulation:
- Spotlight on Quality Framework: Assuring Confidence in Economic Statistics
- Thinking about quality when producing statistics
- Administrative data quality assurance (QAAD) toolkit
- Quality assurance of management information
- Comparability framework tool
- Collecting and reporting data about sex and gender identity in official statistics
- Guidance for producers when making changes to statistical methods
- Guidance for models: Trustworthiness, Quality and Value
Government Statistical Service (GSS):
- Quality Statistics in Government
- Quality Assurance
- GSS Quality Guidance
- Tips for urgent quality assurance of ad-hoc statistical analysis
- Tips for urgent quality assurance of data
- Meet the data quality dimensions
- Harmonised standards and guidance
- Introducing the Administrative Data Quality Question Bank
- Inclusivity and accessibility in survey development
- Standards for ethnicity data
- Monitoring and reducing respondent burden
- ESS Quality dimensions
- Quality questions and red flags
- Quality Questions
- Quality assurance of code for analysis and research
- The AQUA book: guidance on producing quality analysis for government
- Quality of administrative data in statistics
- Reproducible Analytical Pipelines (RAP)
- Coherence of statistics
Guidance for surveys:
- Survey development kit
- User-centred design approach to surveys
- Top tips for maintaining quality when designing surveys at pace
- Remote testing survey questions
Guidance is also available from the UK Statistics Authority and the Government Digital Service.
Market Research Society:
- MRS Guidelines for Conducting Data Collection Activities with Children
- MRS Best Practice Guide: Research Participant Vulnerability
- MRS Guidance on using AI and Related Technologies
Guidance is also available from the National Audit Office, Survey Futures and UNECE.
Good practice examples: Harmonisation and coherence
- Harmonisation Team, Office for National Statistics: Update on the review of the ethnicity harmonised standard: additional work to explore potential new response options
- Office for National Statistics and partner organisations: Analytical Learning Points for Ethnicity Data in health administrative data sources
Blogs:
- Cabinet Office: Improving ethnicity data quality in the public sector
- Office for National Statistics: The journey to improving income-based poverty statistics
- English Health Statistics Steering Group: Driving the change to make health statistics more coherent
- Office for National Statistics: Building a better understanding of UK health data
Good practice examples: Data sources
- Department for Work and Pensions – applying the Quality Assurance of Administrative Data (QAAD) framework to survey data: Family Resources Survey: quality assessment report
Case studies:
- Ministry of Justice: Ensuring source data is appropriate for intended uses
- Office for National Statistics: Publishing information about data quality assurance processes
Good practice examples: Quality assurance
- Department for Work and Pensions: Improving quality assurance and its communication to aid user interpretation
Good practice examples: Automation
Blogs:
- Analysis Function: RAP: it’s all about the code
- Analysis Function: How to RAP: bringing good practice to life
Good practice examples: Algorithms
- Centre for Data Ethics & Innovation: Blog: The Algorithmic Transparency Recording Standard
Good practice examples: Quality management for analysis
- Government Data Quality Hub: Blog: Getting the ingredients right for quality analysis
