How can official statistics better serve the public good?

How good are government statistics? In a recent seminar we asked members of the Government Statistical Service for three words they would use to describe government statistics. Among the top words we got back were ‘trustworthy’, ‘quality’ and ‘informative’. It was striking how closely these aligned with the three pillars of our Code of Practice for Statistics – Trustworthiness, Quality and Value – and encouraging to us, as the regulator of official statistics, to hear our message echoed by others.

Official statistics play a central role in answering society’s most important questions. The most salient questions currently facing society concern the COVID-19 pandemic, its impacts and societal responses to it. Data and analysis have been crucial in informing government and individuals’ decisions and supporting public understanding.

But the uses of official statistics extend far beyond the pandemic into people’s everyday lives: whether you are making decisions as a head teacher, or choosing your child’s school, developing policy on social housing, or trying to decide whether and where you should buy a house, have an interest in your local library remaining open, or are considering the country’s major economic decisions, you may well be using official statistics. This is why it’s so important that the UK’s statistical system responds to society’s information needs with insightful statistics.

In a world of increasingly abundant data, expectations are higher. Individuals have become accustomed to information on many aspects of society in near real time, with increasingly detailed breakdowns. Official statistics need to respond to these demands for information. Our work as a regulator of official statistics puts us in a unique position to reflect on the UK government statistical system and, in July, we set out our view on the current state of government statistics.

At their best, statistics and data produced by government are insightful, coherent, and timely. They are of high policy relevance and public interest. During the COVID-19 pandemic, we’re seeing the kind of statistical system that we’ve always wanted to encourage – responsive, agile and focused on users. However, the statistical system does not consistently perform at this level across all its work. In our report we address eight key areas where improvements could be made across the system.

  1. Statistical leadership 
  2. Voluntary application of the Code beyond official statistics 
  3. Quality assurance of administrative data 
  4. Communicating uncertainty 
  5. Adopting new tools, methods and data sources 
  6. Telling fuller stories with data 
  7. Providing authoritative insight 
  8. User engagement 

In each area, as well as talking about what we would like to see, we highlight examples of statistical producers already doing things well, which others can learn from and build on.  

Our 5-year Strategic Business Plan sets out our vision and priorities for 2020-2025, and how we, as OSR, will contribute to fostering the Authority’s ambitions for the UK statistics system. In all our work, we will continue to champion the work producers do, celebrate the things they do well, and encourage them to continue to improve the statistics they produce so that, together, we can ensure that official statistics better serve the public good.  

Please get in touch if you’d like to discuss the report further.

An analyst’s job is never done

‘Don’t trust the data. If you’ve found something interesting, something has probably gone wrong!’ Maybe you’ve been there too? It was a key lesson I learnt as a junior researcher. It partly reflected my skills as an analyst at the time – the mistakes could well have been mine! But, not entirely.

You see I was working with cancer registration and deaths data which on occasion could show odd patterns due to changes in disease classifications, diagnosis developments or reporting practices. Take a close look and you could spot the step changes when a classification change occurred. Harder to spot might be the impact of a new treatment or screening programme. But sometimes there were errors too – including the very human error of using the wrong population base for rates.

I was reminded of this experience when Sir Ian Diamond, the National Statistician, spoke to the Health and Social Care Select Committee in May. He said (Q34):

“One of the things about good statisticians is that they are always just a little sceptical of the data. I was privileged to teach many great people in my life as an academic and I always said, “Do not trust the data. Look for errors.””

Sage advice from an advisor to SAGE!

The thing with quality is that the analyst’s job is never done. It is a moving target. In our Quality Assurance of Administrative Data guidance, we emphasise the importance of understanding where the data come from, and how and why they were collected. But this information isn’t static – systems and policies may alter, and data sources will change as a result.

Being alert to this variation is an ongoing, everyday task. It includes building relationships with others in the data journey, to share insight and understanding about the data and to keep a current view about the data source. As Sir Ian went on to point out in his evidence, it should involve triangulating against other sources of data.

OSR recently completed a review of quality assurance in HMRC, at the agency’s invitation. It was a fascinating insight into the operation of the organisation and the challenges it faces. We used a range of questions to help inform our understanding through meetings with analytical teams. They told us that they found the questions helpful and asked if we would share them to help with their own quality assurance. So, we produced an annex in the report with those questions.

And we have now reproduced the questions in a guide, as prompts to help all statistics producers think about their data and about quality under these headings:

  • Understanding the production process
  • Tools used during the production process
  • Receiving and understanding input data
  • Quality assurance
  • Version control and documentation
  • Issues with the statistics

The guide also signposts to a wealth of excellent guidance on quality on the GSS website. The GSS Best Practice and Impact Division (BPI) supports everyone in the Government Statistical Service in meeting the quality requirements of the Code and improving government statistics. BPI provides a range of helpful guidance and training.

  • Quality Statistics in Government guidance is primarily intended for producers of statistics who need to ensure that their products meet expectations for statistical quality. It is an introduction to quality and brings together the principles of statistical quality with practical advice in one place. You will find helpful information about quality assurance of methods and data, and about how to design processes that are efficient, transparent and reduce the risk of mistakes. Reproducible Analytical Pipelines (RAP) and the benefits of making analysis reproducible are also discussed. The guidance complements the Quality Statistics in Government training offered by the GSS Quality Centre.
  • Communicating quality, uncertainty and change guidance is intended for producers of official statistics who need to communicate information about quality, uncertainty and change effectively. It can be applied to all sources of statistics, including surveys, censuses, and administrative and commercial data, as well as estimates derived from a combination of these. There is also a Communicating quality, uncertainty and change training course.
  • The GSS Quality Centre has developed guidance which includes top tips to improve the QA of ad hoc analysis across the GSS. The team also runs the Quality Assurance of Administrative Data (QAAD) workshop, which gives users an overview of the QAAD toolkit and how to apply it to administrative sources.
  • There is also a GSS Quality strategy in place which aims to improve statistical quality across the Government Statistical Service (GSS) to produce statistics that serve the public good.

Check out our quality question guide and let us know how you get on by emailing me at – we would welcome hearing about your experiences. We are always on the look-out for some good examples of practice that we can feature on the Online Code.

COVID-19 Testing Data

UKSA Chair Sir David Norgrove has written to Matt Hancock, Secretary of State for Health and Social Care, to reiterate concerns with the official data on testing and to highlight the importance of good data as the Test and Trace programme is taken forward.

Statistics published by government should shed light on key issues. They should enable the public to make informed decisions and hold the government to account. The public interest in data around COVID-19 is unquestionable; we have seen this through our media and social media monitoring, as well as in the emails we have been receiving.

The government has made a commitment to improve the information available on COVID-19. Additional data on COVID-19 testing and testing capacity are now being published, and the government has also committed to providing greater clarity on data collection methods and associated limitations, which we look forward to seeing.

However, as Sir David Norgrove said in his letter, the data still fall short of the expectations set out in the Code of Practice for Statistics.

In Sir David’s letter he sets out his view that the testing data should serve two purposes:

  1. To support understanding of the prevalence of COVID-19, including understanding more about where infections occur and who is infected.
  2. To manage the testing programme – and going forward the approach to test and trace. The data should allow government and the public to understand how effectively the programme is being managed.

The data currently published are not sufficiently clear or comprehensive to support these aims.

The Office for Statistics Regulation champions statistics that serve the public good and we will continue to work with officials in the Department of Health and Social Care as it works hard to develop these important data.

Statistical leadership: making analytical insight count

Our vision is statistics that serve the public good. To realise this vision, the people who produce statistics must be capable, strategic and professional. They must, in short, show leadership. Effective statistical leadership is not just down to the most senior statistician in each organisation – as important as they are – but also requires individuals at all levels and across professions to stand up for statistics and champion their value.

In support of this, we initiated a review of statistical leadership in government, underpinned by the expectations set out in the Code of Practice for Statistics. Through our review we hope to support an environment in which:

  1. statistics, data and analysis are used effectively to inform government decisions and support society’s information needs.
  2. statisticians – and other analytical professions in government – feel empowered to provide leadership and feel positive about their career development and prospects.

We are sharing some of the early findings from our review to highlight the work and prompt further discussion of this important topic. If you have any comments or would like to speak to one of the team please find contact details on the review page or email

What we aim to achieve

Based on our review to date, we have identified four outcomes we would like to see, which will form the focus of our future work on statistical leadership.

  1. The value of statisticians and other analysts is understood by influencers and decision makers, and they see the benefits of having them at the table

It is critical that analysts are involved as policy and performance targets are developed. Our review suggests that while there are examples of statisticians being highly valued and involved in policy development throughout the process, there are also occasions where this is not the case. We found that where statisticians are engaged in policy and understand the context, they are more likely to be valued by colleagues and therefore more engaged, which in turn helps to ensure that statistical evidence is at the forefront of decision making and debate. The 2018 Civil Service People Survey shows that 79 per cent of statisticians who responded felt they had a good understanding of their organisation’s objectives. While this is on a par with the response across the civil service as a whole (also 79 per cent), it compares with 82 per cent for social researchers, 83 per cent for economists and 84 per cent for communications specialists.

We plan to highlight the value of analysts to decision makers, and use our influence to advocate the value of statistical insights and strong statistical leadership. We will also work with statisticians to help them articulate why they are valuable to decision makers and to ensure they have a good understanding of the policy or organisational context they work in.

  2. People have confidence in the statistical system and its ability to answer society’s most important questions

The Code of Practice for Statistics sets out clear expectations that organisations should assign a Chief Statistician/Head of Profession for Statistics who upholds and advocates the standards of the code, strives to improve statistics and data for the public good, and challenges their inappropriate use. The code is also clear that users should be at the centre of statistical production, with producers considering both known and potential user views in all aspects of statistical development, including in deciding whether to produce new statistics to meet identified information gaps. Statisticians have a duty to uphold the code which gives them a unique responsibility compared with other analytical professions.

It is clear that statisticians face challenges in the competing demands between departmental priorities and serving wider user needs, which also require engagement and resource. However, having ambition, encouraging innovation and viewing the statistical system as a whole are essential aspects of effective statistical leadership. In our role as regulators we are in a position to support statisticians in upholding the code as well as highlighting the importance of this aspect of their roles to those they report to. We will do much of this through further targeted engagement, but will also be supported by our research programme which is exploring the broader public value of statistics and data for society.

  3. Statisticians feel empowered to provide leadership

For statisticians to deliver, they need structures that support them. These structures vary across departments, in where statisticians sit and how they are managed. In some instances, teams are formed solely of statisticians; sometimes they are cross-analytical; and sometimes statisticians sit within policy or communications teams. Each scenario comes with its own advantages and disadvantages. For example, we have heard that when statisticians are based in policy teams, they tend to have a better understanding of the policy context, are more valued by decision makers and are more likely to input into key decisions. However, these statisticians may have less support in upholding the code or in drawing on technical expertise. We also know that the influence of the Head of Profession, and of statisticians more broadly, can vary depending on organisational culture or structure: for example, on whether they have dedicated professional time and support, the level of delegated responsibility, and the grade and broader skill set of the statisticians concerned. To be effective and valued in all circumstances, statisticians need to be pragmatic in addressing (and anticipating) the needs of decision makers while retaining their professional integrity.

There are also strong links between statisticians feeling empowered to provide leadership and the ability of organisations to demonstrate good practice through collaboration and innovation. Statisticians also need fit-for-purpose systems to showcase their value. These are essential prerequisites for statistics, data and analysis to be used effectively to inform government decisions and support society’s information needs.

We want to make sure statisticians (and analysts more broadly) have what they need to be effective, as well as identify any barriers to effective leadership and use our influence to overcome them. We will not make recommendations for specific structures and management approaches but will provide examples of practices which support different management structures and demonstrate how organisations have overcome some of the barriers presented by different approaches.

  4. Statisticians feel positive about their own career development and prospects

One of the concerns raised through the review is about loss of talent due to a lack of senior analytical roles. In the 2018 Civil Service People Survey, 90 per cent of statisticians who responded said they were interested in their work. However, 16 per cent said they wanted to leave their role within the next 12 months (compared with 13 per cent for all civil servants).

Statisticians may move out of statistical roles to progress their careers. If well managed, this has advantages for statistical leadership across an organisation, but better structures are needed to make sure that individuals are able to return to statistical and analytical roles (including leadership roles) in government and are not permanently lost to the profession.

There were also concerns raised about the talent pipeline and statisticians not always being used or developed to their full potential. It should be clearer that there are a range of career and skills development paths for statisticians at all levels, including technical routes for those who want to pursue this, and a focus on softer skills for those who want to take on leadership and more policy facing roles. This should be supported through enhanced and structured opportunities for statisticians to develop a broad range of skills throughout their careers.

We plan to work with those who deliver talent management and mentoring programmes, including the GSS People Committee, to champion the need for effective career support and management for statisticians. This includes development programmes, secondments, shadowing and other opportunities to work in a range of settings, such as policy or delivery facing roles. We will also work with these groups to make sure that the training on offer to statisticians is clear, and with Heads of Profession to help them understand what less senior statisticians need from them.

A blog like this cannot do justice to the range of issues highlighted, but we hope this gives a sense of our thinking and plans. We would welcome your views on what we have covered. Please do watch this space for further reports and engagement.


A matter of fact

How to get on and influence people: what you need to know if you want to be part of important policy decisions.

Penny Babb, Head of Policy and Standards in the Office for Statistics Regulation, describes how thinking about Trustworthiness, Quality and Value can help policy professionals in their careers.

Policy professionals are at the heart of designing and delivering the work of government. Your judgement about evidence is crucial in developing sound policies and understanding their implications and impact. To be able to effectively assess evidence and advise on solutions, your decisions need to be based on more than just instinct – a gut reaction to what sounds like a great idea can lead to millions of pounds of public money being wasted. But get it right and you can transform the lives of millions of people in the UK.

How do you know what evidence you can trust, or decide if the figures are reliable or are the ones you need?

Keep it simple and consider three things: Trustworthiness, Quality and Value.



  • Trustworthiness: ask who produced the figures. How do you know you can trust the way they produce their figures and use data?
  • Quality: what is the quality of the data and how robust are the methods? How sure can you be about the evidence?
  • Value: do the figures measure what you are interested in? Can you make sense of the patterns and trends?


Trustworthiness, Quality and Value are the fundamental pillars of the Code of Practice for Statistics. The Code sets out what you as users can expect from the producers of official statistics, and the standards they should work to, to ensure that you can have confidence in their statistics.

The pillars are useful for other data, statistics and analysis, going beyond official statistics. For example, they are being applied by analysts in the Department for Work and Pensions and by the Greater London Authority.

You can also apply the pillars in your own work. Think about when you produce ministerial advice, and ask yourself how the Minister can have confidence in your briefing:

  • How have you demonstrated your trustworthiness in the way that you have organised and managed the information?
  • How reliable is your evidence and how certain are you about your conclusions?
  • How have you provided value in your advice?

The Policy Profession Standards includes competencies related to statistics and data analysis under Evidence – Analysis and use of evidence.

Applying the Code pillars can help you succeed as a policy professional by developing and demonstrating your analytical skills and so providing impartial, well-considered advice based on sound evidence.

The Office for Statistics Regulation produced the Code of Practice for Statistics. We have mapped the principles in the Code to the competencies in the Policy Profession Standards for each of the three professional levels. Use this matrix to see which parts of the Code can help you develop your skills in statistics and data analysis at each competency level.

A PDF version of the matrix is available to download.


The Good Practice Team (GPT) in the Government Statistical Service has published a booklet with ten things you need to know about statistics, aimed at policy professionals across central government. It introduces some key ideas and concepts to help you to ask the right questions about data and statistics.

Producing quality, trustworthy and valuable information? Why wouldn’t you?

Mairi Spowage, Deputy Chief Executive at the Scottish Fiscal Commission, describes using the Code of Practice for Statistics in a forecasting organisation.

First, a bit of background to the Commission. The Scottish Fiscal Commission is the independent fiscal institution (IFI) for Scotland, set up to produce independent forecasts of the Scottish economy, income tax, VAT, devolved taxes and devolved social security. We have a different remit and way of operating from the Office for Budget Responsibility (OBR), the IFI for the UK. We are not considered an official statistics producer.

I moved to the Commission in 2016, to help set it up, establish the modelling teams, and recruit the analytical staff. These roles required a mixture of economists and statisticians.

I used the Code of Practice for Official Statistics 1.0 extensively in previous roles as a statistician in the Scottish Government. Previously, I was responsible for the production of the National Accounts and public sector finance projections for the Scottish Government. These are high profile statistics, and the Code was an important tool in both ensuring and demonstrating independence of production. It gave us a helpful framework for interacting with users and Ministers.


“I became a cheerleader …” 

Why was I interested in bringing this experience into the Commission? Firstly, it is worth saying that as an IFI, we try to adhere to the OECD Principles for these bodies. I was struck by how well many of these principles chime with the Code – at their heart they encourage transparency, accessibility, independence of production and user engagement.

I became a cheerleader for the principles of the Code within the organisation. The first objective was to get buy-in from members of the Commission and our Chief Executive. Given the synergy of the Code with the IFI principles, they were wholeheartedly behind using this framework to shape our work and demonstrate independence, transparency, high quality analysis and user engagement.

I ran sessions on Code 1.0 with new staff as they joined, emphasising how it helps to ensure orderly release and transparency of production. At this point we hadn’t made a public statement about the Code, but the rigour of statistical production and the principles of openness were being brought into all the teams as they were formed. It is helpful to highlight practical examples which show the value of working within this framework.

“This has become a valuable way of working…”

The new (draft at this point) Code 2.0, with its aim to apply to the wider community of analysts producing numerical information, made it easier for us to make a public statement about what had become an important part of our way of working. Ensuring trust in us as an organisation, demonstrating the quality of our methods, and making sure that our outputs have the maximum value to our users is central to our approach. This has become a valuable way of working for both professions in the Commission, not just for the statisticians.

We released our first publications in September 2017 and organised user events to ensure we were meeting user needs. We implemented an extensive, robust and documented quality assurance process as part of our forecast production in December 2017.

Since then we have both implemented and publicised other elements of our work to demonstrate our efforts to stick to the spirit of the Code. These are simple measures, but are practical things that organisations can do to be more accessible, open and transparent.

Advance notice

We now publish a monthly forthcoming publications email and news story, highlighting the month of publication up to a year ahead and the exact date at least six weeks in advance. This allows us to keep in touch with users regularly, which is especially important between forecast rounds, when we are likely to be publishing more specialist working papers.

It also allows us to demonstrate and publicise our outputs, so users know when to expect our publications. By being transparent about when we will publish outputs, and sticking to publication dates, we also demonstrate our independence.

Have policies in place

We have developed a corrections and revisions policy, which sets out transparently how the Commission will deal with any analytical errors or revisions in our work. I know that this can be uncomfortable for some organisations: obviously, the best scenario is one in which the corrections policy never needs to be used. However, anyone who has worked on large analytical documents will know that, no matter how rigorous quality assurance procedures are, errors will sometimes happen. Setting out how these will be dealt with in a proportionate manner is part of building user confidence in an organisation as a mature, confident and professional provider of analysis.

Another simple policy is that, from our first statutory publication in September 2017, we have ensured that every table and chart in our documents is made available in spreadsheet form at the time of publication. This may seem like a small point, but it is amazing how often it is not done, leaving users struggling to reuse information. It is a really easy way to increase public value and to ensure that snapshots of published information are preserved.

“We are keen to meet user need for information and analysis …”

At the heart of our efforts to comply with the real spirit of the Code is user engagement, and in particular responsiveness to user feedback. An example of this is the additional paper we produced on income tax in March 2018 that focused on how we had estimated the taxpayer behavioural response in December 2017. There was huge interest in our estimate of the tax raised from the Government’s announced income tax policy and a clear desire for more information. Our additional paper also included some analysis that, due to user feedback, we will now include in any analysis of future policy changes.

We held an event where we presented this analysis to interested users. We are keen to meet user need for information and analysis where it can shed light on areas within our remit.

“…demonstrate our commitment to the Code…”

We brought together all of our policies which demonstrate our commitment to the Code in one document which we published in March 2018.  We plan to add to this over the months and years to come as we find new ways to be transparent, responsive and accessible. Because our staff come from a mixture of backgrounds, including those with experience of producing official statistics, we also aim to promote a collaborative environment where different analytical professions continue to learn from each other.

As I’ve gone on about how we value user feedback, I’d better say that you should let us know if you have any ideas about how the Commission can improve and expand this, whether in the document, or in practice.

So, drop us an email at if you want to get in touch.

Improving and innovating: enhancing the value of statistics and data

Lessons from statisticians producing Children, Education and Skills statistics

Statistics are of value when they support society’s need for information; they should be useful, easy to access, remain relevant, and support understanding of important issues. To help deliver this, producers of statistics should commit to continuously improve their service to users.

I have been part of the team working on the refresh of the Code of Practice for Statistics. There have been various changes within the Code, but without a doubt the area I am most excited to see enhanced is the new Innovation and improvement principle. At the Office for Statistics Regulation we have always expected producers of statistics to adapt so that statistics can better serve the public, but now this expectation is crystallised in the Code.

During conversations about the development of the Code I received several questions about this area, and I sensed some nervousness about how it might be applied; this is understandable with anything new. The new principle is about having a positive mindset towards change and improvement across all statistics production and dissemination processes. However, the practices which sit beneath the principle are not a prescriptive list of what must be done; instead, they should be applied proportionately, depending on the statistics in question. As a result, how producers respond to this principle will differ in scale and approach. What matters most is producers’ motivation to improve their statistics.

I was keen to undertake a small project to help producers of statistics get a better handle on what the Innovation and improvement principle means for them. My colleague Louisa and I both focus on Children, Education and Skills (CES) statistics. This thematic way of working gives us the opportunity to better understand policy and statistics issues, and to develop relationships with a range of users and producers of CES statistics. From our ongoing conversations we were aware of several innovations in this area, such as the relatively well-known work to develop the Longitudinal Education Outcomes data. However, we wanted to find out more about other projects: less well publicised developments or smaller scale projects which nonetheless reflect an ambition to improve the value of the statistics.

We started by asking producers of CES data and statistics across the UK to send us information on the projects they had been working on. We were pleased by the range of responses we received. The projects, whether completed or still in development, varied in scale and covered everything from producing statistics using Reproducible Analytical Pipelines to improving the accessibility of data. It was clear to us that improvement, whether big or small, was embedded in many of producers’ activities – and it was great to hear just how enthusiastic producers were about their projects. We also spoke with users to get their feedback on some of the development work and to find out how they have benefited from the improvements being made. Here is a link to a summary of the innovation and improvement projects producers told us about.

Over the coming weeks we want to share with you some of the common themes that emerged from talking with producers and users about these projects. First, we will look at the importance of collaborative working when developing statistics, then at the development of alternative statistical outputs, and finally at some of the common challenges producers face when improving their statistics. While these themes come from examples in Children, Education and Skills statistics, the series is intended to give all producers of statistics a better sense of what the new Innovation and improvement principle might mean for them, and to highlight elements of good practice we might expect to see when assessing statistics.

As this review considers innovation in statistics, we wanted to be more creative in how we share our findings. Instead of a more traditional report, we are going to publish our work across a series of web posts. We will also be exploring, with the Government Statistical Service’s Good Practice Team, how else we might support producers undertaking innovation and improvement work.

For now, keep an eye out for our forthcoming posts – and if you want to get in touch about this review, or about CES statistics more generally, please do email.


Maximising the value of statistics through systemic reviews

It is vital that the statistics produced by government and other official bodies are as valuable as they can be to society. Statistics should be easy to access and relevant, and should help the public answer important questions such as ‘How many people work in the UK health services?’, ‘What is the nature of the housing shortage?’ and ‘Are living standards going up or down?’

But these are questions that can’t be answered with one statistic alone – and in some cases can’t be answered without evidence that goes beyond statistics. To complicate matters, in the UK system there are multiple bodies producing statistics on each of these themes. To really understand the public value of statistics, it is important to look across a set of related statistics as well as at the statistics individually. This is why we carry out systemic reviews – reviews that explore a set of official statistics in a thematic area or on a cross-cutting topic.

These reviews allow us to look at the relationships and signposting between statistics, and to spot gaps and overlaps. The work involves discussions with people who use and produce statistics, and a great feature is our ability to do more than publish recommendations: we can also intervene to support change directly, for example by facilitating stakeholder meetings or other events.

To date we have reviewed health and social care statistics in England, and across the UK, statistics on income and earnings, city regions, crime and justice, housing and planning and international migration. We are currently looking at data linkage and innovations in statistics in the education area.

Each systemic review has identified examples of good practice. These include statisticians generating new statistics or analyses in response to user requests and presenting material in a user-friendly way. For example:

  • ONS now has a measure of cyber crime
  • official labour market statistics can be accessed via the NOMIS platform
  • the Migration Statistics Quarterly Report combines statistics on a related topic in one publication

Each systemic review has also found areas for improvement. A common one is users of statistics finding it difficult to access what they need – for example, four main producers publish statistics on mental health in England, which causes confusion because people can’t easily access the information they need from one place.

Users also face problems understanding how definitions vary for similar statistics, for instance on housing affordability across different parts of the UK. And it can be hard for users to interpret what a set of statistics collectively show – as in the case of various data sources on income and earnings not giving a consistent message on the state of living standards. In some cases there are gaps where statistics are not available to shed sufficient light on a topical issue – such as the rents paid by those living in private rented housing, for sub-national geographies.

Often we are able to identify changes in the way the statistics system works that can generate improvements. Simple things – such as groups bringing together different producers at strategic and working levels, and user-engagement strategies – can really help prioritise efforts to improve the public value of statistics. It’s great to see the English Health and Care Statistics Steering Group, set up to work across health and social care statistics bodies in response to our work identifying a lack of strategic leadership; it has now delivered changes to statistical outputs, for example ‘Statistics on Smoking – England’, which combines figures from different departments to help users. We also welcome prioritised plans for statistical developments, such as ONS’s Economic Statistics and Analysis Strategy.

We consistently find that statistics producers are motivated to use data to drive improvements in policy and society. But we also find resource constraints. We strongly believe that speaking to a wide range of users of statistics and understanding their questions, innovating and collaborating across the system are vital for doing the best job statisticians can – and we also recognise that this can be challenging.

We would love to hear about your good examples and any challenges you face meeting the public value expectations of the Code of Practice. We also welcome suggestions for ways we could support improvements.

‘Truth surge’: celebrating the success of statistics

“After the fears about post truth, we’ve now got the truth surge”.
So said Rupert McNeil, Civil Service Chief People Officer, to me at an event to launch the UK Statistics Authority’s consultation on the new Code of Practice for Statistics last summer.

We’ve now published the new Code itself. And I want to return to Rupert’s idea of a truth surge in the context of one of the great unsung features of the Civil Service. At the same time, I want to celebrate statistics as one of our quiet successes – and explain why we shouldn’t be so quiet about it.

We talk a good deal about the roles of the Civil Service – providing sound and honest policy advice to ministers, ensuring the public receive excellent services, and delivering value for taxpayers’ money.

There’s another crucial role that we talk about less: providing high quality and trustworthy data and statistics into the public domain.

These statistics are a crucial building block of how people make decisions about the world: for businesses, on where and how to invest, and which markets to develop; for individuals, on where to live and where to send children to schools; and for communities, on how to improve the quality of local life.

At the Office for Statistics Regulation all our work starts from recognising that statistics are a public asset. They are the lifeblood of democracy. And, as Anthony Hilton put it in an Evening Standard article, they are as essential to public life as physical infrastructure like water or electricity.

The quiet triumph I want to celebrate is the capacity of the Civil Service to provide these clear, trusted data and official statistics to the public, regardless of the political colour of the party running government. These statistics cover a vast range of activity, from the health of the population, to the number of households below average income, via the performance of schools, the quality of the natural environment, and the disparity in outcomes between different groups in society.

Life and death, war and peace, crime and punishment – all human life is here. And it’s done with the rigour, integrity, and competence that are the hallmarks of a brilliant Civil Service.

Many countries would struggle with this. They only recognise as official statistics the data produced by their independent statistical offices (the counterparts of the Office for National Statistics – ONS).

So in many countries there are two or three providers of official statistics. Here there are over 150. Elsewhere in the world, the idea that a department run by an elected minister could produce statistics to the same standard as an ONS-like body would be hard to imagine. In the UK, with our strong institutional framework for an impartial Civil Service, we take it as read.

It’s a testament to the strength of Civil Service values, expressed through the professionalism of individual civil servants and of their departments and agencies as institutions. As such, this reliable provision of statistics is part of the fabric of sound public administration.

This hasn’t happened by accident. It’s the result of the accumulated experience of generations of civil servants – statisticians, other analysts and policymakers.

The Code of Practice for Statistics captures this accumulated experience – it’s the go-to guide for any public organisation publishing statistics.

So why do I think this shouldn’t be a quiet success anymore?

We live in an era of fears about post-truth, of spin and filter bubbles, where stories of fake news abound. It’s also a world of increasingly abundant data – but one where it can be harder to sift the reliable from the unreliable.

In this world of scepticism, the ability of the Civil Service to provide statistics that are trustworthy, of high quality and publicly valuable is very important – to government and to society.

That’s why we have redesigned the Code of Practice around three pillars, or outcomes – Trustworthiness, Quality and Value – the core foundations of public confidence. It’s why we emphasise statistics not as an arid or technical subject but as a public asset and the lifeblood of democracy. Whether you are working in policy, operations, or statistics and analysis, the refreshed Code of Practice for Statistics can help you use data and statistics appropriately – view it here.

And it’s why Rupert McNeil is right: we need a surge. We need to shout from the rooftops that statistics are worth defending as one of the most crucial functions of government.

So, we think our new Code is not just the foundation of public confidence. It’s more than this: it is a celebration of statistics as one of the key achievements of the British Civil Service.

The new Code of Practice is coming…

The new Code of Practice will be published next Thursday.

We’re really grateful for the huge amount of thought and effort you’ve contributed throughout our consultation process. It’s been amazing to see how much interest and enthusiasm the Code has generated.

I won’t give away too much, but a few things to look out for:

  • the Code will be based around the three pillars of trustworthiness, quality and value
  • there will be new interactive pages on this website, with links to guidance
  • we will also consult on our draft guide for voluntary application of the Code beyond official statistics, both inside and outside Government.

Our key message throughout this is that statistics are the lifeblood of democracy. The Code is built around public confidence in this essential public asset.

The Code will be used by statisticians and analysts on a daily basis. But it’s got a much wider reach. It helps Government organisations demonstrate that they live up to the highest standards. And it helps citizens have confidence in the statistics that describe the community and wider society they live in.

So we’re pretty excited to share this Code with you next week.