Intelligent transparency – or how the sausage gets made

In our latest blog, Regulator Anna discusses our intelligent transparency campaign and our recently updated guidance.

There’s a line in the musical Hamilton which goes:

“No one really knows…How the sausage gets made. We just assume that it happens, but no one else is in the room where it happens.”

It’s about a meeting between three of the founding fathers of the United States to agree a political compromise. But it also reminds me of OSR’s intelligent transparency campaign. 

When you purchase a packet of sausages you might well want to know how they were made. What are the ingredients and where are they from? Who made the sausage, and did they follow rigorous food and hygiene standards? The answers to these questions might be easy to find, or they might be difficult, or even impossible, to track down. And the answers might matter a great deal – the sausages could contain allergens which mean they should not be eaten by some people.

When you hear a number being quoted by a Minister on the radio or see a figure used in a government press release, you may well have similar questions. Where does that number come from? How was it calculated and who did the calculation? Are there any warnings which should come with the number? Like with the sausages, the answers to your questions could matter a great deal. They could impact a decision you are going to make based on that number, like whether the bathing water quality at your local beach is safe for swimming today, where you will send your child to school, or who you are going to vote for at the next election.  

At OSR, we believe that you shouldn’t have to be in the room where it happens to have a good understanding of, and confidence in, the numbers used in the public domain by government. Government should make it easy for people to understand and scrutinise the data it uses to make decisions and inform the public. This is what is at the heart of our intelligent transparency campaign. 

To achieve intelligent transparency, government must: 


1. Ensure equality of access

Data used by government in the public domain should be made available to everyone in an accessible and timely way.   

2. Enhance understanding

Sources for figures should be cited and appropriate explanations of context, including strengths and limitations, should be communicated clearly alongside figures. 

3. Enable independent decision making and leadership

Decisions about the publication of statistics and data, such as content and timing, should be independent of political influence and policy processes. 

Our guidance on intelligent transparency, which was updated today, provides practical advice on how to implement these three principles. It highlights the role that all those working in government play in achieving this and now includes the following list of questions which you can ask yourself if you are using statistics and data publicly:  

  • Is the source for the figure in the public domain?
  • Are there limitations or caveats which impact how the figure should be used?
  • Is there context about the figure which impacts its interpretation?
  • Could this figure be misinterpreted or misused if taken out of context?
  • Would further support be helpful in ensuring intelligent transparency is achieved?

Whether you are a producer or user of statistics, we would love to hear from you. You can get in touch with us for further advice and guidance, or if you are interested in working with us on our intelligent transparency campaign. You can also keep up to date with all our work via our newsletter. Finally, if you are concerned about a lack of transparency in government use of data and statistics, you can raise a concern with us.

“Welp. We screwed up”: A lesson in admitting your data mistakes

A couple of months ago, a tweet from the Governor of Utah caught my eye.

The background was that the Utah Department of Health had identified an error in its published vaccinations figures – fewer people had been vaccinated against coronavirus than had been reported. In a public letter to his fellow Utahns, Governor Cox admitted the mistake.  

Here at the Office for Statistics Regulation we love this kind of openness and action. Five things stood out to us from Governor Cox’s letter, which we think all statistics producers in the UK should be thinking about when it comes to data errors. 

  1. Identify your mistake

You can’t fix what you don’t know is broken. This is why rigorous quality assurance is so important. Statisticians need to regularly look under the hood of their analysis to assure themselves and others that it is correct. In Utah, a healthy dose of scepticism about an unexpectedly high vaccination rate made the data team double-, triple- and quadruple-check their figures, until they found the mistake. So, as a statistics producer ask yourself: how do I assure myself that my analysis is correct? Do the numbers look as I expect, and why or why not? What are the risk points for errors? If the root cause of an error isn’t obvious then it can help to ask the five whys until you reach it. 
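The kind of scepticism described above can be partly automated. As a purely illustrative sketch (the function, figures and thresholds below are hypothetical assumptions, not the Utah team's actual process), a producer might run simple plausibility checks on a headline figure before publication:

```python
# Hypothetical sanity checks for a published vaccination figure.
# All names, numbers and thresholds are illustrative assumptions.

def check_vaccination_rate(doses_given: int, population: int,
                           max_plausible_rate: float = 0.95) -> list:
    """Return a list of warnings for figures that merit a second look."""
    warnings = []
    if population <= 0:
        warnings.append("population must be positive")
        return warnings
    rate = doses_given / population
    if rate > 1.0:
        # More doses recorded than people: a clear sign of double counting.
        warnings.append(f"rate {rate:.0%} exceeds 100% - likely double counting")
    elif rate > max_plausible_rate:
        # Not impossible, but surprising enough to re-check the source data.
        warnings.append(f"rate {rate:.0%} is unexpectedly high - re-check the source data")
    return warnings

# An unexpectedly high figure triggers a warning rather than passing silently.
print(check_vaccination_rate(doses_given=2_310_000, population=2_400_000))
```

A check like this does not replace human judgement, but it forces an unexpectedly high number to be questioned rather than published by default.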

  2. Be open

One of the things that impressed us most about this example was how direct and open the communication of the error was. There was clear ownership of the mistake and a willingness to correct the figures quickly, publicly and with humility. In the UK, our Code of Practice for Statistics requires government statistics producers to handle corrections transparently. It’s also important that government officials and ministers who use statistics are open about mistakes.

  3. Fix it

Of course, once you have identified the mistake, it needs to be fixed. As well as being transparent about corrections, the Code of Practice asks that they are made as quickly as is practical. 

  4. Explain

In Utah, Governor Cox explained that while they had reported that 70% of adults had received at least one dose of a coronavirus vaccine, the actual figure was 67%. In a separate statement, the Utah Department of Health went into more detail about the numbers and the mistake. Statistics and data should help people understand the world around them. So, when admitting a data error, it’s important to clearly explain the impact of it – what has changed and what does this mean? 

  5. Improve

The last, but perhaps the most important, step is to learn from the mistake – so that you can avoid it, or something similar, happening again. In Utah, the data team re-examined their processes to prevent this general type of error from being repeated. Statistics producers should reflect on why a mistake was made and what can be done to avoid it in future – and then share what they have learned, and what action they are taking, with their users. 

Statistics and data produced by governments serve a vital purpose. They keep us informed, help us make decisions and allow us to hold our governments to account – so we must have confidence in them and the people who produce them. As Governor Cox said, “trust consists of two things: competence and ethical behaviours”. We agree. The Code of Practice places a strong emphasis on trustworthiness. We see that trustworthiness is demonstrated by organisations which are open, and where people who work on statistics are truthful, impartial, and appropriately skilled. We are all human, we mess up and we make mistakes – but we can retain trust by actively looking for our mistakes, being open when we find them and by learning from them.  

Road Safety Statistics in Scotland, Wales and Northern Ireland: 2020 and Beyond

Being on the roads is something that most of us do every day, whether as a driver, passenger, cyclist or pedestrian. Statistics about road safety are used to inform road improvements, road safety laws and local initiatives so it’s important that they are accurate and of maximum value to users.

Today we released a series of letters about our review of road safety statistics produced by Transport Scotland, the Welsh Government, Police Service of Northern Ireland and the Department for Infrastructure Northern Ireland. We are very pleased to confirm that all four sets of statistics will continue to be badged as National Statistics. This means that they meet the highest standards of trustworthiness, quality and value. Although some of our findings and recommendations are specific to the statistics produced by individual teams, we also came across the following common themes.

Uses and Users of Road Safety Statistics

All the statistics include historic trends, so that users can see how the number of accidents has changed over time. Insightful commentary provides context for these changes – for example, when a new road safety law has come into effect. These features of the statistics allow police forces and local authorities to implement and monitor road safety initiatives in their local areas. They also help policymakers assess the progress made towards government targets.

All the statistics producers have strong relationships with police forces and policy colleagues, but the understanding of and engagement with users beyond these groups varied. We saw some great examples of user engagement resulting in improvements to the statistics. In other cases, we think that better user engagement would help to identify unknown and potential users. Although reaching these users can be challenging, it is vital in ensuring that the statistics can achieve their potential to serve the public good.

The Power of Collaboration

The data recording system historically used by police to record road accidents (called STATS19) is currently being reviewed. The review, led by the Department for Transport, will recommend changes and improvements to what information is collected and how it is recorded. To make sure that issues that concern Scotland, Wales and Northern Ireland are heard and acted on during this and any future review, we recommend that all four teams establish a stronger working partnership.

More engagement between the teams will also present the opportunity to learn from each other about how they produce and present statistics on the same topic. We saw lots of areas of good practice which the other teams could learn from if shared: for example, the thorough checking process at the Police Service of Northern Ireland to ensure that no collisions are missed; the user engagement project carried out by Transport Scotland last year; the way that the Department for Infrastructure analyses and presents under-reporting of road accidents; or the recent development of an interactive dashboard by the Welsh Government.

Communicating Change

Some police forces in the UK, including Police Scotland and about half the forces in England, have recently moved to a new data recording system – the Department for Transport Collision Recording and Sharing (CRASH) system. One of the main advantages of this new system is improved accuracy in recording how severe an injury is – the police officer records the most severe injury, and this is automatically classified as either ‘slight’ or ‘serious’. This is different from the approach in police forces using other systems to collect STATS19 data, where officers use their own judgement to determine how severe an injury is. Statisticians themselves don’t always have control over the adoption of new collection approaches, but we encourage all producers to think about the advantages and disadvantages of their current systems to make sure that the data and methods used are the most appropriate and the best quality for the job.

It is important that before, during and after periods of change, producers of statistics keep their users well informed. This allows users to understand what impact any changes will have on how they use the statistics and provides the opportunity for users to contribute their views. Whether the change is the introduction of a new recording system, or the updating of the current system following the STATS19 review, we encourage all producers to proactively publish as much information as possible and to think about how they open up communication with their users.

The Future of Road Safety Statistics

Road safety policies in Scotland, Wales and Northern Ireland aim to reduce the number of people killed on the roads, with each devolved government having set targets for 2020. New road safety strategies looking beyond 2020 will soon be developed, highlighting how important it is for producers to continue their strong relationships with policy colleagues. The progress towards the current targets varies across the nations and next year each set of statistics will report on the final data for 2020. These statistics will continue to be fundamental in monitoring success against current and future government policies – whether by those who are responsible for our safety, by advocates for road safety improvements, or by us as individuals who interact with the roads in our daily lives.