Shining the spotlight on the quality of economic statistics

In our latest blog, Statistics Regulator Emily Carless discusses the importance of quality in economic statistics…

It’s an exciting time to be a regulator of economic statistics in the UK. The economic statistics landscape is changing, with more new and innovative data available than ever before. The challenges presented by increases in the cost of living have focused greater attention on economic data. The regulatory landscape has changed too: the UK’s departure from the EU means the European statistical office will no longer have a role in verifying the quality of UK statistics. This has created an opportunity for us, as the independent regulator of official statistics in the UK, to develop a new programme to provide continued assurance to users on the quality of economic statistics.

Since March this year I have been leading this programme, which we are calling the ‘Spotlight on Quality: Assuring Confidence in Economic Statistics’. As part of this programme, we are carrying out a series of quality-focused assessments. To help us do this, we are developing a quality assessment framework. We started with the Quality pillar of our Code of Practice for Statistics, and have adapted it by taking relevant practices from other international frameworks to enable a deeper dive into the many elements of quality in economic statistics.

Earlier this summer we published the report of the first pilot assessment on Producer Price Inflation statistics, which are produced by the Office for National Statistics (ONS). We gathered feedback on the assessment and the programme more generally to get insight into how we can maximise the benefits of the programme. My colleague Job led the first pilot assessment and has this to say:

“ONS’s Producer Price Indices (PPIs) were an excellent candidate for the first quality-focused assessment. The new quality assessment framework that we developed helped us investigate all aspects of quality in greater detail than we normally would in an assessment. We’re continuing to refine the framework to make it an even more effective tool. One thing we trialled with this assessment, which worked particularly well, is speaking to the international statistical community about the UK’s statistics. Conversations with price statistics experts in other National Statistical Institutes gave us an insight into the extent to which ONS is following international best practice in producing PPIs. We’ll look to replicate this approach in other quality-focused assessments.”

We received positive feedback about our quality-focused assessments from the experts we spoke to in other NSIs. Espen Kristiansen of Statistics Norway commented on the benefits of our programme to the international community:

“It is very valuable to have an external review of statistics. It reminds us how we can frame a discussion about quality, which for statistics is a surprisingly complex and many-faceted concept. Not only can the review detect flaws, it also makes us better at continuously improving our statistics in the future.”

Users of economic statistics in the UK have also found our quality-focused assessment on the PPIs useful. Cheryl Blake from the Ministry of Defence said:

“We welcome the findings of the OSR report on the quality of the PPIs and the opportunity to provide user feedback, particularly the recommendations to publish weights and standard errors which will greatly help our team better understand the underlying drivers of the indices and their quality. This is of growing importance as there is growing interest across Government in using industry-specific PPIs for indexing contracts to better track prices and provide a fairer return to industry. Further information on sample size and resultant quality will help inform index selection and we hope to work with ONS regarding the rationalising on index publication to ensure the provision of critical indices for indexing Government contracts.”

In carrying out these assessments we aim to be supportive of the producers of economic statistics, to champion where they have improved the quality of their statistics as well as to challenge where further improvement is needed. Chris Jenkins, Assistant Deputy Director for Prices division in ONS, highlighted the benefits of these assessments to the producer team:

“The regular churn of our monthly statistics sometimes means we don’t get the opportunity to review the methods and metadata that supports our production process as frequently as we would like. This assessment provided us with the perfect opportunity to take stock of what we do, and the constructive support from OSR highlighted areas where we are doing really well, but also areas where we need to make quality improvements. Having the assessment now gives us a clear set of actions we can take to improve the quality of PPI, which is a key economic indicator.”

The outcomes from our quality-focused assessments also provide useful information on where the quality of other statistics could be improved. We will work with statistics producers and the Statistical Heads of Professions to share good practice and shine a light on where the quality of statistics more broadly can be improved. Rachel Skentelbery, ONS Deputy Head of Profession for Statistics, said:

“We welcome OSR’s new quality-focused assessment programme which complements work we are leading to support ONS colleagues in assessing and improving the quality of statistics. Many of the findings identified in these assessments will be relevant beyond the specific output being reviewed, for example in areas such as RAP [reproducible analytical pipelines], quality assurance, user needs and transparency, and we look forward to working together to share lessons and promote best practice to drive improvements to quality across the organisation.”

Over the next few months we will be publishing the report of our second pilot quality-focused assessment on the Profitability of UK companies and Gross Operating Surplus of private non-financial corporations, and publishing the quality assessment framework that we have developed for use in these assessments.


If you are interested in learning more about this programme or have any feedback on our first report then please get in touch by emailing regulation@statistics.gov.uk.

How do people use official statistics to make decisions?

Sofi Nickson, Head of Research at OSR, shares why OSR is interested in the role statistics play in decision making by members of the public, sets out what we know about this so far, and invites others to share evidence they have on the topic.

When I first heard about the Office for Statistics Regulation (OSR), I assumed it simply checked whether statistics are accurate or not. It wasn’t until last year, when I happened upon a job advert for OSR, that I looked into what it really does. It turned out that I was somewhat off the mark in my assumption – OSR’s vision is not, as it happens, limited to ‘accurate statistics’, but is the far more inspiring ‘statistics that serve the public good’. For the past few years, colleagues across OSR have deepened their understanding of what this may look like through their regulatory work and supporting research programme, which I am now lucky enough to lead. Part of the research programme is understanding the role official statistics can play in decision making by members of the public. In this blog post, I explain why we are interested in this and what we know so far, and invite you to share your thoughts on the topic.

Statistics serving the public good

The Statistics and Registration Service Act states that serving the public good includes assisting in the development and evaluation of public policy, and informing the public about social and economic matters. ‘The public’ here could be anyone outside of government; indeed, a report from workshops on whether scientists understand the public states that ‘there’s a thousand publics out there that one could address, any of whom has to be understood… in order to know how to deal with them, how to work with them, engage them, try to benefit them’. We have begun to understand how some of these publics play a role in statistics serving the public good: non-governmental organisations may use statistics to provide services, and businesses can use them to adapt their practices and better meet needs. We also have evidence from analysing applications to access public data suggesting that academics see themselves as providing an evidence base for decision making. Even the media plays a part, with ESCoE research finding that journalists help translate statistics for public consumption.

The point here is that for statistics to serve the public good, they must be a tool both for government and for those beyond. When we look at statistics use outside of government, we have heard from a wide range of civil society organisations through our regulatory activities about their uses of statistics on behalf of the public. This is a way for statistics to serve the wider public indirectly, without individuals using statistics themselves. However, we are currently missing an important piece of the puzzle – what use looks like for individual citizens outside of such organisations. This family or individual-level use of statistics is far less visible, but may be no less valuable in serving the public good. In OSR we want to shine light upon these hidden uses of statistics by exploring how individual citizens use statistics in their professional and personal lives, and what they value in statistics.

The more we can understand, the better we can ensure our regulatory decisions and recommendations support members of the public who use statistics. This topic is vast, so to narrow it down we are starting with how individual citizens use official statistics to make decisions, and we are focussing on three areas:

  1. Whether members of the public find value in using statistics to make decisions (including whether they do use statistics at all)
  2. Whether members of the public feel equipped to use statistics to make decisions in the way they are currently presented
  3. How statistics inform decisions (including how much influence they have alongside other factors).

Do members of the public find value in using statistics to make decisions (and do they use statistics at all?)

A quick search about using statistics to make decisions gives lots of potential examples. For example, using statistics on school performance to inform where you want to educate your child, or using statistics on crime to decide where to live. But we currently lack evidence about whether and how these potential uses play out in real life – do people actually use statistics like this, or do we just think they could? Even more specifically, do people use official statistics, or are their information needs being met by other sources?

In the Public Confidence in Official Statistics 2021 survey, just over half of respondents (52%) agreed to some degree that statistics helped them to make a decision about their lives. However, we don’t know from this what sort of statistics or decisions respondents were thinking about. We also don’t know what might help the other 48% of respondents get value from using statistics, or even whether this 48% want to use statistics at all. It may be that individuals are already satisfied with organisations in civil society using statistics on their behalf.

Do members of the public feel equipped to use statistics to make decisions in the way they are currently presented?

From our commissioned research into statistical literacy we saw great variability amongst the general public in skills linked to statistical literacy, and have concluded that responding to this is all about communication. We have evidence on how to communicate statistics to non-specialists, for example recommendations from a programme of work by ESCoE which explores communicating economic statistics to the public. Despite strong recommendations, there is more that could be done to improve the communication of statistics to non-expert users, which is why in our business plan for 2023/24 we commit to championing the effective communication of statistics to support society’s key information needs. We don’t profess to know everything in this area though, and are always interested in learning more.

How do statistics inform decisions (and how much influence do they have among other factors)?

We have uncovered an abundance of literature about human decision making, including how heuristics and biases sit alongside ‘rational’ evidence-based choices. From this we recognise that it is unlikely anyone bases their decisions on statistics alone, but we still don’t know how influential official statistics are and where they sit alongside other evidence. Are they seen as compelling and trustworthy? What factors influence this?

Can you help?

If you have read this far it will be clear that we have a lot of questions about how statistics can serve the public good. In OSR, asking questions is in our nature – as a regulator our judgements and decisions are informed by the evidence we have, so we are always seeking to learn more. If you know of any research, examples, or information that you think could inform our understanding of the role official statistics play in how members of the public make decisions then we would love to hear from you – please get in touch with us at research.function@statistics.gov.uk

Intelligent transparency – or how the sausage gets made

In our latest blog, Regulator Anna Price discusses our intelligent transparency campaign, and recently updated guidance.

There’s a line in the musical Hamilton which goes:

“No one really knows…How the sausage gets made. We just assume that it happens, but no one else is in the room where it happens.”

It’s about a meeting between three of the founding fathers of the United States to agree a political compromise. But it also reminds me of OSR’s intelligent transparency campaign. 

When you purchase a packet of sausages you might well want to know how they were made. What are the ingredients and where are they from? Who made the sausage, and did they follow rigorous food and hygiene standards? The answers to these questions might be easy to find, but they could equally be difficult or even impossible to track down. And the answers might matter a great deal – the sausages could contain allergens which mean they should not be eaten by some people.

When you hear a number being quoted by a Minister on the radio or see a figure used in a government press release, you may well have similar questions. Where does that number come from? How was it calculated and who did the calculation? Are there any warnings which should come with the number? Like with the sausages, the answers to your questions could matter a great deal. They could impact a decision you are going to make based on that number, like whether the bathing water quality at your local beach is safe for swimming today, where you will send your child to school, or who you are going to vote for at the next election.  

At OSR, we believe that you shouldn’t have to be in the room where it happens to have a good understanding of, and confidence in, the numbers used in the public domain by government. Government should make it easy for people to understand and scrutinise the data it uses to make decisions and inform the public. This is what is at the heart of our intelligent transparency campaign. 

To achieve intelligent transparency, government must: 


1. Ensure equality of access

Data used by government in the public domain should be made available to everyone in an accessible and timely way.   

2. Enhance understanding

Sources for figures should be cited and appropriate explanations of context, including strengths and limitations, should be communicated clearly alongside figures. 

3. Enable independent decision making and leadership

Decisions about the publication of statistics and data, such as content and timing, should be independent of political influence and policy processes. 

Our guidance on intelligent transparency, which was updated today, provides practical advice on how to implement these three principles. It highlights the role that all those working in government play in achieving this and now includes the following list of questions which you can ask yourself if you are using statistics and data publicly:  

  • Is the source for the figure in the public domain?
  • Are there limitations or caveats which impact how the figure should be used?
  • Is there context about the figure which impacts its interpretation?
  • Could this figure be misinterpreted or misused if taken out of context?
  • Would further support to ensure intelligent transparency is achieved be helpful?

Whether you are a producer or user of statistics, we would love to hear from you. You can get in touch with us for further advice and guidance, or if you are interested in working with us on our intelligent transparency campaign, via regulation@statistics.gov.uk. You can also keep up to date with all our work via our newsletter. Finally, if you are concerned about a lack of transparency in government use of data and statistics, you can raise a concern with us. 

Why misleading statistics should never become a catchphrase

In our latest blog, Head of Casework Elise Rohan talks about the problem with the repeated use of misleading statistics and how you can combat this.

I have always been a fan of the show Catchphrase: the joy of being able to recall expressions and idioms without always understanding what they mean or where they come from. They are just phrases I have heard repeated elsewhere.

Our ability to remember things we have heard repeated isn’t limited to words. Have you ever found yourself quoting a statistic and struggling to remember where you first heard it? Perhaps that two-thirds of lottery winners end up broke, or that half of all marriages end in divorce.

As our world becomes ever more abundant in data, statistics are increasingly used to persuade and provoke discussion – from daytime television to debates in Parliament. In many cases, statistics are seen as a tool to strengthen weak arguments.

At OSR, our vision is that statistics should serve the public good. A big part of that is encouraging their use in wider debate, but it also involves combating and safeguarding against misleading statistics.

What do we mean by misleading statistics?

Misleading statistics are those that misrepresent data, either intentionally or not. We have developed a definition of misleadingness in the context of our work as a statistics regulator, which is:

“We are concerned when, on a question of significant public interest, the way statistics are used is likely to leave a reasonable person believing something which the full statistical evidence would not support”

Repetition of incorrect or unsupported statistics has the potential to harm our vision and public confidence in statistics. The repeated use of misleading statistics creates a sense of validity through reuse. This is known as the ‘illusory truth effect’ or repetition bias. The more you say something, the more confident you become at saying it. Research on this phenomenon has found that we have a cognitive bias to perceive confidence and fluency as characteristics of truthfulness. I’m sure we can all think of public figures who have been accused of spreading ‘fake news’ in this way.

We also see this type of misleadingness in the types of casework we receive in OSR. For example, we recently commented on the repeated use of an unsupported claim concerning sex-based differences in online harassment in the Houses of Parliament. And of course one of our most high-profile interventions concerned repeated claims made about the UK’s contribution to the European Union.


So, what can you do to combat this?

  • Develop the skills to critically challenge what you see.
    • Is a source provided for the figure? If so, is the source reliable?
    • Can you access the underlying information to check and understand the figures for yourself?
    • Is the figure presented in the right context? Is it clear why the time frame has been chosen or why any comparisons have been made?
  • The accuracy of claims is often nuanced rather than a binary true or false – as explained in Tim Harford’s guide to statistics in a misleading age. The House of Commons Library has published guidance on How to spot spin and inappropriate use of statistics.
  • If you see a statistic that feels questionable, try and fact-check it when you first hear it to reduce the influence of the illusory truth effect. Make use of search engines or organisations such as Full Fact to see if the claim has already been checked and commented on.
  • If you’re using statistics to make a claim or support an argument, make sure you help people understand what you’re saying and prevent misinterpretation by following our principles for intelligent transparency. How easily can someone verify what you have said? Is the context for your claim and any limitations clear?
  • Finally, if you see misleading statistics being repeated, get in touch with us. Every year, we receive hundreds of queries, many of which are about misleading statistics. In 2022/23, we dealt with 372 cases, in line with our interventions policy.

One of OSR’s priorities for 2023/24 is to champion the effective communication of statistics to support society’s key information needs. As part of our work to deliver this aim, we are reviewing our existing guidance to understand what more we can do to support the statistical system to use a range of communication methods while preventing and combating misuse.

Guest Blog: Improving ethnicity data quality in the public sector

Introduction

I am the Chief Statistician in the Cabinet Office. I also lead the Equality Data and Analysis Division in the Cabinet Office’s Equality Hub.

One of my roles is to improve the quality of ethnicity data across government departments.

The Standards for Ethnicity Data

In April 2023, my team in the Equality Hub published a set of Standards for Ethnicity Data.

This followed a consultation on a draft set of standards last summer. We also published an analysis of the consultation responses.

We committed to publish the standards in response to action 6 of the Inclusive Britain report — the government’s comprehensive response to the Commission on Race and Ethnic Disparities:

“To ensure more responsible and accurate reporting on race and ethnicity, the Equality Hub will, by the end of 2022, consult on new standards for government departments and other public bodies on how to record, understand and communicate ethnicity data.”

The standards describe best practice for the:

  • collection of ethnicity data
  • analysis of ethnicity data
  • reporting of ethnicity data

Noteworthy aspects of the standards

Elsewhere I have described 5 noteworthy aspects of the standards:

  • They are topic-specific data standards
  • They reflect the Code of Practice for Statistics
  • They also relate to the different stages of research
  • The standards apply to government departments and public bodies
  • We want to understand the use and impact of the standards

Important areas of data quality

I know there is much good practice in departments and other organisations that produce ethnicity data. But there are areas where the standards can have a big impact on the quality of ethnicity data. Four important areas in the standards are:

1. Not using the phrase ‘BAME’ (or ‘BME’) in outputs

These phrases emphasise some groups and exclude others, such as white minority groups and mixed ethnicity groups. The standards talk about the importance of using the correct language – the Equality Hub also provides advice on how to write about ethnic groups.

2. Using harmonised standards for ethnicity data as much as possible

The Government Statistical Service team in the Office for National Statistics leads cross-government work on developing and maintaining harmonised standards. Using harmonised standards helps improve the coherence and utility of public sector data. We also encourage the use of as many detailed ethnic groups as possible in outputs.

3. Understanding the level of missing ethnicity data in data collections

This is an important indicator of data quality. Some datasets have high levels of missing ethnicity data. For example, 19% of prison officers had unknown ethnicity in 2020. The percentage of prison officers with unknown ethnicity also changes every year: it was 9.8% in 2015 and 30.1% in 2019. This makes it difficult to make reliable generalisations about changes over time, and has a big impact on how the data can be used and understood. Reporting on the level of missingness helps users interpret ethnicity data better.
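
To make this concrete, here is a minimal sketch, in Python, of the kind of missingness check that supports this sort of reporting. The counts below are invented for illustration, chosen only so that the resulting rates echo the published figures quoted above; in practice this would of course run on the real workforce records.

```python
import pandas as pd

# Illustrative counts only -- invented so that the resulting rates echo
# the 9.8% (2015), 30.1% (2019) and 19% (2020) figures quoted above.
records = pd.DataFrame({
    "year":    [2015, 2019, 2020],
    "known":   [18940, 15370, 17820],  # records with a stated ethnicity
    "unknown": [2060, 6610, 4180],     # records with ethnicity not stated
})

# Missingness rate per year: unknown / total, reported as a percentage.
records["total"] = records["known"] + records["unknown"]
records["pct_unknown"] = (100 * records["unknown"] / records["total"]).round(1)

print(records[["year", "pct_unknown"]])
# A large or volatile pct_unknown warns users that apparent year-on-year
# changes may reflect recording practice rather than real change.
```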

4. Analysing and reporting on ethnicity data by controlling for other demographic factors

The ONS has done some sophisticated work in this area. We understand that undertaking regression analysis is not always going to be possible. But users can be particularly interested in deeper analysis and clearer context of the data. Understanding the impact of factors other than ethnicity can be helpful for them. For example, could differences be due to where people in some ethnic groups live? Could they be due to differences in age structure, or an imbalance in the number of men and women in a survey sample?
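
To make the idea of controlling for other factors concrete, here is a minimal sketch of one common approach: a logistic regression that includes age and sex alongside ethnic group. The variable names and data are synthetic and entirely our own; this is not the ONS’s or the Equality Hub’s actual methodology.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 5000

# Synthetic data with invented variable names -- not a real dataset.
df = pd.DataFrame({
    "ethnic_group": rng.choice(["A", "B", "C"], size=n),
    "age": rng.integers(18, 80, size=n),
    "sex": rng.choice(["F", "M"], size=n),
})

# Here the outcome is driven by age and sex only, so any raw gap between
# ethnic groups should largely vanish once those factors are controlled for.
logit = -3 + 0.04 * df["age"] + 0.5 * (df["sex"] == "M")
df["outcome"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Naive comparison: raw outcome rates by ethnic group.
print(df.groupby("ethnic_group")["outcome"].mean())

# Controlled comparison: logistic regression including age and sex.
model = smf.logit("outcome ~ C(ethnic_group) + age + C(sex)", data=df).fit()
print(model.summary())
```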

This sort of contextual analysis is something the Equality Hub is starting to do for the Ethnicity facts and figures website as part of action 8 in the Inclusive Britain report.

Promoting and encouraging use

I am now at the stage of:

  • promoting the standards
  • encouraging their use
  • thinking about how to understand their impact

This will help the Equality Hub meet our aim of ensuring more responsible reporting of ethnicity data.

I have emailed statistical Heads of Profession in government departments about the standards. I have been invited to speak to analysts in some departments, and I would like to do so in other departments.

OSR is helping us with understanding the use and impact of the standards. For example, they will:

  • develop guidance for reviewing data producer compliance with these standards when they carry out assessments.
  • review different data producers’ statistics in one or two years’ time to see how producers are responding to the standards.

If you’re in a public sector organisation and would like to discuss the standards, please contact me at richard.laux@cabinetoffice.gov.uk.

Remember how far you’ve come, not just how far you want to go

In our latest blog, Director General Ed Humpherson reflects on what OSR has achieved and our plans for the future…

At OSR, we are constantly striving to improve. We recently published our Business Plan for 2023/24, which outlines our ambitions to support the transformation of statistics, support improved communication of statistics, and build partnerships with other organisations that focus on the public value of statistics and data.

But as the saying goes, it’s important to acknowledge how far we’ve come, as well as where we want to be. So I’d like to begin by setting out what we do.

The Office for Statistics Regulation is the regulatory arm of the UK Statistics Authority. Our role is to promote and safeguard the production and publication of official statistics in the UK.

There are three elements to our role:

  • how statistics are produced – conducting reviews on the statistics produced by government bodies, including awarding the National Statistics designation;
  • how statistics are used – looking at how statistics are used in public debate; and
  • how statistics are valued – promoting the value of statistics.

We do all this underpinned by a coherent philosophy. For statistics to serve the public good, it is essential to recognise that they are not just numbers. They must be produced by trustworthy organisations. They must be high quality. And they must be valuable to users.

Our maturity model


Levels of regulatory maturity

When looking at where we are heading we use a regulatory maturity model.

At the lower end the regulator is operating at a static level, checking the compliance of individual statistics. This would be worthwhile, but wouldn’t be at all responsive to emerging user concerns.

At the next level of maturity, the regulator becomes more dynamic, responding to emerging issues from users. But it would be piecemeal.

To move beyond this and reach the next level, the regulator must become systemic – thinking about how to foster and support a system of statistics that is responsive to users.

The highest level of the maturity model goes beyond the statistical system, recognising that statistics serving the public good requires a whole culture of respect for evidence, data and analysis.

Where we are now

How mature are we, assessed against this model? We are certainly delivering our core regulatory programme – which shows we are meeting the basic level of maturity – doing a wide range of assessments and other compliance reports during the year.

We are also responsive to emerging user concerns – for example, about ONS’s excess deaths statistics; or the exam algorithms in 2020; or about the population estimates, a set of concerns that first arose around the city of Coventry.

But this is something we do only partially. In my view there is a way to go in anticipating these sorts of user concerns, and in becoming more skilled at doing deep dives into specific issues that are raising questions.

We are also increasingly systemic – addressing the ability of the system to meet user needs more widely; for example through our state of the statistics system report and through our campaign on intelligent transparency. And some of this gets into the wider space of a public culture of data and evidence use, for example our work on statistical literacy. We really should develop this further: it’s hugely important.

What people tell us

We continually ask for feedback, and as part of the recent UKSA mid-term strategy review, users completed a survey including questions about OSR. Stakeholders told us that:

– we should continue to do deep dives, but only if they’re done properly. The recent Sturgis review of our work (link) shows where we can improve in this regard.

– we should continue to challenge poor practice.

– we should increase our visibility, and champion effective communication – things we need to do more of.

– we should build partnerships with other organisations.

These all point to us needing to focus at the upper end of the maturity range – to be systemic and outward-focused.

OSR: Work in progress

So what does this all mean in terms of our development as a regulator?

In short, OSR has come a long way. But we are not at the top level of maturity yet. There is a lot we need to improve on – and that’s the intention of our business plan.

We’re keen to hear your views about our work and priorities. Please get in contact with us at regulation@statistics.gov.uk to let us know your thoughts, or if you would like a wider discussion with us.

Guest blog: The Algorithmic Transparency Recording Standard

In our latest guest blog, Joy Aston and Mark Durkee, from the Centre for Data Ethics & Innovation (CDEI, part of the newly-formed Department for Science, Innovation & Technology), discuss the Algorithmic Transparency Recording Standard (ATRS), which they have developed with the Central Digital and Data Office (CDDO) following commitments in the National Data Strategy, reinforced by findings from the OSR’s report on public confidence in statistical models published in 2021.

Ensuring that statistics are used responsibly has long been a key part of the remit of statisticians working within government. The approach to doing this has matured over the years, including via the introduction of the Office for Statistics Regulation (OSR), and the government’s statistics profession has a strong track record in delivering trustworthy official statistics.

However, in recent years, statistical models are finding new uses, often described as algorithms, AI or algorithmic tools. To help non-experts understand this work, we’re using ‘algorithmic tool’ as a deliberately broad term that covers different applications of AI and complex algorithms.

Used well, such algorithmic tools can offer significant benefits: saving time, supporting more effective decisions and risk assessments, and more. The increased use of complex statistical models in more operational ways has important implications for the role of statisticians. A badly designed, or badly understood, statistical model might have implications not only for broad policy-making, but also for specific decisions made about individuals. A growing field of data ethics has emerged to help manage some of these risks, and ensure that such models are developed and used responsibly.

The public has a right to explanation and information about how the government operates and makes decisions about them. This is to ensure people can understand actions taken, appeal decisions, and hold responsible decision-makers to account. 

Perhaps the most high-profile example of the need to secure public confidence in algorithms was the approach taken by the UK exam regulators in awarding exam grades in 2020. The subsequent OSR report on public confidence in statistical models, published in 2021, highlighted transparency as key to ensuring public confidence.

Though public confidence is a core reason for transparency, there are other benefits too:

  • Transparency can support innovation in organisations, helping senior leaders to engage with how their teams are using AI and enabling the sharing of best practice between organisations.
  • Transparency can help to improve engagement with the public, and facilitate greater accountability: enabling citizens to understand or, if necessary, challenge a decision.
  • Transparency tends to make things better; when we are open about things we tend to do them better, for example by paying more attention to broader data ethics issues, or in clearly articulating the limitations of a tool.

The OSR report recommended the creation of a comprehensive directory of guidance for Government bodies that are deploying algorithmic tools. Many other organisations have made similar calls, and commitments were made in both the National Data Strategy (2020) and National AI Strategy (2021). In response, the Centre for Data Ethics & Innovation (CDEI) and Central Digital and Data Office (CDDO) have worked together to develop the Algorithmic Transparency Recording Standard (ATRS).

This establishes a standardised way for public sector organisations to proactively and transparently publish information about how and why they are using algorithmic approaches in decision-making. This information is published on gov.uk to make it easily accessible to the public. It has been endorsed by the UK government Data Standards Authority, which recommends the standards, guidance and other resources government departments should follow when working on data projects.

The design and development of the Standard has been underpinned by extensive collaboration with public sector, industry and academic stakeholders as well as citizen engagement. It is informed by a public engagement study run by the CDEI and BritainThinks and has been piloted with a variety of public sector organisations across the UK, including the Information Commissioner’s Office, the Food Standards Agency, police forces and more. Pilot partners have noted several benefits internally, including how it encourages different parts of an organisation to work together and consider ethical aspects from a range of perspectives.

The use of the term ‘algorithmic tool’ is partly driven by this public engagement (it is a term more readily understood by the general public than alternatives such as ‘statistical model’), but it also seeks to emphasise the standard’s focus on how tools are embedded in operational processes. Understanding how the underlying statistical model(s) were created and tested is an important part of this, but so is a clear understanding of how a tool is being used in practice, for example by front-line operational staff.
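
To give a flavour of the kind of information a record brings together, here is a purely illustrative sketch in Python. The field names are our own shorthand, not the actual ATRS template – the real template is published on gov.uk.

```python
# Purely illustrative: these field names are our own shorthand, NOT the
# actual ATRS template, which is published on gov.uk.
record = {
    "tool_name": "Example casework triage tool",   # hypothetical tool
    "organisation": "Example public body",
    "purpose": "Prioritise incoming cases for human review",
    "how_used_in_practice": (
        "Front-line staff see a priority score alongside each case "
        "and make the final decision themselves"
    ),
    "model": {
        "type": "Gradient-boosted classifier",      # illustrative only
        "training_data": "Closed cases, 2018-2022",
        "testing": "Held-out sample; performance reviewed quarterly",
    },
    "human_oversight": "No fully automated decisions; appeal route exists",
    "known_limitations": "Less accurate for rare case types",
}

# Print the record in a readable form.
for field, value in record.items():
    print(f"{field}: {value}")
```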

We have developed, published, and piloted the Standard, but there is still more work we want to do to support and embed its use. We are currently working on developing an online repository to host records in a searchable and accessible way, along with continuing to encourage its uptake. In parallel, we are actively working directly with teams to complete records, offering support, feedback and guidance.

We would encourage anyone working with, or developing, tools that they feel are in scope of the Standard to get in touch with us at algorithmic.transparency@cdei.gov.uk.

Why does analytical leadership matter? What we’ve found so far 

In our latest blog, Regulator Nicky Pearce discusses our work on analytical leadership so far and the next steps.

Why are we looking into analytical leadership? 

In OSR, we believe that strong analysis can inform decisions that help improve the lives of citizens. We saw first-hand how data and analysis entered the spotlight during the COVID-19 pandemic and the benefits that came from this, including cross-department collaboration, producing analysis quickly to respond to emerging issues, and using analysis to inform a wide range of policy and operational decisions. We would like to see analysis stay in that spotlight by applying learning from these successes to other key societal issues, for example climate change, levelling up and increasing living costs.

I joined the OSR in February last year and soon after that I became part of the Analytical Leadership review team, which is exploring ways in which analytical leadership might be strengthened across Government. Having previously worked on multiple different statistical outputs across the Office for National Statistics and the Office for Students, I was very interested to see how the OSR’s previous work on statistical leadership would extend to analytical leadership.

You may be asking yourself “But you regulate statistics, why are you looking at analysis?” This is a fair question and one that I asked myself when I first joined the team. The answer is quite simple. Although within government we differentiate between statistics, data and analysis, to the outside user there is often no difference. Statistics and analysis are intertwined; strong statistics support strong analysis and both require effective communication to support decision makers as well as the public. The standards for government statistics are set out in our Code of Practice for Statistics, and we believe that the pillars of the Code (Trustworthiness, Quality, and Value) can be applied more widely, helping to support analysis in the way that it’s produced, used and valued, and to realise wider benefits for the public. In addition, these principles are a helpful framework for analytical leaders to draw on.  


What have we been up to? 

Since we announced the launch of this work last May, we have reached out to people across Government and beyond in each of the four nations. To date, we have spoken to over 50 people across more than 30 government departments, including Directors, Heads of Profession, Economists, Social Researchers, and Policy colleagues. We have also spoken to several people outside of Government, including academics. These conversations have centred on how analysis is produced, used and valued, and it’s been fascinating to hear everyone’s views on this topic, especially where those views have challenged my own perception of what analysis in government looks like and the common barriers it faces.

We would like to thank everyone who has taken the time to share their thoughts with us; they have been instrumental in shaping this work.

While it’s been great to hear so many diverse views, these conversations were not always without challenge. We very quickly realised that the language we were using around this work was very important, for example, people often had different interpretations of the term ‘analytical leadership’ and how it related to their work. Some people questioned our role in this area and others were concerned about how our work linked with other activities being carried out in this space, including work by the Government Analysis Function. However, through discussion, we have been able to provide assurance that we are looking to support and champion analytical work across Government: indeed, we hope to draw on and highlight the work of others to do this more effectively.   


What are the emerging themes? 

Four high-level themes have really stood out to us, and highlight the importance of:  

  • Senior leadership and culture

Often, we think of leadership as being a top-down effect with senior leaders setting the standard, which is then followed by everyone else in their departments. Whilst it is true that senior leaders are incredibly important in setting the direction and the analytical culture of a department, the reality is much more complex than this. Firstly, senior analysts and senior non-analysts have separate roles to play in supporting a strong analysis culture. Secondly, good analytical leadership encompasses a wide range of things, including ensuring that analysts get a seat at the right tables and have channels to advocate their work. It also needs non-analysts at all levels, including ministers and permanent secretaries, to actively seek out analysis to inform new policy areas or key operational decisions from day one, and be as transparent as possible about the evidence that they are using.  

  • Effective Collaboration

Collaboration comes in many forms and occurs at all different levels, within and outside of government. We heard lots of great examples of analytical work that has been produced through strong collaboration, and of the benefits it can have. For example, four-nations cross-working groups on particular topics allow information and best practice to be shared across the UK, and allow informed decisions to be made quickly when needed – something that proved especially valuable during COVID-19. Other examples include collaboration with external research organisations such as Administrative Data Research UK, Research Development Scotland, and the Wales Institute of Social and Economic Research, Data and Methods. We also heard about some of the barriers that people feel prevent more collaboration from occurring, including resource constraints – but crucially, that huge gains can be made if these are overcome, in terms of more efficient and effective policy and decision making, and new ground-breaking insights or innovations.

  • Capability and skills

To have good analysis it is important to get the right people with the right skills in the right job. To do this, it is key that analysts stay up to date with innovative technologies and have the time to develop new skills. It is also key that analysts can communicate the key messages of their work as well as relevant caveats effectively to non-analysts, so that analysis can effectively inform policy and decision making. It is also important that analysts have access to wider analytical networks so that they are supported in their current roles as well as their longer-term career. There is also a need for non-analyst senior leaders and policy colleagues to have a good understanding of the value that analysts can offer, so that they draw on them as effectively as possible. Senior non-analysts also have a vital role in championing the importance of good analytical evidence in government and investing in the future analytical capability of their departments.  

  • Governance and standards

It is important that there are clear standards for analysts in Government and that analysts have access to the guidance that they need. We discussed the varying approaches to governance and the structure of analytical staff with each of the Devolved Administrations to understand their unique approaches. We have heard about lots of positive work being carried out by several different parts of Government in this area, including the Government Analysis Function, the Government Statistical Service and the ONS Data Science Campus. We have also been thinking about the role of OSR in supporting and championing good analysis too.

This is just a taster of what we have heard. Our future outputs will explore these themes in greater detail, to highlight the important role that strong analysis plays in informing government decisions and improving the lives of UK citizens.


What are we doing next? 

Through our discussions we have heard numerous examples of good practice (almost too many to count!), as well as ideas and suggestions for possible areas for improvements. We will be getting back in contact with individual teams to discuss the finer details of the good practice examples and how we can help promote their work through case studies, blogs or other communications. Through sharing these case studies, we hope to enable other leaders, analysts and organisations to break down barriers to effective analytical leadership and to take positive action.  

Our current plan is to divide our findings into several smaller reports based on the high-level themes that we have identified. The first of these, on senior leadership and culture, will be published in summer 2023.

We will also use the insights we’ve gathered to plan targeted engagement with specific audiences, and to enhance our regulatory work and other upcoming reports, such as our 2022/23 State of the Statistical System report. 


Can I still be involved?

While we have come to the end of our formal engagement period, we are more than happy to hear from you if you have any thoughts on our work or have a great case study of analytical leadership in action that you want to share. Please get in touch via: regulation@statistics.gov.uk 

There’s more to statistical communication than avoiding truncated axes…

In our latest blog, DG for Regulation Ed Humpherson and Head of Casework Elise Baseley talk about the importance of communication to make statistics accessible and meaningful.

There’s more to statistical communication than avoiding truncated axes. It’s about making statistics accessible, meaningful, perhaps even enjoyable, for people.

At the Office for Statistics Regulation, we’ve recently commented on data visualisations that are misleading, or potentially misleading. Our role in calling out these problems remains important. But we’ve recognised the need to go much deeper into what makes communication helpful and accessible.

Communication of statistics, focusing on accessibility, was a big theme of the recent UKSA strategy midpoint event that I participated in a couple of weeks ago. The event was great and had some brilliant speakers including: Laura Gilbert, head of No 10 Downing Street’s data science team; Ming Tang, chief data officer for NHS England; and Tim Harford, the journalist and broadcaster. In different ways, they all emphasised the need to make statistics relevant and accessible for people – both people who are users or potential users of statistics, and for people who want a career in statistics and data.

The event was summed up very neatly in Tim Harford’s advice to “think like a 14 year old”, reflecting on the success of the Australian Covid dashboard. Tim was highlighting that the Australian version of the Covid dashboard turned out to be developed by three 14-year-olds. Apparently, they outed themselves once they were vaccinated and appeared in the data. The three teenagers had been experimenting with displaying the data and it turned out to be a massive success as it was communicated so simply.

While the event gave me a renewed enthusiasm for the importance of statistical communication, I found it hard to escape the feeling that OSR could be doing more to drive improvement in this space. Statistics must be communicated in a way that is easy to understand. The teams who produce statistics should be willing to explain what they mean – and what they don’t mean: recognition of uncertainty is so important. And as the work of the Winton Centre at the University of Cambridge has shown, being honest about uncertainty doesn’t seem to damage trustworthiness at all.

As I said at the event, we can’t do it alone: it will require us to build partnerships with a wide range of other organisations who care about the public’s engagement with statistics.

Of course, we do a lot of work already on this. I’ve already mentioned our interventions on weaknesses in the use of visualisations. These are important issues and are recognised as such by communicators in Government. After we’ve intervened, the relevant teams have recognised the issues and put in place new practices to prevent their recurrence. We’re also working with the Analysis Function to reiterate good practice.

But we also know that these interventions are not enough. Instead of intervening when risky visualisations are produced, we need to be much more upstream: understanding what supports public understanding, how statistics should be communicated, and how to identify what matters to a wide range of public audiences.

This is where our review of statistical literacy comes in. We commissioned it because we get frustrated by the attitude that the problem with statistics is that the public doesn’t have the expertise to understand them. This always strikes us as being a bit exclusive, perhaps even arrogant. And maybe not even true – there is evidence from the pandemic that people are both interested in and well equipped to understand well-presented statistics. Instead, then, of focusing on weaknesses in other people’s statistical literacy, our review says producers should regard it as, above all, a matter of communication.

So: this will be one of our key priorities for the year that’s coming. We know there’s more to supporting good statistical communication than stamping out truncated axes. As Tim Harford’s example of the Australian Covid dashboard demonstrates, sometimes we just need to approach the problem through the eyes of an inquiring citizen.

International Women’s Day 2023

To celebrate International Women’s Day 2023, we asked some of the OSR team to reflect on what International Women’s Day means to them, and their work in the world of statistics.


Marianthi Dunn – Principal Economist, Statistics Regulator, Economy and Business, Industry, Trade and International Development 

For me, International Women’s Day is a wonderful opportunity to celebrate the strength, influence and impact that women have on our economy and everyday society. It is a great celebration of how far we have travelled down the path of equalities and of the road we will travel ahead.

My role as an economist and regulator is to support producers of economic statistics in producing coherent and reliable numbers that help policy-makers make well-informed decisions for the greater good of our society.

We have all felt the rapid and dynamic economic changes that have transformed the way we work, the way we consume, the way we access basic services like health, education, even the way we live our daily lives. As a regulator, it is a privilege to be working at the heart of all this change, supporting producers of economic statistics in maintaining trustworthiness, quality and value.   

I am also a mother; my role as a parent is to be a responsible ancestor. I recognise how important these key statistics are in the decisions that we make for sustaining our “wealth” and “real value” for our future generations.

This year’s IWD theme is DigitALL: Innovation and technology for gender equality. Finding new and innovative ways to collect and analyse information about how our economy is rapidly changing, particularly in light of new technology and innovation, can be quite a challenge. Digitalisation is still in its infancy, yet it is having a significant impact on core economic decisions and our daily lives. Understanding evolving user requirements is also challenging during a time of economic uncertainty and rapid digital transformation.

Working closely with users and producers of these economic statistics has never been more important for embedding trust in the messages they are giving us about the public good.


Kirsty Garratt – Casework Lead and Head of Private Office

To me, IWD is an opportunity to celebrate the many successes women have around the world and reflect on the opportunities that are available to you should you wish to step outside of your comfort zone. Success and progress can be hard, and scary, but without those who pushed the boundaries and fought for what they wanted in life, we still wouldn’t be able to vote.

I work as a Casework Lead and I manage the Director General’s Private Office. I make sure that our casework function (we look into concerns about how official statistics are produced or used) runs as smoothly as possible, and that our Director General, Ed Humpherson, has what he needs to enable him to get to where he needs to go and make the decisions he needs to make.

If you had asked me before I worked for OSR if statistics mattered, I would probably have said yes, sure… but they don’t affect me. I didn’t realise how wrong I was until I joined OSR. I had never thought I was a “user” of statistics. In my previous roles I’ve used the latest RPI figures to help me manage a contract I was working on, or during the pandemic I was obviously interested in the Covid data for my area – but that was it. Now I am in my role I can see how statistics have an impact on every aspect of my life, and I find that mind-blowing. I am so glad that I work for an organisation that fights for statistics that really do serve the public good. To know that there is an organisation that not only sets the standards with the Code of Practice for Statistics, but also holds producers and the government to account for how statistics are produced and used, is really important to me.


Nicky Pearce – Statistics Regulator, Children, Education and Skills 

Sometimes it can be difficult to make the time to step back and look at our role as women in our society, both locally and globally. International Women’s Day creates space for us to do this and is able to bring important issues to the forefront of our discussions. I believe that we can all benefit by lifting up and championing women and that International Women’s Day creates a strong platform for this.

I recently started reading ‘Invisible women: exposing data bias in a world designed for men’ by Caroline Criado Perez and I was struck by the concept of a gender data gap. We are all familiar with gender gaps in other areas of our lives, most notably, the gender pay gap, but it wasn’t until I read this book that I realised how many gaps or ‘silences’ as Caroline refers to them, there are in the data we hold on women.  

Not only do these data gaps impact the decisions that are being made by women, or for women, but they also impact the algorithms and artificial intelligence that are based on these datasets. For example, an artificial intelligence program that is designed to diagnose a given disorder but has only been trained on a data set of symptoms experienced by men will not be as accurate in identifying the disorder in women. As noted by UN Women: “A global analysis of 133 AI systems across industries found that 44.2% demonstrate gender bias”.
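
The mechanism is easy to demonstrate. Below is a toy, fully synthetic sketch in Python – not a real diagnostic system – in which the disorder presents through a different symptom in women than in men, and a classifier is trained on male patients only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_patients(n, sex):
    """Synthetic patients: the disorder presents differently by sex."""
    has_disorder = rng.random(n) < 0.5
    # Symptom 0 is informative for men, symptom 1 for women (invented).
    symptoms = rng.normal(size=(n, 2))
    col = 0 if sex == "M" else 1
    symptoms[:, col] += 2.0 * has_disorder
    return symptoms, has_disorder.astype(int)

# Train on men only -- the "gender data gap".
X_train, y_train = make_patients(2000, "M")
model = LogisticRegression().fit(X_train, y_train)

# Evaluate separately on men and women.
for sex in ["M", "F"]:
    X_test, y_test = make_patients(2000, sex)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"accuracy on {sex}: {acc:.2f}")
# Typical result: high accuracy for men, near-chance (~0.5) for women,
# because the model never saw how the disorder presents in women.
```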

In order to make the digital world more inclusive and more equitable we have to be able to identify and work towards addressing these data gaps. In an ever-evolving world, we must ensure that the data we hold are as representative as possible so that the new technologies that are built on them are as well.  

Statistics are one of the hidden building blocks of the way we function as a society. They feed into our lives and the decisions we make, or are made for us, in more ways than we can ever imagine. Without good statistics we cannot make good decisions. Because of this, it is vital that organisations across Government and beyond, produce good quality statistics in a way that is transparent and maximises the public good that can come from them.   


Emma Harrison – Statistics Regulator, Population and Society & Children, Education and Skills  

For me, International Women’s Day is a way of celebrating the things that women have done and will do in the future. It brings women together and gives us the opportunity to discuss the challenges we face as a result of our gender, and also the positives.

I work with social statistics and for me, it is important to be mindful that every data point is a piece of information about a person or population. That’s what I think the public good of statistics is: using data that comes from the public for the public.  

I love that my job is varied and current. When starting the week, I have a pretty good idea of what I’ll be doing, but there will always be something new. It might be a new piece of casework, an email from a statistics producer or an enquiry from a user of statistics. These are often linked with current affairs and I enjoy seeing parallels between my own work and coverage in the media or on Twitter.

 I think we can make the digital world safer, more inclusive and more equitable through challenging disinformation. Through our casework processes, we often come into contact with statistics that are either misinterpreted or misrepresented. By challenging these accounts, we can create a safer digital space.   

I think we also have a role here in making the digital world more equitable. Acknowledging intersectionality is key to the success of International Women’s Day and advocating for the availability of data in areas such as ethnicity, sexuality and disability is vital.  

If you’d like to find out more about our work, sign up for our monthly newsletter or follow us on Twitter or LinkedIn for regular updates.