What 2023 means for OSR

In our latest blog, Director General Ed Humpherson reflects on the past year and sets out OSR’s priorities for the coming year. We are inviting feedback on these in order to develop our final 2023/24 Business Plan. If you have views on what the key issues in statistics are, please email regulation@statistics.gov.uk.

As the days shorten and the end of the year looms, in OSR we start turning our attention to the next financial year, which begins in April 2023: what we will focus on, what we want to achieve, and what our ambitions are.

This is a reflective and deliberative process: we don’t finalise our business plan for 2023/24 until April itself. And it’s also a consultative process: we want to hear from stakeholders about where you think we should focus our attention.

How we develop our priorities

A range of inputs feeds into our thinking. We start from our five-year strategic plan, which sets out broadly where we want the statistical system to be by 2025, and how we will help the system get there. We form a judgement as to how close the system is to achieving this vision, and what we should focus on to help support it.

We also draw on our annual state of the statistical system report (most recently published in July): what it tells us about the positive things in statistics we want to nurture and the challenges we want to address. And we take stock of what we’ve achieved over the last 12 months: whether, in effect, we’ve successfully delivered last year’s priorities.

But there are two further aspects to our business plan that are crucial. First, we can’t do everything. We are a small organisation, and necessarily we have to make prioritisation choices about where we can most effectively support the public good of statistics. And second, we can’t plan for everything. We have to respond to those areas that are salient in public debate, where statistics are helping or hindering public understanding. And we can’t always predict very well what those issues might be. To take an obvious example, our planning for 2020-21, taking place around this time three years ago, did not anticipate that we’d spend most of 2020-21 looking at statistics and data related to the pandemic. 

To help us make these choices, and to be better at anticipating what might be just over the horizon, we would like the input, advice and creativity of our stakeholders: of people who care about statistics and data, and who want to help us be as effective as we can be. 

Our 2023/24 priorities

Today I am pleased to share with you our draft priorities for 2023/24. These are deliberately high level. We have not chosen our menu, selected our ingredients, and already got the cake ready for the oven. We want to leave lots of room for people outside OSR to make suggestions and raise questions. 

Our plan for next year has four high level priorities, all focused on different aspects of supporting change and transformation: 

  • Support and challenge producers to innovate, collaborate and build resilience 
  • Champion the effective communication of statistics to support society’s key information needs 
  • Continue to widen our reach beyond official statistics and official statistics producers 
  • Increase our capability as a regulator 

The keen observers among you might note that these are an evolution of last year’s priorities, rather than a wholesale change. We have had a good year, for sure; but, as always, we will strive to do more.

Please get in contact with us at regulation@statistics.gov.uk to let us know your thoughts and questions about these priorities, or if you would like a wider discussion with us.

Weights and measures: how consultations relate to OSR’s role

In our latest blog, Director General Ed Humpherson responds to concerns raised with OSR regarding the recent consultation by the Department for Business, Energy & Industrial Strategy, Choice on units of measurement: markings and sales.

This blog was amended on 4 October 2022 to provide further clarity about question design.

Since the start of the Government’s consultation on weights and measures, a lot of people have raised concerns about it with us at OSR. At the time of writing, we have received over 150 emails, and many others have tagged us on Twitter, about the Department for Business, Energy & Industrial Strategy (BEIS) consultation, Choice on units of measurement: markings and sales, which closed on 26 August 2022.

Before considering the specific question that has been raised with us, let’s first set out some background to how consultations relate to our role.

Consultations, done well, are a vital tool for government in developing policy. They can provide specific and qualitative insight to complement evidence from other sources like official statistics. They also help Government to understand different perspectives, foresee possible consequences and gather expert advice on implementation.

Our remit focuses on statistics. We set the standards for how Government collects and uses statistics. Consultations are relevant to our work in two senses. First, consultations can often draw on statistics, for example to illustrate the scale of an issue a policy intends to address. Second, consultations can create statistical evidence – for example, the number of people responding to a particular question.

Turning to this particular consultation, the aim was to identify how the Government can give more choice to businesses and consumers over the units of measurement they use for trade, while ensuring that measurement information remains accurate.

However, when the consultation was launched, many stakeholders raised concerns about the consultation questions, and in particular question 3a, which read as follows:

If you had a choice, would you want to purchase items (i) in imperial units? or (ii) in imperial units alongside a metric equivalent.

There was no option to select an outcome without imperial units at all. People felt that this was a potentially biased way of collecting information on public views on changes to measurement.

Given the concerns raised with us, we approached BEIS as the Department conducting the consultation. We wanted to understand the reasons for this question. They advised us that UK law currently requires metric units to be used for all trade purposes, with only limited exceptions, and that the purpose of the consultation was to identify how the Government could give greater choice to businesses and consumers over the units of measurement they use to buy and sell products. BEIS also said that respondents had multiple opportunities to give a range of views through the consultation, and that all responses would be carefully considered.

This explanation was helpful. But it does not make this a good question design, because the question does not offer a complete range of responses. Including a ‘none of the above’ option, for example, would have given respondents the opportunity to express a preference for a unit of measurement other than imperial, or imperial with metric. In the absence of such an option, the design in effect makes this an overly leading question.

So, what happens next? We would be surprised to see the evidence drawn from this specific question being used in any kind of formal quantitative way. If the results of this poorly designed question were presented in statistical terms (x% of people said… etc.), that would represent the generation of a statistical claim from a consultation. And in our view it would be potentially quite misleading as a representation of levels of support for any proposed policy change.
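There is a deeper statistical reason for caution here, beyond question design: consultation respondents select themselves. As a minimal sketch (using entirely made-up numbers, not anything drawn from this consultation), the following shows how self-selection can make “x% of respondents” diverge sharply from actual population support:

```python
import random

# Hypothetical illustration only: why "x% of respondents said..." from a
# consultation is not an estimate of population support.
random.seed(42)

POPULATION = 100_000
TRUE_SUPPORT = 0.20            # assume 20% of the population favour a change

# Self-selection: assume those who favour the change feel strongly about it
# and are ten times more likely to respond than everyone else.
RESPONSE_RATE_SUPPORTER = 0.10
RESPONSE_RATE_OTHER = 0.01

responses = []
for _ in range(POPULATION):
    supporter = random.random() < TRUE_SUPPORT
    rate = RESPONSE_RATE_SUPPORTER if supporter else RESPONSE_RATE_OTHER
    if random.random() < rate:  # most people never respond at all
        responses.append(supporter)

share = sum(responses) / len(responses)
print(f"True population support:   {TRUE_SUPPORT:.0%}")  # 20%
print(f"Support among respondents: {share:.0%}")          # roughly 70%
```

Even with every respondent answering honestly, the headline percentage reflects who chose to respond, not what the population thinks – which is why presenting consultation responses as formal quantitative evidence of support would mislead.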

How OSR secures change

Why do organisations do the things they do? Strip away all the language of business plans and objectives and strategies, and what it often boils down to is wanting to achieve some kind of positive impact.

That is important to remember as we launch our latest business plan, for 2022/23. In this blog, rather than highlight specific outputs and priorities, I want to talk more generally about how OSR, as a regulator, secures positive change.

By change, we mean regulatory work or interventions that ensure or enhance statistics serving the public good. There are basically two ways in which our work leads to statistics serving the public good. Our work can:

  • secure a positive change in the way statistics are produced and/or presented; and/or
  • make a direct contribution to public confidence and understanding.

OSR clearly does secure impact. In response to our reports, producers of statistics make changes and improvements to their statistics and other data. Statistics producers also use the presence of OSR in internal debates as a way of arguing for (or against) changes – so that OSR casts a protective shadow around analytical work. OSR can also secure changes to how Ministers and others present data. And OSR also achieves impact through getting departments to publish previously unavailable data. In all these ways, then, OSR secures impact, in the sense of statistics serving the public good to a greater extent.

In terms of formal statutory powers, the main lever is the power to confer the National Statistics designation, which is in effect a way of signalling regulatory judgement. The regulatory role is to assess: to review statistics, form a judgement and, on the basis of that judgement, decide whether the National Statistics designation is awarded. We have recently been reviewing the National Statistics designation itself.

A further power in the Statistics and Registration Service Act is the power to report our opinion. Under section 8 of the Act, we are expected to monitor the production and publication of official statistics and report any concerns about the quality of any official statistics, good practice in relation to any official statistics, or the comprehensiveness of any official statistics.

These statutory powers do not create influence and drive change by themselves. We need to be effective in how we wield them. We have to supplement them with a powerful vision, good judgement, and effective communication.

The power of ideas

The most significant source of influence and impact is the power of the ideas that underpin the Code of Practice for Statistics. The Code is built on the idea that statistics should command public confidence. It is not enough for them to be good numbers: collected well, appropriately calculated. They must have attributes of trustworthiness, quality and value.

The power of these ideas comes from two sources. First, they are coherent, and in the Code of Practice they are broken down into a series of increasingly granular components – so the ideas are easy for producers to engage with and implement. Second, they have enormous normative power – in other words, trustworthiness, quality and value represent norms that statisticians and senior staff want to be seen to adhere to, and that wider users want to see upheld.

These powerful, compelling ideas represent, then, something that people want to buy into and participate in. A huge amount of OSR’s impact happens when OSR is not even directly involved – through the day-to-day work of statisticians seeking to live up to these ideas and the vision of public good that they embody.

Judgements

OSR’s work begins with the ideas embodied in the Code, which we advocate energetically, including through our crucial policy and standards function. The core work of OSR consists of making judgements about trustworthiness, quality and value, in multiple ways:

  • our assessments of individual sets of statistics, where we review statistics, form judgements, and present those judgements and associated requirements to producers and then publicly – either through in-depth assessment reports, or by quicker-turnaround reviews of compliance;
  • our systemic reviews, which address issues that cut across broader groups of statistics, and which often focus on how well statistics provide public value, including highlighting gaps in meeting user needs;
  • our casework, where we make judgements about the role statistics play in public debate – whether there are issues with how they are used, or how they have been produced, which impact on public debate; and
  • our broader influencing work, including our policy work and our research and insight work streams.

These judgements are crucial. Our ability to move fluidly using different combinations of our regulatory tools is important to securing impact. It allows us to follow up where the most material change is required and extend our pressure – and support – for change.

We are able to make these judgements primarily through the capability of OSR’s people, and we depend on their insight, analysis, judgement and ability to manage a range of external relationships.

Communication and reach

It is not enough for us to make good judgements. We need to make sure that any actions are implemented – in effect, that our judgements radiate out into the world and lead to change.

There are three main channels for achieving this reach:

  • Relationships with producers: our relations with producers are crucial. Heads of Profession and the lead statisticians on individual outputs are the key people for making improvements; their buy-in is crucial.
  • Voice and visibility: having a public voice magnifies our impact. It ensures that policymakers are aware of what we do and understand that our interventions can generate media impact.
  • Wider partnerships: while our direct relationships with producers and our public voice can often create sufficient leverage for change, we also draw on wider partnerships. For example, credible external bodies like the RSS and Full Fact can endorse and promote our messages – so that producers face a coalition of actors, including OSR, pushing for change.

And we put a lot of emphasis on the views, experiences and perspectives of users of statistics. Almost all our work involves engaging with users, finding out what they think, and seeking to ensure producers focus on their needs.

In that spirit, we’d be very keen to get reactions on our own business plan – from all types of users of statistics, and also from statistics producers.

Conclusion 

Business plans should not simply be lists of tasks. It is also important to be clear on how an organisation delivers – how individual projects and priorities help achieve a positive impact. In OSR’s case, achieving this impact involves the power of ideas, good judgement and effective reach.

With this clarity around impact, our business plan (and work programme) comes to life: more than just a set of projects, it’s a statement of ambition, a statement of change.

But the business plan is also not set in stone. We are flexible and willing to adapt to emerging issues. So if there are other areas where we should focus, or other ways we can make a positive difference, we’d really welcome your feedback.

Acting against the misuse of statistics is an international challenge

That was the message of the event on 14 March hosted by the UK Statistics Authority on the prevention of misuse of statistics, which formed part of a wider campaign to celebrate the 30th anniversary of the UN Fundamental Principles of Official Statistics.

My fellow panellists, Dominik Rozkrut and Steven Vale, and I discussed a range of topics, from addressing statistical literacy to best practice in regulation, and from memorable examples of misuse to the cultural differences that affect public trust internationally. Although we all had different experiences and approaches, it was clear that there was a common passion for truth and for statistics that serve the public good.

The event had all the merits of online meetings that we’ve all become familiar with: lots of people able to join from a wide range of locations, and lots of opportunities for people to engage using live chat functions.

Perhaps some people find it less intimidating to type a question into an app than to raise their hand in a crowded room: there were lots of interesting questions asked, and it was clear that the issue of preventing misuse of statistics generated a lot of interest and passion from the audience as well as the panellists.

But the event also brought a new kind of frustration for me as a speaker: there were too many questions to answer in the time available, and I felt bad that we couldn’t answer all the questions that people typed in.

So, in an attempt to rectify that, I’ve decided to use this blog to address the questions that were directly for me that I didn’t answer in real time, and those which touched on the key themes that came across during the Q&A.


“Who are the best allies in preventing misuse or building statistical literacy outside of stats offices? Are there any surprising allies?”

There are obvious allies for us, like Full Fact and the Royal Statistical Society.

I’d also like to give a shout-out to the work of Sense about Science. Their work highlights that there is a huge amount of interest in evidence, data and statistics – and that a simple model of experts versus “the public” is far too simplistic.

There is a huge range of people who engage with evidence: teachers, community groups, people who represent patients, and many others. These people, who want to find out the best evidence for their community, are fantastic allies.

And I’d also pick out a surprising ally: politicians. In our experience, politicians are almost always motivated to get it right, and not to misuse statistics, and they understand why we make the interventions we make. So perhaps they are the ally that would most surprise people who look at our work.

“How important is statistical literacy among the media and general public in helping prevent the misuse of statistics?”

I think critical thinking skills are important. People should feel confident in the statistics that are published, but also feel confident that they know where to find the answers to any questions they have about them.

But equally, we need statistics producers to be better at communicating things like uncertainty in a way that is meaningful for the public.

So rather than putting the responsibility of understanding solely on the user, and just talking about statistical literacy, let’s also talk about producers’ understanding – or literacy if you will – about public communication.

“You have mentioned that sometimes the stats get misinterpreted because of the way they are presented – can you share some examples?”

My point here was that misinterpretation is a consequence of what producers of statistics do. One example we’ve seen frequently during the pandemic concerns data on the impact of vaccines. It’s been the case that sometimes people have picked out individual numbers produced by public health bodies and highlighted them to argue their case about vaccines. Producers need to be alive to this risk and be more willing to caveat or re-present data to avoid this kind of misinterpretation.

“What are your views on framing statistics for example 5% mortality rate vs 95% survival rate? Both are correct but could be interpreted very differently.”

I find it impossible to answer this question without context, sorry! I definitely wouldn’t say that, as an absolute rule, one is right and the other is wrong. It depends on the question the statistician is seeking to inform. I can’t be more specific than that in this instance.
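Arithmetically, of course, the two framings describe exactly the same number – a one-line check:

```latex
P(\text{survival}) = 1 - P(\text{mortality}) = 1 - 0.05 = 0.95
```

The difference is purely one of presentation, which is why context and the question at hand have to drive the choice.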

However, to avoid possible misinterpretation, we always recommend that producers use simple presentation, with clear communication about what the numbers do and do not say.

“How do we balance freedom of expression with the need to prevent the abuse and misuse of statistics?”

We don’t ban or prohibit people from using statistics, so in that sense there’s no barrier to freedom of expression. But we do want to protect the appropriate interpretation of statistics – so our interventions are always focused on clarifying what the statistics do and don’t say, and asking others to recognise and respect this. It’s certainly not about constraining anyone’s ability to express themselves.

“What’s the most damaging example of misuse of statistics that you’ve come across in your career?”

Here I don’t want to give a single example, but rather a type of misuse which really frustrates us. It’s when a single figure is used as a piece of number theatre, but the underlying dataset from which that figure is drawn is not available, so it’s not possible for the public to understand what sits behind the apparently impressive number. It happens a lot, and we are running a campaign, which we call Intelligent Transparency, to address it.

“Can you give us some more insight into how you steer clear of politics, media frenzies, and personalities?”

We always seek to make our interventions about clarifying the statistics, not about the arguments or policy debates that the statistics relate to. So we step in and say, “this is what the statistics actually say”, and then we step out. And we don’t tour the news studios trying to make a big name for ourselves. It’s not our job to get media attention. We want the attention to be on what the statistics actually say.


I hope these answers are helpful and add some context to the work we do to challenge the misuse of statistics. I also hope everyone reading this will follow the next series of events on the UN Fundamental Principles of Official Statistics.

The next round of events is moving on from preventing misuse, to focusing on the importance of using appropriate sources for statistics. Find out more about them on the UNECE website.

Why I love evaluation

In our latest blog, Director General Ed Humpherson looks at the importance of government evaluation in light of the Evaluation Task Force ‘Policy that Works’ conference.

There are lots of good reasons to love evaluation. It provides evidence of what works; it supports good policy; it builds the skills and reputation of analysts; it helps scrutiny. 

But I love evaluation for another reason too. Evaluation, done well, is fundamental to changing the way government in the UK works. 

This week the Evaluation Task Force is running one of the biggest ever conferences on government evaluation. So it’s a great time to celebrate evaluation, and my reasons for loving it.  

In a recent speech, Bronwen Maddox, the Institute for Government’s Director, set out a compelling case for why government needs to change. She highlighted the challenge of rotation: officials moving on for career advancement before they build grounded expertise in their subject matter. She talked of a lack of accountability. And she said these things combined to create an air of “unreality” in the way Government approached key challenges.

In some ways this critique is exaggerated. There are lots of examples of officials with grounded expertise, taking responsibility for their decisions and implementation, and understanding the realities of the policy problems they are addressing. But there are enough cases where the critique is fair for us all to take it seriously. I saw it when I was looking at the value for money of Government programmes when I was at the National Audit Office. And I see it now in our work at the Office for Statistics Regulation. 

Evaluation is for me a great antidote to these problems. By committing to evaluation, as it is doing through the Evaluation Task Force, Government is investing in corporate memory: it lays the groundwork for a commitment to gathering and retaining evidence of what works and, crucially, how and why it works. By committing to evaluation, Government is creating an intelligent form of accountability – not the theatre of blame and finger-pointing, but a clear-headed consideration of what has actually happened as policy is implemented. And through a relentless focus on real-world evidence, evaluation combats the air of unreality that Maddox describes.

It aligns with a lot of what we champion at the Office for Statistics Regulation. We have emphasised the importance of analytical leadership in Government – how Government and society benefit when the analytical function is not simply a passive provider of numbers, but a key partner in the work of Government. And this requires the analytical function to be full of leaders who can articulate the benefits of analysis and make it relevant and useful – not just to policymakers but to a wider public.

And we champion public confidence in data produced by Government. This public confidence is secured by focusing on trustworthiness, quality and value. 

Analytical leadership, and the triumvirate of trustworthiness, quality and value, are central to securing the benefits of evaluation. Analytical leadership matters because to do great evaluation requires clarity of vision, strong objectives, and long-term organisational commitment. 

And trustworthiness, quality and value are central to good evaluation: 

  • Trustworthiness is having confidence in the people and organisations that produce statistics and data – being open to learning what works and what doesn’t, and open about the use of all evaluation, giving advance notice about plans and sharing the findings. The commitments to transparency that the Evaluation Task Force is making are crucial in this regard.
  • Quality means data and methods that produce assured statistics – it ensures the evaluation question is well defined and the right data and robust methods are selected, tested, and explained.
  • Value supports society’s needs for information – it means evaluation can be impactful, addressing the right questions and ensuring the correct understanding of the evidence.

Of course, I don’t claim for a second that evaluation is a panacea for the challenges of government. That too would be an exaggeration. But I do think it has tremendous potential to help shift the way government works. Led in the right way, and adhering to the principles of trustworthiness, quality and value, evaluation can make a big difference to the way government operates.

That is why I applaud the energy and focus of the Evaluation Task Force, which has galvanised interest and attention. It’s why I like the Evaluation Task Force’s website and why I celebrate this week’s conference. 

And it is why I love evaluation. 

Letting the good data shine: The state of the UK system of statistics 2021

At OSR we’ve long been concerned about the risk that a world of abundant information and misinformation could lead to a catastrophic loss of confidence in statistics and data – and that our public conversation, cut loose from any solid foundations of evidence and impartial measurement, would be immeasurably weakened as a result. That is, at root, what we exist to prevent. I have written about this before as a form of statistical Gresham’s Law – the risk that the bad data drive out the good, causing people to lose confidence in all the evidence that’s presented to them in the media and on social media.

I’ve also said that this is not inevitable, and indeed we can easily envisage a reverse effect: the bad data being driven out by the good data – that is, the trustworthy, high quality, high value data.

What it takes to secure this positive outcome is a public sector statistical system focused on serving the public good. A system that does not regard official statistics as just a Number, shorn of context, calculated the way it always has been, some kind of dusty relic. Instead, a system that regards the production of statistics as a social endeavour: engaging with users of statistics, finding out what they want and need to know, and responding in a flexible and agile way to meet those needs.

The pandemic has really tested the public sector statistical system and its ability to provide the good data, the trustworthy, high quality, high value data. The pandemic could have seen us being overwhelmed with data from a wide range of sources, some less reliable than others. It could also have seen Government statistics retreating to the Just a Number mindset – “we just count cases, it’s up to others to decide what the numbers mean”. But the system has not done this. Instead, as our report published today shows, the statistical system has passed the test with flying colours.

Statistical producers (producers) across the UK nations and system-wide have responded brilliantly. They have shown five attributes. It’s easy to see these attributes in the work of public health statisticians and ONS’s work on health and social care statistics. They have done great things. But what’s clear to us is that these attributes are system-wide – appearing in lots of producers of statistics and across lots of statistical areas.

Responsive and proactive

Producers across the UK governments have been responsive, proactive and agile in producing data and statistics to support policy and to provide wider information to the public which really adds value. For example, the ONS launched the Coronavirus (COVID-19) Infection Survey in swift response to the pandemic. The survey provides high-quality estimates of the percentage of people testing positive for coronavirus and for antibodies against coronavirus. These statistics provide vital insights into the pandemic – essential for understanding the spread of the virus, including new variants – for a wide range of users, including government decision-makers, scientists, the media and the public.

Timely

Producers have responded impressively to the need for very timely data to ensure that decisions around responses to the pandemic are based on the most up-to-date evidence. For example, the ONS published the first of its weekly Economic activity and social change in the UK, real-time indicators publications (previously called Coronavirus and the latest indicators for the UK economy and society) in April 2020, one month after the UK first went into lockdown, and has continued to publish them ever since. The publication contains a series of economic and social indicators (for example, card spend, road traffic and footfall), which come from a variety of different data sources. These assist policymakers with understanding the impact of the pandemic and gauging the level of overall economic activity. During the early weeks of the pandemic, the Department for Transport rapidly developed near-real-time statistics about Transport use during the coronavirus (COVID-19) pandemic. The statistics were regularly used in No 10 press conferences (example in slide 2) to show the change in transport trends across Great Britain, and gave an indication of compliance with social distancing rules.

Collaborative

Collaboration, data sharing and data linkage have been key strengths of both the UK statistical system and the wider analytical community over the past year. This more joined-up approach has improved our understanding of the impact of the pandemic both on public health and on wider areas such as employment and the economy. For example, during the pandemic, ONS and HMRC accelerated their plans to develop Pay As You Earn (PAYE) Real Time Information (RTI) estimates of employment and earnings. Earnings and employment from PAYE RTI is now a joint monthly experimental release drawing on HMRC’s PAYE RTI system, which covers all payrolled employees and therefore allows for more detailed estimates of employees than a sample-based approach, as well as information on pay, sector, age and geographic location.

Clear and insightful

We have seen some good examples of clearly presented and insightful statistics which serve the public good. For example, Public Health England (PHE) developed and maintains the coronavirus (COVID-19) UK dashboard, the official UK government website for epidemiological data and insights on coronavirus (COVID-19). The dashboard was developed at the start of the pandemic to bring together information on the virus into one place and make it more accessible. Initially it presented information for the UK as a whole and for the four UK countries individually; over time it has developed so that data are now available at local levels.

We have also seen the increasing use of blogs to explain to users how the pandemic has affected data collection, to set out changes to methodologies, and to bring together information available about the pandemic. For example, the Scottish Government has blogged about the analysis and data around COVID-19 available for Scotland.

And we have seen examples of statisticians engaging openly about data and statistics and their limitations, both within and outside government, helping wider understanding of the data and statistics. For example, Northern Ireland Statistics and Research Agency (NISRA) statisticians have introduced press briefings to explain their statistics on weekly deaths due to COVID-19. The Welsh Government Chief Statistician’s blog is a regular platform for the Chief Statistician for Wales to speak on statistical matters, including providing guidance on the correct interpretation of a range of statistics about Wales.

Transparent and trustworthy

For statistics to serve the public good they must be trustworthy, and this includes statistics being used and published in an open and transparent way. We have seen efforts to put information in the public domain and producers voluntarily applying the Code of Practice for Statistics (‘the Code’) to their outputs. For example, the Department of Health and Social Care (DHSC) publishes weekly statistics about the coronavirus (COVID-19) NHS test and trace programme in England. DHSC has published a description about how the pillars of the Code have been applied in a proportionate way to these statistics. However, inevitably the increased volume of and demand for data has placed a greater burden on producers and led to selected figures being quoted publicly when the underlying data are not in the public domain.

But our report also shows that the system cannot take these five attributes for granted. What has been achieved in the high-pressure environment of a pandemic must be sustained as we ease out of being a pandemic society. New challenges – like addressing regional imbalances, moving to a greener economy, or addressing issues like loneliness and inequality – cannot be understood using objective statistics if the system retreats into the Just a Number mentality.

So, our report sets out a number of recommendations. The recommendations aim to make sure that the statistical system we have seen in the pandemic is not an aberration, but is – in the classic pandemic phrase – the new normal. A system that can harness these five attributes is one that serves the public good. It is the best way to ensure that the bad data do not thrive and the good data shine out.

Glimmers of light for adult social care statistics

I was very interested in a recent Social Finance report on how to secure impact at scale. One of their points is that, if you want to see impact at scale, you need to be willing to seize the moment. Change arises when supportive policies and legislation fall into place, and when a new public conversation starts.

This idea – new policies, and new public conversations – made me think of social care statistics. It’s very tragic that it has taken the disastrous impact of the pandemic in care homes to focus attention on this issue, but there seems to be some potential for progress on the statistics now.

The background is that we’ve been shouting about the need for better statistics for the last couple of years. We’ve done so through reports on social care statistics in England, Scotland and Wales. We’ve done it through presentations, and I’ve taken the opportunity to highlight it when giving evidence to the Public Administration Committee in the House of Commons.

Certainly, we found some excellent allies in Future Care Capital and the Nuffield Trust, yet it has sometimes felt like we’re in the minority, shouting in the wilderness.

What were our concerns? Our research in 2020 highlighted several challenges and frustrations related to adult social care data that were common to England, Scotland and Wales. Our report summarising the common features of the statistics across Great Britain highlighted four key issues that need to be addressed to help both policymakers and individuals make better-informed decisions about social care:

  • Adult social care has not been measured or managed as closely as healthcare, and a lack of funding has led to under investment and resourcing in data and analysis.
  • There is an unknown volume and value of privately funded provision of adult social care. Although data is collected from local authorities, this covers only the activities that they commission and fund, which constitute a smaller proportion of total adult social care activity.
  • Robust, harmonised data supply to ensure comparable statistics from both public and private providers is problematic, as data collection processes are not always standardised. Furthermore, data definitions might not always be consistent across local authorities and other providers.
  • Data quality is variable within and across local authorities, with inconsistent interpretation of data reporting guidance by local authorities. This means that data isn’t always reliable and so users have difficulty trusting it.

As data issues go, as the pandemic has highlighted, there is not so much a gap as a chasm, with consequences for our understanding of social care delivery and outcomes.

Most people we’ve talked to, inside and outside the UK’s governments, recognise these issues. But to date there hasn’t been much evidence of a sustained desire to inject energy into the system to effect change.

Maybe, though, there are glimmers of light. Whilst this list is not meant to be exhaustive, I would like to draw attention to some initiatives that have caught my eye.

  • The first comes from an extremely negative space: the pandemic’s impact on those in care homes. Not only has the pandemic highlighted the importance of care and care workers, it has also led to much more interest in data about the care home sector. The Care Quality Commission and the Office for National Statistics (ONS) collaborated to publish timely information on the numbers of deaths in care homes, to shine a light on the impact of the pandemic on this vulnerable population. And DHSC has commenced the publication of a monthly statistics report on Adult social care in England to fill a need for information on the social care sector itself. COVID-19 has meant that people now listen when analysts and statisticians raise problems with social care data. Of course, the questions people are interested in go well beyond COVID-19.
  • The Department for Health and Social Care’s draft data strategy for England makes a significant commitment to improving data on adult social care.
  • The Goldacre Review for data in England may present a further opportunity for improvement.
  • I was pleased to restore the National Statistics designation to the Ministry of Housing, Communities and Local Government’s statistics report about local authority revenue.
  • Beyond the pandemic, ONS is working in collaboration with Future Care Capital to shine a light on one of the biggest data gaps here: the private costs borne by individuals and families for care. And ONS has recently published estimates of life expectancy in care homes prior to the pandemic.
  • Adult social care remains high on the political agenda in Scotland, with the recently published independent review of adult social care by the Scottish Government and the inquiry by Scotland’s Health and Sport Committee.
  • The Welsh Government remains committed to improving the data it captures on social care.

It’s far too early to declare “problem solved”, but these developments give grounds for optimism about improvements to the data. We’ll be reviewing the actions underway as statistics producers respond to the gaps in social care statistics highlighted by the pandemic, and publishing a comprehensive report of our findings in the autumn.

What I do think is that there is an opportunity here – if statistics producers across the UK are willing to take it, we can anticipate much better statistics on this sector. And a much better understanding of the lives and experiences of citizens who receive, and provide, social care.

OSR in the pandemic and beyond: Our year so far

The first half of 2021 has seen further lockdowns, an impressive vaccination rollout and, as we move into the summer, some easing of restrictions across the UK. It’s also been a busy time for us, as we continue to push for the production of official statistics, and other forms of data, that serve the public good. We really feel that public good matters more than ever.

We recently published our business plan for 2021/22, in which we outline our focus for the statistical system over the coming year and how it can consolidate the huge gains made in data collection and publication. We have also made progress on our role in data. Our review and findings on developing statistical models to award 2020 exam results may well be the most comprehensive review of the 2020 exam story. It is comprehensive in two senses: it covers all four parts of the UK, unlike other reviews, and it goes beyond technical issues about algorithms to identify lessons for all public bodies that want to use statistical models in a way that supports public confidence. We have also published an insightful review on Reproducible Analytical Pipelines, alongside our research programme.

The use of statistics during the pandemic

Statistics have played, and will continue to play, an important and extremely visible role in all our lives. I recently provided evidence to the inquiry run by the House of Commons Public Administration and Constitutional Affairs Committee on the use of data during the pandemic. Since the start of the pandemic, governments across the UK have maintained a flow of data which has been quite remarkable. We continue to push for further progress, for example on vaccination data.

Statistical Leadership

One thing that the pandemic has highlighted is how important it is for leaders to be analytical, as our recently published Statistical Leadership report shows.

Good analytical leadership will be crucial to answering the many questions that have arisen over the course of the pandemic and that continue to come to light, including questions about transparency. We are currently planning an in-depth discussion of these issues and more at our second OSR annual conference, which we aim to host later this year, focusing on high-quality data, statistics and evidence.

Looking Forward

There are lots of good things happening for statistics at present. I was delighted to see changes to pre-release access in Scotland because equality of access to official statistics is a fundamental principle of statistical good practice.

I am also really looking forward to announcing the results of our 2021 annual award for Statistical Excellence in Trustworthiness, Quality and Value in July. This is the second year we have worked in partnership with the Royal Statistical Society to offer the award.

Keep up to date with our latest work and news by following us on Twitter, and sign up to our monthly newsletter.

Can we make trust go viral? Increasing public trust in a digital age

It’s roughly a year ago that I gave evidence to the House of Lords Democracy and Digital Technologies Committee.

A lot has happened in a year. I did things at the evidence session that don’t happen now. The Committee session took place in person, not over Zoom. I shook hands with the other witnesses. Afterwards I chatted with several people, including the Committee Chair, Lord Puttnam, and gave no thought as to whether we were two metres apart.

But looking back on the Lords report, published in June last year, it’s clear that the report remains highly relevant.

The Committee was grappling with one of the challenges of our age: whether norms of democracy and democratic debate are under threat from digital technologies, social media and so on. In particular, if it is true that false information can circulate more freely on social media than it did in the past, does that erode one of the bedrocks of democracy? Lord Puttnam’s foreword vividly describes this as another kind of virus:

“Our Committee is delivering this Report to Parliament in the middle of an unprecedented health and consequential economic crisis. But our Report focuses on a different form of crisis, one with roots that extend far deeper, and are likely to last far longer than COVID-19. This is a virus that affects all of us in the UK – a pandemic of ‘misinformation’ and ‘disinformation’. If allowed to flourish these counterfeit truths will result in the collapse of public trust, and without trust democracy as we know it will simply decline into irrelevance.” Foreword by Lord Puttnam, Chair of the Committee

Empowering citizens

The Committee’s report made a range of thoughtful recommendations. A lot of them cover legislative and regulatory change, around how digital platforms operate and how electoral laws should adapt to the digital era.

And they align well with the principles that underpin our work at the Office for Statistics Regulation (OSR) and the Code of Practice for Statistics.

This is because the report highlights the importance of information that serves the public good. This includes empowering citizens:

Alongside establishing rules in the online world, we must also empower citizens, young and old, to take part as critical users of information. We need to create a programme of lifelong education that will equip people with the skills they need to be active citizens.

There is growing interest in this notion of digital or statistical literacy. The recent ESCoE report on understanding economic concepts has raised awareness of issues around public understanding of and engagement in economic matters. Meanwhile, OSR’s own review of literature on the public good of statistics found that a lack of statistical literacy can lead to misunderstanding or misuse of statistics, and susceptibility to misinformation.

The report also touches on the public’s need for high quality information. The report places journalism at the centre of this: “The public needs to have access to high quality public interest journalism to help inform them about current events.”

We agree with the crucial role of journalism. But journalism is often playing the role of intermediary between Government and citizens. And if citizens should be equipped to interpret information, so too the Government has a crucial role to play in providing clear, accessible, high quality information.

Statistics that serve the public good

The pandemic has shown this repeatedly. The UK’s four governments have put an enormous emphasis on providing accessible data on cases, testing, deaths, hospitalisations, and now vaccines. All four governments have improved the range and depth of data available, and all have sought to comply with the Code of Practice for Statistics. There have been occasions where we’ve pointed out ways in which the information can be better presented, or more information made available. But that’s against the backdrop of commitments to inform the public. And it’s clear that this information is used and valued by the public.

Data and evidence have remained prominent for Parliament during the pandemic. The Commons Science and Technology Committee recently published a strong report on The UK response to covid-19: use of scientific advice. And the Public Administration and Constitutional Affairs Committee is currently undertaking an inquiry into data transparency and accountability during Covid-19. Both of these Committees have drawn on our work at OSR to emphasise the importance of presenting data well. The Science and Technology Committee summarised the issues thus:

“As the Office for Statistics Regulation advised, in order to maintain high levels of confidence, data and statistics should be presented in ways that align with high standards of clarity and rigour—especially when they are used to support measures of great public impact.” Key finding 6, Science and Technology Committee

The Government’s role in informing the public

My reflection, then, looking back over the last year and at the Lords report, is how important the Government’s role is in informing the public.

As I said to the Lords Committee:

What keeps me awake at night is that so many official statistics have the potential to be valuable assets, not just for the policy elites, but for the public more broadly. But that potential is unrealised, because they are presented and communicated in a dry and almost mechanical way: we did this survey and got these results. They are not presented in a way that engages and links into the things that people are concerned about. I worry much more about that than I do about the more pervasive concerns people have about misinformation. I worry about the good information not getting out there. Q93, Select Committee on Democracy and Digital Technologies

The enduring lesson of the last 12 months is this: providing information that is trustworthy, high quality and high value to the public is a crucial function of Government.

OSR in 2021

2020 was a year in which statistics and data were at the centre of public life. We became a nation of armchair epidemiologists.

There’s no sign that this will change in 2021. We expect statistics to remain in the public eye, with the pandemic continuing to affect all aspects of life, a national COVID-19 vaccination programme, EU departure, the decennial Census, and Scottish and Welsh elections.

Statistics and evidence remain crucial to managing the current pandemic and understanding its impacts on the economy and society. This made 2020 a very busy year for us at the Office for Statistics Regulation.

Our Annual Casework report gave some indication of the noticeably higher volume of ‘use of statistics’ issues being reported to us, as questions and queries about numbers and their uses increased in what proved to be our busiest year to date. In the light of this experience, at the end of 2020 we updated our Interventions Policy and supporting FAQs, which clarify the context for the interventions that we have made and will make going forward.

At OSR we cover a huge range of issues. Our 2020 assessments and reviews illustrate this: we highlighted the need for significant improvements in statistics on adult social care, employment, poverty and business demography, to name a few.

OSR in 2021

We will continue to review the use of statistics in the pandemic, and have some high-profile reports due for publication, including our review on developing statistical models to calculate exam grades – one of the most high-profile examples of statistics in the public eye in recent times. We will also publish our report on statistical leadership in the public sector.

Our work in 2021 will include:

Data gaps and the public good

These are ongoing issues facing the statistical system, and we will continue to focus on data gaps, the public good of statistics, automation and modelling, and much more.

Granularity of statistics

We will clarify our expectation that producers provide more granular data, including data broken down by key characteristics and geographies. It is important that statistics reflect society and that people can see themselves in the statistics available.

Our role in data

We will be developing our thinking on our role in the regulation of data. This will build on projects already underway, including the exam grade review, and help us determine how far, and how regularly, we want to focus our regulatory work beyond official statistics. We will also seek to maintain trust in National Statistics and continue to focus on this core aspect of our regulatory work programme.

Quality

The pandemic has led to significant changes in how data are produced and what is available, and the departure from the EU may have significant impacts too. We will consider the implications of some of the decisions being made and what they mean for our judgement of the quality of statistics. One area we will consider for an early review following EU departure is ONS decisions on the classification of organisations to the public and private sectors. There are new governance arrangements covering these decisions, and it is important that the arrangements are effective and command user confidence.

We will continue to support producers of statistics

Our role during the pandemic has been to support producers, look at issues that have arisen and promote trust and transparency in statistics and data that support decision making. This work has seen us produce guidance, publish statements and get involved in various issues such as impacts on the care sector, local area data, test and trace, rough sleeping, comparisons and prevalence rates, measurement related to deaths, and hospital capacity and occupancy.

And we will continue to grow the use of the Code of Practice. As the boundaries between official statistics and other forms of data come down, we have seen many more committed and engaged organisations adopt the Code of Practice for Statistics, and we have created a new award in partnership with the Royal Statistical Society to recognise outstanding examples of how well this can work within our community of practice.

To conclude, there is much to learn from how the pandemic has impacted on statistics and data. In 2021 we will continue to look at the statistics system to build on the dynamism and innovation it has displayed during the pandemic as new areas of demand for insight emerge.

Ed

To keep up to date with our latest work, you can follow us on Twitter and sign up to our monthly newsletter.