Making an impact: How can statisticians help journalists to answer key questions with data?

As part of our work on statistical leadership, we are hosting a series of guest blogs. This blog is from Robert Cuffe, Head of Statistics at the BBC.

It’s not all rush, rush, rush in news. One colleague, helping me with my transition from the months-away-deadlines of clinical research to the five-minute-deadlines of breaking news, revealed that when he worked on the evening TV news he had often had the luxury to go away and “just think about a story for ages, like, easily 15 minutes”. It didn’t help much.

Does it need to be so frenetic? And, if so, what does that mean for departments or statisticians trying to get their data covered broadly and correctly?

We can debate five minutes versus fifteen, but the news machine does need to work very quickly. Editors are the last line of defence between everything that happened in the world recently and your news feed.

Every select committee, every ministerial pronouncement or opposition line, every report from a charity, think tank or government statistician, every freak goal from any team anywhere in the world, every man who bit a dog, every new collection, every sleb indiscretion and, for quite some time, almost every major number about the pandemic gets reviewed, and someone has to decide “does this make?”

If it does, do we need to send a video crew to capture footage and interview bystanders, how quickly can we get 400 words up online to summarise this, who are good voices to put this into context, does the final script or writeup need to be run by the duty lawyer? And so on.

And, if not, what else can we put in the paper or in the bulletin? Because the programme is happening at 6pm (or the print run starts at 7pm) whether or not we’re ready.

So decisions have to be made pretty quickly. And fast, good decisions about data need a prominent, clear and succinct summary of what the data can or can’t say.

As statisticians, we understand better than anyone else what the data can say. But our deep understanding of and intuition for the principles of our specialism do not always help with the prominent, simple summary. There are professionals who specialise in that translation: journalists or communications officers.

Here are some good guidelines given to me by an excellent journalist, who wanted my help in getting editors to understand (in reverse order):

  • What are the caveats?
  • How prominently do we need to alert the reader/listener/viewer to them?
  • Can we run the story with this top line (or a different top line that the caveats don’t undermine)?

I’ve learnt to choose my caveats carefully: pushing hard on some and forgetting completely about others. Almost every number comes with a margin of error, but not every number story needs to include a confidence interval at the top. If the CI contains the null, why run the story? And if the bottom of the CI is miles away from the null, who cares that the truth could be a teeny bit higher or lower?
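
For the statistically minded, that check can be made concrete with a toy example. The sketch below is in R, using purely hypothetical numbers, and simply encodes the two questions: does the interval contain the null, and if not, is the story the direction of the effect rather than its exact size?

  # Toy illustration with hypothetical numbers: a relative risk and its 95% CI
  estimate <- 1.8
  ci_lower <- 1.5
  ci_upper <- 2.1
  null     <- 1          # "no effect" for a ratio measure

  if (ci_lower <= null && null <= ci_upper) {
    message("The CI contains the null: why run the story?")
  } else {
    message("The CI excludes the null: lead on the direction of the effect, not the decimal places.")
  }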

Equally, putting a key caveat (say, “the time series jumps suddenly because the data definitions have changed”) on page 28 won’t do when decisions or press releases are based on the executive summary.

So getting in the room for the painful discussion of final sign-off on a report or a press release, when much of the meeting feels like it’s about where the semi-colon goes, can be the most valuable contribution a statistician can make to the public understanding of their data. The day before the rush, there’s more time for discussion. Prepping the ground by explaining in advance to journalists or trusted intermediaries which questions the data can and can’t answer (rather than talking about specifically what answers the data contain) is also enormously valuable. Communication should be right first time. Statisticians can help make that happen.

PS – Delivering data during the pandemic is no mean feat. Trying out improvements to the service or data collection, as many have done, is remarkable. Thank you to all those producing statistics for all the work you have done in what has been an awful year for so many.


How we won our Award for Statistical Excellence

As the deadline for applications for our Statistical Excellence award nears, the joint winners of the 2020 award, the Scottish Fiscal Commission, have written a guest blog about how they won and what the award means to them. Apply for the 2021 Award for Statistical Excellence in Trustworthiness, Quality and Value by 30 April 2021.

What is the Scottish Fiscal Commission?

We are Scotland’s independent fiscal institution (IFI). The Commission was created to serve the needs of increased fiscal devolution in Scotland. We are a young organisation, established in 2017. Our official economic and fiscal forecasts are used by the Scottish Government to formulate the Budget, the Scottish Parliament to scrutinise the Budget, and stakeholders and the media to inform public debate. We are not considered an official statistics producer, but we made an active choice to apply the Code of Practice for Statistics wherever possible.

The Code and the award

Since our creation we have followed the majority of the Code of Practice for Statistics principles, and in March 2018 we published a statement setting out our ongoing commitment to voluntarily apply the Code.

Last year, following encouragement from former team members Mairi Spowage and Laura Pollitt, we decided to apply for the award. It was the first year the category had been created, so we had no template to go on. Although the Commission is not an official source of statistics, we’re committed to upholding the pillars of the Code – trustworthiness, quality and value – so it was just a question of evidencing how we do this.

We outlined our processes, which include routinely pre-announcing our publications; helping data providers to plan their work by publishing an annual Statement of Data Needs; accompanying our publications with underlying data so others can use them; and communicating our work as widely as possible via our website, social media and events.

We also outlined our success in developing a reputation for delivering independent and credible forecasts, for which we were recognised by the OECD in an independent review published in 2019.

[Image: Silvia Palombi from the SFC holding the award]

We were delighted to receive the award. We know that our stakeholders and ultimately the public depend on us to uphold the highest standards of statistical practice as a matter of course, so we were grateful to the Royal Statistical Society and the Office for Statistics Regulation for the opportunity to have our work endorsed in this way.

Any civil service organisation voluntarily applying the Code of Practice for Statistics to their work can apply for this year’s award. You can see the full list of organisations that have committed to the Code, and more information about applying it, on the Code of Practice website.


Securing public confidence in algorithms – lessons from the 2020 exam awards

Last March, schools and colleges across the UK were closed and the qualification regulators (Ofqual, Qualification Wales, SQA and CCEA) were directed by the respective governments of England, Wales, Scotland and Northern Ireland to oversee the development of an approach to awarding grades in the absence of exams. While each regulator developed a different approach, all approaches involved statistical models or algorithms.

When grades were awarded to students last summer there was intense media coverage across the UK. We read headlines such as “Futures sacrificed for the sake of statistics” and statements implying that a “mutant algorithm” was to blame. The decisions made based on the calculated grades had a significant impact on many children’s lives. Up until last summer, most people had probably never felt personally affected by a government algorithm.

Statistical models and algorithms, though, are increasingly becoming a part of normal life. As technology advances and the availability of data increases, developing these types of models in the public sector can play a significant role in improving services for society.

We are concerned that public confidence in the use of statistical models has been damaged by the exam processes last year. As Ed Humpherson, our Director General, said recently, “Public debate about algorithms veers between blind optimism in their potential and complete rejection. Both extremes have been very evident, at different times, in the discussion of the use of models to award exam grades.”

Our review

In the post-2020 exams world, the risk of a public backlash when public sector decisions are supported by an algorithm feels much more real – regardless of how ‘right’ the algorithm may be.

As the UK regulator of official statistics, it is our role to uphold public confidence in statistics. We believe that statistical models have an important role to play in supporting decision making and that they can also command public confidence. It is because of this that we reviewed the approaches taken last year to award grades, to get a better understanding of what happened and, most importantly, what others can learn from the experience. To us it was striking that, though the approaches and models in England, Wales, Scotland and Northern Ireland differed, all four failed to command public confidence.

Our recently published report explores what we found. We acknowledge the unique situation that the regulators were working in, which was far removed from their normal roles. We conclude that many of the decisions made supported public confidence. To support learning for others, we have also clearly highlighted in the report the areas where we feel different choices could have been made.

We found that achieving public confidence is about much more than just the technical design of the model. It is also not just about doing one or two key things really well. It stems from considering public confidence as part of an end-to-end process, from deciding to use a statistical model through to deploying it. It is influenced by a number of factors including the confidence placed in models, the extent of public understanding of the models, the limitations of the models and the process they were replacing, and the approach to quality assuring the results at an individual level.

Lessons for those developing statistical models

We’ve identified 40 lessons for model developers that support public confidence. These fall under the following high-level principles:

  • Be open and trustworthy – ensure transparency about the aims of the model and the model itself (including limitations), be open to and act on feedback and ensure the use of the model is ethical and legal.
  • Be rigorous and ensure quality throughout – establish clear governance and accountability, involve the full range of subject matter and technical experts when developing the model and ensure the data and outputs of the model are fully quality assured.
  • Meet the need and provide public value – engage with commissioners of the model throughout, fully consider whether a model is the right approach, test acceptability of the model with all affected groups and be clear on the timing and grounds for appeal against decisions supported by the model.

Lessons for policy makers who commission statistical models

Model developers should not work in isolation. They should collaborate with others throughout the end-to-end development process. It is important that policy makers who commission models are part of this wider collaboration.

A statistical model might not always be the best approach to meet your need. Model commissioners need to be clear what the model is aiming to achieve and whether it meets the intended need, understanding the model’s strengths and limitations and being open to alternative approaches.

Statistical models used to support decisions are more than just automated processes. All models are built on a set of assumptions. Commissioners of models should ensure that they understand these and provide advice on the acceptability of the assumptions and other key decisions made in the model development.

The development of a statistical model should be regarded as more than just a technical exercise. Commissioners need to work with model developers throughout the end-to-end process and have regular reviews to check the model will meet the policy objective.

Lessons and recommendations for the centre of government

We also looked at the big picture: the wider infrastructure in place to support public bodies working with statistical models. We found that public bodies developing models need more guidance and support, and that this should be easier to access.

There are lots of different organisations in the statistical modelling, algorithm and AI space. As a result, it is not always clear what guidance is relevant to whom, or where public bodies can go for support. Some of this is down to inconsistencies in the terminology used to describe models, and some is due to it simply being hard to find out who is doing what in this very fast-moving area.

At the highest level, we feel that clearer leadership is needed from government. We are calling on the Analysis Function and Digital Function, working with the administrations in Scotland, Wales and Northern Ireland, to ensure that they provide consistent and joined up leadership on the use of models.

To support this, we recommend that those working in this area collaborate and develop a central directory of guidance. We see the Centre for Data Ethics and Innovation as having a key role in this space. In addition, we are recommending that more guidance is developed, particularly for those wanting to test the public acceptability of their models.

And lastly, we recommend that any public body developing advanced statistical models with a high public value should consult the National Statistician for advice and guidance.

Our role going forward

This review supports our Automation and Technology work programme. Through this programme we will be clarifying our regulatory role when statistical models and algorithms are used by public bodies. We are working on guidance about models and AI in the context of the Code of Practice for Statistics and advocating for the use of automation within statistical production. We will continue to work with other organisations to support the implementation of the review findings.

We are happy to discuss the findings of our review further. Please get in touch with us.

Listen to your enthusiasts: Implementing RAP at Public Health Scotland

This is a guest blog from Scott Heald following the launch of our new report: Reproducible Analytical Pipelines (RAP): Overcoming barriers to adoption.

Firstly, let me introduce myself. I’m Scott, the Head of Profession for Statistics at Public Health Scotland (PHS). PHS is a new body, formed in April 2020 to lead on tackling the challenges of Scotland’s public health, though our first year has been dominated by the COVID-19 pandemic. Our RAP journey started when we were in NHS Scotland’s Information Services Division (ISD), the health and care statistics body which now forms part of PHS.

PHS, and ISD before it, has been a big fan of RAP from the beginning. I wanted to share our story: from a bunch of enthusiastic statisticians who convinced me it was the right thing to do (I didn’t need much convincing!), to embedding it within our organisation and in our reporting on COVID-19.

Our RAP journey began with a programme of work to transform how we published our statistics. It quickly became clear that the programme had to be as much about our processes for producing statistics as about the final published output. More automation was key – to speed up processes, eliminate manual errors, and release capacity to add value to our statistics. Greater job satisfaction for our statisticians was a welcome impact too.

“We don’t use that here”

Up till this point our software of choice was proprietary. More and more graduates were joining our organisation and wondering why we weren’t using open-source software like R, having been taught it at university. I guess, in those early days, I was probably part of the “we don’t use that here” camp (partly out of fear, as I had not personally used any of the new software they were talking about).

However, I was persuaded (and willing) to let a group of our statisticians show us what could be done using R. Long story cut short: our group of enthusiasts showed the power of the approach, and PHS is now making the strategic shift to follow RAP principles as a way of working.

The art of persuasion

I’d describe our RAP journey as “bottom up”, with support from the “top down”. When we started seeing the results, we didn’t need much convincing. OSR’s report showcases our work on Hospital Standardised Mortality Ratios – a quarterly publication which used to take five days to run (lots of detached processes, lots of room for error). I remember vividly the first time the team ran it the RAP way. Five minutes after the process started, the finished report was in my inbox. We couldn’t believe it! And, to be sure, we spent the next five days running it the old way to make sure we got the same answers (we more or less did; the RAP way was more accurate, highlighting a few errors we fixed along the way!).
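
For readers who haven’t seen RAP in practice, below is a minimal sketch of what such a pipeline can look like in R. It is illustrative only – the file names, column names and report template are hypothetical, not PHS’s actual code – but it shows the idea: every step from raw data to finished report is scripted, so one command reproduces the whole publication.

  # Minimal RAP-style sketch (hypothetical file and column names)
  library(dplyr)
  library(readr)

  # 1. Read the source data from one version-controlled location
  admissions <- read_csv("data/admissions.csv")

  # 2. Every derivation lives in code, so re-running the script
  #    reproduces the published figures exactly
  smr_summary <- admissions %>%
    group_by(hospital) %>%
    summarise(observed = sum(died),
              expected = sum(predicted_risk),
              smr      = observed / expected)

  # 3. Write out the underlying data that accompanies the publication
  write_csv(smr_summary, "output/smr_summary.csv")

  # 4. Render the report itself from a parameterised R Markdown template,
  #    replacing the old manual copy-and-paste steps
  rmarkdown::render("report/smr_report.Rmd", params = list(quarter = "2020-Q4"))

Because the whole run is a single script, an error fixed in the code stays fixed – which is exactly why the RAP run surfaced those few errors hiding in the old five-day process.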

Our learning is that it’s a relatively easy shift for more recent graduates because they already know R. The focus for our training had to be on members of staff who have been with us for longer and weren’t familiar with it. And that can take some persuading – team leaders finding themselves managing teams who are using software they have never used themselves. We had to support and train at all levels (with tasters for managers who may not need to delve into the finer details of R themselves, but who would want to learn it if they were starting out again).

So, what have we learnt?

  1. Be prepared to try new ways of working
  2. Listen to your staff, who have a different perspective and fresh take on ways of working
  3. Start small – it’s easy to make the case when you can showcase the benefits of the RAP approach
  4. RAP improves the quality of what we do and eliminates errors
  5. Be prepared to invest in training – and recognise your staff will be in different places
  6. Use buddies – our central transformation team certainly helped with that, creating capacity in teams to RAP their processes
  7. Be open and share your code – we publish our code on GitHub, a great community to share ideas and approaches

Listen to your enthusiasts

OSR’s report highlights that we have a small central transformation team to support teams with their RAP work. This is crucially important: the initial work to RAP your processes can take time, so additional capacity to support teams is a must. This initial investment is worth it for the longer-term gains. It’s not all about making efficiencies either. It’s about streamlining processes, reducing error, and giving our analysts job satisfaction. They are now able to add more value because they have more time to delve into the data and help users understand what the data are telling them.

Our focus on transforming the processes for our existing publications has stalled due to many of our staff being redirected to support PHS’s key role in Scotland’s response to the COVID-19 pandemic. However, a success of our RAP work is that many of the new processes required to produce our daily COVID-19 statistics follow RAP principles – they had to, as we’re producing more statistics, more frequently, than ever before. Our earlier use of RAP meant we were in a good place to apply the techniques to our new ways of working.

And my final bit of advice? Listen to your enthusiasts – I’m glad I did.

Transparency is fundamental to trust – the government must learn from mistakes during the pandemic

As part of our work on statistical leadership, we are hosting a series of guest blogs. This blog is from Will Moy, Chief Executive of Full Fact.


This time last year, the UK was about to head into what would be the first of three national lockdowns. Already the news was dominated by the coronavirus pandemic, shining a spotlight on data – a spotlight that shows no sign of fading any time soon.

In the past year, in-depth coverage and debate around the merits of statistical releases have become commonplace, while the nation has watched numerous public briefings filled with data and statistics from politicians and scientists.

As an organisation dedicated to fighting bad information, Full Fact knows just how valuable good information, used responsibly and communicated well, is to public life and public debate. We support the OSR’s vision that statistics should serve the public good.

It has, therefore, been heartening to see the new prominence given to discussions about the importance of data, statistics and evidence during the pandemic, whether among members of the public, in newspaper columns or in public statements from the government.

At the same time, the government needs to work to maintain public confidence in both its decisions and the data it is using to make them, which is a key part of ensuring it can tackle the crisis: this shapes what people are willing to do and how long they are willing to do it for.

We have all been asked to trust that the government is acting in our best interests, as our political leaders point to data and scientific evidence as justification for decisions that have had real and significant impacts on the way we live our lives.

Against this backdrop, the need for a clear commitment to transparency, accountability and honesty in public life could not have been stronger. However, this bar was not always met.

Ministers or officials too often referred to data or statistics in statements to the press, the public or Parliament without publishing the full data so the public and intermediaries could check for themselves. A failure to be transparent on the data and evidence informing the government risks undermining public confidence in fundamental parts of the government’s response.

As fact checkers, we rely on publicly available information to scrutinise such statements. We can and do question departments, but time is of the essence when trying to challenge false or misleading claims. By and large, a fact check is at its most valuable when it is published soon after the claim has been made, reducing the time any inaccuracy has to spread and appearing while the media or public are still interested in that topic or debate.

For instance, we were unable to assess claims about the turnaround times of tests made by Prime Minister Boris Johnson on 3 June 2020 until the necessary data was published a month later, on 2 July, and raised this issue with the OSR. Full Fact also found it impossible to assess a government target set out in July 2020 – to “test 150,000 at-risk people without symptoms per day by September” – because the right data wasn’t available at the start of October.

We discuss these examples, and more, in our Full Fact Report 2021, which is based on our experience fact checking the pandemic. In it we make recommendations for action from the government, including that when government departments, ministers or officials refer to data or information when making statements to the public, the media or Parliament, the full data must be made publicly available.

Throughout the pandemic Full Fact has been pleased to work alongside the OSR to help secure publication of missing or unpublished data, as well as challenging misuse of information – and both organisations are committed to continuing this work.

But publication of information in a timely manner should not require our interventions. The government must make a consistent commitment to doing so as soon as is possible, and before being asked.

It is undeniable that there are still many changes – some cultural, some technical, and many long-term – that need to be made within the statistical system. This will require efforts from statisticians at all levels and across all departments, and the OSR’s report into Statistical Leadership proposes a comprehensive set of recommendations towards this.

Crucially, the report highlights the pivotal role of committed leaders inside and outside the Government Statistical Service in ensuring government and society get the most trustworthy, high quality, and high value statistics. The report is in itself important and welcome leadership from the OSR: the recommendations should be promoted and acted on with progress monitored and evaluated.

Full Fact strongly agrees that efforts from statistical leaders, and the wider system, must be equalled by political leaders. It is of critical importance that they lead by example, whether that is ensuring that statistical thinking is seen as essential within the civil service, or respecting the principles set out in the Code of Practice for Statistics.

Demonstrating respect for these responsibilities is essential if the public is to have confidence in the production and use of statistics in public life. The government must learn lessons from the pandemic and prove it is worthy of the public’s trust.


Talking numbers and making them count

As a communications professional, I use insights and ideas to implement and deliver impactful communications. Using statistics is a great way to use evidence and explain complicated information, but comms people are not always well-known for their statistical literacy and this can sometimes cause problems.

During a busy day in the comms team, any use of numbers in a press release, tweet or presentation should align with the Code, ensuring messages are clear, measured and tell the story appropriately. It is essential that the production and use of statistics by governments command confidence in both the statistics and the organisations using them, and help those listening understand the key messages.

At the Office for Statistics Regulation we are interested in how numbers can be used powerfully and collectively across government, to convey important messages and information. Statistical leadership by government is essential to ensure the right data and analysis exist; to ensure they are used at the right time to inform decisions; and to ensure they are communicated clearly and transparently, in a way which will support confidence in the data and in the decisions made on the basis of them.

Statistical leadership is not just about having good leadership of the statistics profession. While this is important, we want to make sure individuals inside and outside the statistics profession show leadership. This should happen right through from the most junior analysts producing statistics to the most senior Ministers quoting statistics in parliament and media. It is relevant to all professions including policy and communications specialists.

Communications teams should work in close partnership with their department’s analysts, to ensure that any use of statistics does not distract from the key communications messages, or itself become the story. The winning situation is using statistics in a helpful way: to convey the right impact, help tell the story, gain understanding and enhance the organisation’s reputation in the process.

The Code of Practice for Statistics, with its principles of ‘trustworthiness, quality and value’, provides an excellent guide to doing this as effectively as possible, so that users can confidently rely on the statistics presented to them and use them to find what they need.

Statistics can really add to public debate, as we have seen during the events of COVID-19, when the nation has used numbers to understand the pandemic and its impacts on society, the economy and beyond. But it is essential that anyone using numbers and speaking on behalf of government can communicate statistics effectively, in a way that commands confidence and helps those listening understand the key messages. The simplest way to achieve these outcomes and empower your message is to ask the right questions about statistics before you use them. And, if you still feel unsure, then find another way to evidence your point.

However, comms people don’t need to know the Code inside out: they should always work closely with Heads of Profession for Statistics for advice, support on using numbers and an understanding of the guiding principles.

If you are interested in finding out more about using statistics, the Code or Statistics Leadership please get in touch with me or visit our website.

Here are some tips…

  • Does it look right? Is that an implausible number? If it’s unusual, it could be wrong… what’s behind the surprise?
  • What exactly are we measuring and why? Is the source reputable and did they show their working?
  • Where do the data come from? What is the backstory? Always be curious about someone else’s data. What do we discover if we go another click?
  • Only compare the comparable. Watch out for changes in definitions and different ways to measure the same thing. What’s the historical trend?
  • Presentation is key. Good use of pictures and graphics helps convey meaning and should never cause confusion or misrepresentation.
  • Remember to ask your Head of Profession for Statistics, or a statistician who has worked to produce the data, for advice on how best to present numbers in communications.

I didn’t think I’d ever be interested in population statistics, but then I came to my Census

Ok so that’s not quite true. I’ve been working with a focus on population and society statistics for the past three years here at the Office for Statistics Regulation (OSR) and I love it. Working on areas such as migration, culture, wellbeing… it is fascinating. These statistics play such an important role in helping people, government and decision makers understand and plan for our society in this country.

But even with my work hat off for a second, this is such an exciting time for statistics as we near Census day in England, Wales and Northern Ireland and work continues to prepare for Scotland’s Census in 2022. The Census is so important and a unique source of data. It will be used by many different people for many different purposes. For me, Census data can be most valuable when it is used to support how people understand their local communities, whether by local councils, community groups or even school students. I am so eager to play my part and fill in my Census return – online this time of course.

Work hat back on and I have been leading OSR’s assessment of the Censuses in the UK. Undertaking this assessment is the role OSR has when it comes to Census data and statistics. We aren’t collecting the returns or producing the data but we are working closely with the Census offices to ensure what they are doing, ultimately delivering Census data and statistics to the public, is in line with the Code of Practice for Statistics.

We collect evidence from Census offices, speak with users of Census data and make our judgements on compliance with the Code, reporting on this through our formal assessment reports and public correspondence with Census offices. That is the nuts and bolts of the assessment process. The reality of the assessment, and OSR’s involvement, is that we are continuously engaging with Census offices as they develop and deliver their Census plans, to support the best possible outcomes. We meet with Census offices regularly to discuss their work, share our views on their developments, and talk through how they have taken forward our findings. It has kept me busy since 2018 and will continue to do so until well after the data and statistics are published.

This ongoing conversation as part of the assessment is overlaid with a more formal reporting structure. We have completed phase 1 of the assessment and are kicking off phase 2 for England and Wales and Northern Ireland. For each phase of the assessment, the Office for National Statistics (ONS) and the Northern Ireland Statistics and Research Agency (NISRA) publish an accreditation report. These reports provide an update to Census users on how they consider the practices, processes and procedures for Census 2021 meet the standards of the Code. This provides OSR with evidence for our assessment and, more importantly, provides useful information on progress on the Census for all who are interested. You should definitely take a read!

We are always keen to hear from users of Census data and I have had some extremely valuable conversations with individuals and organisations to date – a big thank you if you have submitted feedback or if we have spoken in the past. As part of this second phase, we have opened up our user consultation once more. Your views and opinions are so important to help us understand what Census offices are doing well and how things could be improved. So please do find out more and get in touch.

I hope you all are as excited as I am as we get closer to 21 March!

Next stop: National Statistics status

In 2020 we assessed estimates of station usage produced by the Office of Rail and Road (ORR) and designated them as National Statistics in December. In this blog, Lyndsey Melbourne, Head of Profession for Statistics at ORR, and Anna Price, the lead regulator for the assessment, talk about their experience and why assessments of official statistics are so valuable.

Where it all began…

Lyndsey: Most of our statistics were designated as National Statistics in 2012. In 2019 OSR carried out a compliance check and confirmed they continued to uphold the high standards expected. At the time we also discussed future assessments – in particular, our most popular set of statistics, estimates of station usage, had never been assessed. These statistics provide unique information about each of the 2,500+ mainline rail stations in Great Britain. The granularity of the data is one of the main reasons that these statistics are of interest to a very broad range of users: they are relevant to anyone, no matter where they live. We were keen to further promote the quality and value of the statistics by gaining National Statistics status.

Anna: This assessment was a bit different to others. Usually we do our review, publish our findings and requirements, and then give producers a few months to meet these requirements. But when we first met with Lyndsey and Jay, the lead statistician for estimates of station usage, in April 2020 they told us they were keen to get National Statistics status in time for the next statistical release in December.

The assessment process

Anna: To support ORR to achieve this ambition, we adapted our usual process, for example sharing our findings and requirements as we developed them. This let the statisticians at ORR start on improvements while we worked on more complex findings and wrote our report, instead of waiting until the end. We had lots of meetings with the team during the project and were really impressed with the ideas they came up with each time we raised an area for improvement. I think the flexibility and enthusiasm of both teams was the reason that the project was so successful.

Lyndsey: Throughout the assessment, OSR were flexible and happy to work with us to agree timescales to fit in with our publication plans and around our day jobs. We were keen to work towards achieving National Statistics designation of the statistics in time for our next annual publication planned for December 2020. Otherwise, it would be up to 20 months before we could publish designated statistics!

OSR were very accommodating to this request and we worked closely during the following eight months to review and improve our statistics. OSR’s flexible approach allowed emerging requirements from their assessment to be addressed during the production process of the next set of statistics.

It’s fair to say that producing the annual publication at the same time as addressing OSR requirements was a challenge, but being able to confirm to users that our statistics had been successfully designated as National Statistics on publication day was very satisfying.

The value of this assessment

Lyndsey: During the assessment OSR spoke to a range of users and the feedback they obtained was extremely valuable. We have continued to speak to these users to understand their use of our statistics and how they could be improved further.

The improvement plan we developed to address OSR requirements and other feedback from users was a really useful tool for us. Sharing ideas and drafts with OSR along the way and getting their feedback was another valuable part of the process. We published this improvement plan on the user engagement page of our data portal to keep users up to date on the changes we were making.

Anna: The users of these statistics are passionate about them. So it was a lot of fun to hear about how they use the statistics, what they like and what would make using them even better. Seeing the variety of people who use the statistics, for a variety of purposes, was really motivating – it made it even more satisfying when we saw changes to the statistics in December which met the user needs we had identified.

At OSR we like to champion good practice, as well as areas for improvement. So it was nice to highlight the great work that ORR were already doing on these statistics – like the Twitter Q&A that ORR host on publication day, this year accompanied by a launch video and a live YouTube Q&A. It’s great to see statisticians putting themselves out there to talk about their statistics directly with users.


To keep up to date with our latest work, you can follow us on Twitter and sign up to our monthly newsletter.

Upskilling Government leaders in data and statistical literacy

As part of our work on statistical leadership, the Office for Statistics Regulation (OSR) is hosting a series of guest blogs. The first in the series is from James Kuht, from the Number 10 Data Science Team, who has been leading development of the Data Masterclass for Senior Leaders.

How data-literate do Senior Leaders in Government need to be?

There was a time when it would have been inconceivable for senior Government Ministers and officials to address the nation accompanied by a series of charts and statistics, but over the course of the pandemic it has become relatively normal. Our Leaders are more reliant on data and statistical literacy than ever before and, through the Code of Practice for Statistics, have a commitment to demonstrating good use of data.

There has been rising use of data and statistics across Government in recent years: an increasing appetite for evidence-based policy-making (the ‘Nudge Unit’ perhaps being the most public example), compelling visual communications of data (as evidenced by departmental statistical releases awash with colourful charts), and growing use of new data science techniques such as machine learning. This shift is deliberate and strongly supported from the centre of Government, perhaps best summarised in Rt Hon Michael Gove’s Ditchley Lecture, in which he calls on Government “to evaluate data more rigorously […] to evaluate policy successes and delivery failures”, with the help of “data analytics specialists”. His reasons are clear – “it is imperative that we learn the hugely valuable lessons that lie buried in our data”.

But Gove also highlights a hurdle we must first overcome to achieve this – “Government must also ask itself if its people have the skills necessary for the challenges that I have set out”. With many Senior Civil Servants being humanities graduates, and the largest source of graduate-entry Civil Servants continuing to be so, many of our Senior Leaders may not have had much formal training on data or statistics since GCSEs or O-levels. Given this, we need to ensure that there is suitably high-quality training on data and statistics accessible to our current and future Senior Leaders.


What does a Senior Leader in Government need to know?

The first challenge is to communicate that statistics are accessible and to inspire Senior Leaders with concrete examples of their value. This is no mean feat(!) – data and statistical training can often be intimidating. If it only caters for those who are ‘data-curious’ then it likely alienates the target audience. Focussing on making training relevant and of direct value to the learner is crucial.

The second challenge is to turn Senior Leaders into more effective consumers and commissioners of statistics – since, as Senior Leaders, this is largely their role when it comes to statistics. Effective consumers, meaning that they are able to scrutinise the data they are presented with, and aren’t hoodwinked by misleading headlines which might misuse a relative risk in place of an absolute risk, to give one common example. Effective commissioners, meaning that they ask for the right analysis to support the decisions they make and understand the importance of transparency – making the analysis behind the decisions open when possible, and choosing the right visualisations to communicate their messages.
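
That relative-versus-absolute-risk trap is easy to demonstrate with made-up numbers. The short R sketch below uses purely hypothetical figures to show how a dramatic-sounding relative risk can correspond to a tiny absolute change:

  # Hypothetical numbers: why a relative risk needs its absolute base rate
  baseline <- 2 / 10000   # risk without the exposure: 2 in 10,000
  exposed  <- 3 / 10000   # risk with the exposure:    3 in 10,000

  relative_risk   <- exposed / baseline             # 1.5 – a "50% higher risk" headline
  extra_per_10000 <- (exposed - baseline) * 10000   # just 1 extra case per 10,000 people

  cat("Relative risk:", relative_risk, "\n")
  cat("Extra cases per 10,000 people:", extra_per_10000, "\n")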

Third, and crucially, we need to improve the interface between Senior Leaders and analytical professionals. Senior Leaders don’t need to be able to produce complex analysis or deeply critique a scientific paper, but they do need to be able to ask the right questions and request the right support from the Government Analysts who can.

The fourth and final objective is to give Senior Leaders an awareness of the opportunities and risks presented by new data science techniques, such as machine learning, which are undoubtedly having an increased impact on work across Government.


How do you teach Senior Leaders these skills?

Recently we have worked with stakeholders across Government, such as ONS’s Data Science Campus, the Government Digital Service, the Government Statistical Service and the Behavioural Insights Team to create an offering which addresses the challenges outlined above, a ‘Data Masterclass for Senior Leaders’.

We chose to make it available to Senior Leaders within Government as a 5-10 hour online course so that we could achieve maximum scale, whilst also protecting the health of participants who are largely, of course, working from home at the moment.

The Masterclass comprises a mix of engaging 10-15 minute talks from some of the UK’s leading experts on data and statistics, such as Sir David Spiegelhalter, Dr Hannah Fry and Sir Ian Diamond, alongside compelling case studies of exemplary data use from across Government, to really bring the concepts to life. Feedback so far from the 76 Senior Civil Servants who have completed the course has been strongly positive, with an average rating of 95%, and comments like the one below reassure us that we’ve at least succeeded in the first of the objectives above!

“A really fabulous course. Accessible and relevant – you managed to bring what could have been quite a dry issue to life, and make it not too complicated for the data-challenged”

Going forward, we’re looking forward to scaling the Data Masterclass in partnership with the ONS Data Science Campus and making it available to every Senior Civil Servant. We hope this upskilling supports the brilliant work of many stakeholders in Government’s use of data and statistics, so that we can derive the “hugely valuable lessons” that lie within our data, in a way that is responsible and respects the privacy of UK citizens.

Can we make trust go viral? Increasing public trust in a digital age

It’s roughly a year ago that I gave evidence to the House of Lords Democracy and Digital Technologies Committee.

A lot has happened in a year. I did things at the evidence session that don’t happen now. The Committee session took place in person, not over Zoom. I shook hands with the other witnesses. Afterwards I chatted with several people, including the Committee Chair, Lord Puttnam, and gave no thought as to whether we were two metres apart.

But looking back on the Lords report, published in June last year, it’s clear that the report remains highly relevant.

The Committee was grappling with one of the challenges of our age: whether norms of democracy and democratic debate are under threat from digital technologies, social media and so on. In particular, if it is true that false information can circulate more freely on social media than it did in the past, does that erode one of the bedrocks of democracy? Lord Puttnam’s foreword vividly describes this as another kind of virus:

“Our Committee is delivering this Report to Parliament in the middle of an unprecedented health and consequential economic crisis. But our Report focuses on a different form of crisis, one with roots that extend far deeper, and are likely to last far longer than COVID-19. This is a virus that affects all of us in the UK – a pandemic of ‘misinformation’ and ‘disinformation’. If allowed to flourish these counterfeit truths will result in the collapse of public trust, and without trust democracy as we know it will simply decline into irrelevance.” Foreword by Lord Puttnam, Chair of the Committee

Empowering citizens

The Committee’s report made a range of thoughtful recommendations. A lot of them cover legislative and regulatory change: how digital platforms operate, and how electoral laws should adapt to the digital era.

And they align well with the principles that underpin our work at the Office for Statistics Regulation (OSR) and the Code of Practice for Statistics.

This is because the report highlights the importance of information that serves the public good. This includes empowering citizens:

Alongside establishing rules in the online world, we must also empower citizens, young and old, to take part as critical users of information. We need to create a programme of lifelong education that will equip people with the skills they need to be active citizens.

There is growing interest in this notion of digital or statistical literacy. The recent ESCoE report on understanding economic concepts has raised awareness of issues around public understanding of, and engagement in, economic matters. Meanwhile, OSR’s own review of the literature on the public good of statistics found that a lack of statistical literacy can lead to misunderstanding or misuse of statistics, and susceptibility to misinformation.

The report also touches on the public’s need for high quality information, and places journalism at the centre of this: “The public needs to have access to high quality public interest journalism to help inform them about current events.”

We agree that journalism plays a crucial role. But journalism often acts as an intermediary between Government and citizens. And if citizens should be equipped to interpret information, so too the Government has a crucial role to play in providing clear, accessible, high quality information.

Statistics that serve the public good

The pandemic has shown this repeatedly. The UK’s four governments have put an enormous emphasis on providing accessible data on cases, testing, deaths, hospitalisations, and now vaccines. All four governments have improved the range and depth of data available, and all have sought to comply with the Code of Practice for Statistics. There have been occasions where we’ve pointed out ways in which the information can be better presented, or more information made available. But that’s against the backdrop of commitments to inform the public. And it’s clear that this information is used and valued by the public.

Data and evidence have remained prominent for Parliament during the pandemic. The Commons Science and Technology Committee recently published a strong report on The UK response to covid-19: use of scientific advice. And the Public Administration and Constitutional Affairs Committee is currently undertaking an inquiry into data transparency and accountability during Covid-19. Both of these Committees have drawn on our work at OSR to emphasise the importance of presenting data well. The Science and Technology Committee summarised the issues thus:

“As the Office for Statistics Regulation advised, in order to maintain high levels of confidence, data and statistics should be presented in ways that align with high standards of clarity and rigour—especially when they are used to support measures of great public impact.” Key finding 6, Science and Technology Committee

The Government’s role in informing the public

My reflection, then, looking back over the last year and at the Lords report, is how important the Government’s role is in informing the public.

As I said to the Lords Committee:

What keeps me awake at night is that so many official statistics have the potential to be valuable assets, not just for the policy elites, but for the public more broadly. But that potential is unrealised, because they are presented and communicated in a dry and almost mechanical way: we did this survey and got these results. They are not presented in a way that engages and links into the things that people are concerned about. I worry much more about that than I do about the more pervasive concerns people have about misinformation. I worry about the good information not getting out there. Q93, Select Committee on Democracy and Digital Technologies

The enduring lesson of the last 12 months is this: providing information that is trustworthy, high quality and high value to the public is a crucial function of Government.