Volunteering at COP26 – being part of something big

I felt immensely excited and proud to be one of the 1,000 volunteers helping at COP26 in my home city of Glasgow. Volunteers came from different corners of the globe to perform a range of tasks. I was a shuttle hub volunteer, which meant I helped those going to COP26 get there and back by shuttle bus from the key transport hubs – over 7,000 people on one day in the hours I was there.

What gets me out of bed in the morning is feeling part of a greater whole. COP26 made me realise that climate change and net zero are the bigger picture. As VIPs with their outriders whizzed by, I focussed on people keen to get their day going, like the lady who had travelled for days from the islands of Palau, or the man from Samoa, to make sure their voices were heard alongside those of the more well-known world leaders. The volunteers logged the people from different parts of the world they welcomed to Glasgow – now covering almost all of the 196 nations at COP26.

Glaswegians had varied responses to COP26. Most saw it as a great opportunity for the world. Some reflected the views of Greta Thunberg, that it’s more about “blah blah blah” than substance. Some bemoaned the inconvenience to their daily routine, although Glaswegians are renowned for their humour and make a joke of any temporary disruption to their day.

Coming from a mining family, I started my career in the coal industry, and was enormously proud to work in a nationalised industry. A phrase I heard a lot back then was that Britain was an island founded on coal surrounded by fish.

I was one of 20 young graduates who joined the National Coal Board (NCB) in 1979, and many, like me, came from coalfield communities. Early in 1980, I was sent on behalf of the NCB’s Economics Intelligence Unit to monitor the Select Committee on Energy at the House of Commons and feed back on its inquiry into alternative sources of energy. I reported back that I didn’t see renewables taking off or offering any competition to coal. I was deeply attached to coal as a nationalised resource. The prevailing economic thinking at the time was the mixed economy – nationalised industries, public corporations and private enterprise.

I so wish now that I could have heard statistics about the threat to the planet (those statistics came much later) and about the imperative of reducing our dependency on fossil fuels and ramping up clean alternatives. At that time in the early 1980s, the environmental problem of burning 120m tonnes of coal in the UK was framed as one of making coal burn a little cleaner. I was astonished to learn that more coal is produced in the world today (~7.6 billion tonnes) than in 1980 (~3.8 billion tonnes), according to statistics from the International Energy Agency.

Climate change statistics are important, but change will rely on leadership

During these weeks at COP26, I saw and heard the leaders and politicians on TV, radio and across the media. As I spoke with those going to COP26, I heard their hopes and fears. I realised that climate change is about more than the science; political leadership is crucial. Leaders need to be skilled in the appropriate use of data as part of their role in persuading the public to make behavioural changes and embrace the consequences. E M Forster wrote “Only connect!… Only connect the prose and the passion, and both will be exalted”. Leaders connect the prose and the passion, and need to skilfully deploy the statistics. Leaders can give statistics a social life.

As we’ve learned from the Covid-19 pandemic, the role of senior leaders in government is also vital in providing support to statisticians. The lessons learned throughout the pandemic should show us how to better help people to understand the choices open to us all to respond to the challenges of net zero.

Statistics play a key role in helping the public understand the bigger picture on climate change. They are also essential for helping governments design and monitor policies that reduce or prevent greenhouse gas emissions (mitigation) and prepare us for the expected impacts of climate change (adaptation). Because it’s such a complex topic, climate change statistics add insight when related sets of statistics are brought together to tell a clear story of what is happening and why. They also add value when they are presented in ways that help different types of users understand what the data is telling us, and when users can access the information they need. Our recent review of the UK’s climate change statistics looked at exactly these two things: their coherence and accessibility.

I’m nostalgic for a Britain in which people saw themselves as part of a greater whole and recognised that we depend on each other. We need to connect with each other locally and globally, and in doing so we will appreciate being part of something much bigger than ourselves.


Rising to the challenge

One thing that has stood out in the Coronavirus pandemic is how quickly the Government Statistical System (GSS) has responded to the need for data to support important decisions made by the government during this time. In just a matter of weeks, statisticians have repurposed existing statistics and have made use of new data sources. Before the crisis, this work might have taken months. HMRC’s Coronavirus Job Retention Scheme (CJRS) statistics and its Self-Employment Income Support Scheme (SEISS) statistics are among these. We recently conducted a rapid review of these statistics, and Ed Humpherson, Director General for Regulation, has written to HMRC’s Head of Profession for Statistics, Sean Whellams, supporting the approach taken to produce them.

In March this year, we wrote about how our assessment of employment and jobs statistics in effect captured the statistical world in miniature – a microcosm of the statistical system. The assessment surfaced many of the common issues that statistics producers face, highlighting recurring themes from our other regulatory work. Now we see a further glimpse of our statistical world in miniature, through the lens of our recent review of HMRC’s statistics. HMRC’s response to the need for information about these crucial schemes admirably demonstrates government statisticians rising to the key challenge of the times.

There are two aspects of the statistical system which these statistics exemplify.

First, for us as statistical regulators, whether during a national crisis or otherwise, government statistics should (i) answer society’s questions in a timely way; and (ii) provide insights at a level of detail that is useful to users. Additionally, many questions cannot be answered without sharing and linking data. In the preparation, production and publishing of the CJRS and SEISS statistics, HMRC has displayed all of these elements.

Naturally, society’s questions have been about the take-up of the schemes and their cost to the public purse. These interests reflect two essential aspects of the Government’s job protection schemes – the speed and simplicity of the support. CJRS was launched on 20 April and SEISS on 13 May. Initially, HMRC tweeted daily information on both schemes – for CJRS, the number of unique applicants and the number of furloughed jobs; for SEISS, the total number of claims. HMRC also tweeted the value of the claims received for both schemes. As claims to the schemes started to tail off, HMRC moved to tweeting the data on a weekly basis. Releasing these statistics on a timely basis and at intervals that met the needs of users is a good example of the orderly release of statistics essential to building trust in new statistics.

After just a few weeks of tweeting this important data, HMRC linked both the CJRS and the SEISS data with other pre-existing HMRC administrative data to provide further insights: CJRS claims broken down by employer size (and SEISS claims by age and gender), together with breakdowns for both schemes by sector of the economy and by geography. These new statistical breakdowns were published in statistical bulletins released less than two months after the launch of the CJRS and under a month after the launch of SEISS – quite remarkable achievements.
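To make that linkage step concrete, here is a minimal illustrative sketch in Python. It uses made-up data and hypothetical column names – not HMRC’s actual datasets or methodology – but it shows the general idea of joining scheme claims to a pre-existing employer register and aggregating the result into publishable breakdowns:

```python
# Illustrative sketch only: hypothetical data and column names, not HMRC's
# actual datasets or methods. Claims are linked to an employer register and
# then aggregated into a breakdown by sector.
import pandas as pd

# Hypothetical claims data: one row per furloughed employment
claims = pd.DataFrame({
    "employer_id": ["E1", "E1", "E2", "E3", "E3", "E3"],
    "claim_value": [800, 750, 900, 650, 700, 720],
})

# Hypothetical employer register with size band, sector and region
register = pd.DataFrame({
    "employer_id": ["E1", "E2", "E3"],
    "size_band":   ["2-4", "250+", "5-9"],
    "sector":      ["Retail", "Manufacturing", "Hospitality"],
    "region":      ["Scotland", "North West", "Wales"],
})

# Link the claims to the register, then aggregate to a publishable breakdown
linked = claims.merge(register, on="employer_id", how="left")
by_sector = linked.groupby("sector").agg(
    furloughed_jobs=("employer_id", "size"),
    total_claim_value=("claim_value", "sum"),
)
print(by_sector)
```

The same join could equally be grouped by size band or geography; the point is that the extra insight comes from combining the new claims data with information the administrative systems already held.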

Second, we found HMRC to be working closely with users of this data to find out what they need to know from the statistics. HMRC is open with users about any aspects of uncertainty in its estimates, labelling the statistics and analysis with a frank description of what the statistics summarise. Consistent and coherent standard geographic definitions are adopted to harmonise with related statistics and data. In explaining the methods used to produce the statistics, HMRC has been proportionate to the complexity of the methods themselves, reflecting the needs of different types of users and uses.

In a further example of a statistical system working well, we as statistical regulators look for statistics producers not only to make statistics and data available quickly but also to present clear, meaningful and authoritative insights that serve the public good. There are many examples of how HMRC has done this, but to select just one: in the SEISS bulletin HMRC set out the numbers of self-employed people across the UK who are eligible for support. This information is published not only at UK level but also in accompanying tables at regional and sub-regional (local authority and parliamentary constituency) levels. As we pointed out in our Monitoring Review of Income and Earnings in 2015, timely data on self-employment income is a key data gap more generally. These statistics help us understand a little more about the income and earnings of the self-employed in different locations around the UK in 2020 and are a step towards addressing this data gap.

Normally, when statistics producers publish new or innovative statistics, they have a period in which they can develop and improve them. HMRC will only have a restricted period in which to do this for its CJRS and SEISS statistics. By their nature, these schemes are temporary and we will probably only see a small number of releases. Quite how the statistics may change, for example with the advent of the Coronavirus Job Retention Bonus Scheme, is yet to be established. Society will pose further questions for statistics producers to answer.

The times we live in call for ongoing watchfulness and a continuing need to be agile and responsive, offering even further insight in the months and years ahead. What we found in this case study is replicated throughout the statistical system. With continuing watchfulness there is little doubt that there will be further statistics, data and insights to help us manage to the other side of this pandemic.

Employment growth statistics: a case study in curiosity-driven quality improvement

Albert Einstein was reputed to have said that curiosity has its own reason for existing and that the important thing is not to stop questioning. This blog is a case study of the value of indulging curiosity to help strengthen the quality of valuable official statistics – here we look at ONS’s Employees in the UK statistics, which are produced from Business Register and Employment Survey (BRES) data. Professor Sir Charlie Bean, in his 2016 Independent Review of UK economic statistics, reflected that to better understand the modern economy and people’s lived experience of it, ONS statisticians should be more curious about their statistics and the stories they are telling. He set out three inter-linked ingredients needed to meet this objective of building a ‘curious’ ONS that is more responsive to changes in the economic environment and better meets evolving user needs.

Three inter-linked ingredients of building a more curious ONS

Chart 1 Three inter-linked ingredients of building a more curious ONS: strengthening quality assurance, improving appreciation of the statistics in use, and raising staff knowledge.

Source: Independent Review of UK economic statistics, Sir Charles Bean


In early summer of 2018, Cambridge Ahead (CA), a membership organisation involving people and organisations dedicated to the successful growth of the Greater Cambridgeshire region, contacted both OSR (the Office for Statistics Regulation) and ONS. CA drew our attention to their concerns about a lack of alignment between employment growth estimates for the region from its own database and those from ONS’s BRES-sourced statistics, as exemplified in Chart 2 below.


Chart 2 Average employment growth over the six years 2012-2018, comparing BRES and CA estimates.

Source: Analysis provided by CBR based on latest data

After engaging with ONS and CA, we felt that engagement between the two organisations provided an opportunity for ONS to pursue a potentially new way of testing the quality of the BRES estimates. The purpose of this blog is not to go into the detail of why the estimates don’t align; we are more interested in how curiosity can lead to improvements in the insight and quality of these employment statistics.

We saw that ONS and CA exchanged details of their methods and of their data. Each appreciated the strengths of the other – for example, BRES offers labour market data at low levels of geography, while CA’s analysis brings a greater understanding of the employment market in its region.

ONS has described the BRES statistics as providing a comprehensive picture of jobs in the UK, but recognises that these statistics, like most derived from sample surveys, have some limitations and may not immediately pick up new businesses. ONS intends, through its transformation programme, to source employment numbers from PAYE income tax data. This offers the potential to give ONS close to real-time employment information from every business with a PAYE scheme, including newer businesses. It would provide a wealth of new information that ONS can use to publish even more detailed and more timely figures for statistics users.

While this is exciting news and very much to be welcomed, developments like these can take time to implement. We feel there are more immediate opportunities to strengthen the insight that these statistics offer to users as well as quality assurance of the source data.

CA believes that it has an advantage over ONS in capturing smaller companies, which don’t get picked up on the IDBR (the Inter-Departmental Business Register – the sampling frame for the ONS data) because such businesses are not registered for VAT or PAYE purposes. In an area with a lot of small start-up companies, this could be a significant cause of the differences. The sample allocation for BRES was last reviewed 10 years ago. Sample re-allocation matters because the economy changes over time; the existing allocation may not reflect the current structure of the labour market. In OSR’s assessment of ONS’s labour market statistics we have required ONS to review and update the sample allocation (our Requirement 7a). We expect ONS to have acted upon our requirement by March 2021, with a formal update on a quarterly basis. In the meantime, ONS should publish an action plan setting out its proposals for addressing the Requirements.
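For readers unfamiliar with survey design, the short sketch below shows why an allocation fixed a decade ago can drift out of date. It is a textbook Neyman allocation over entirely hypothetical strata and figures – not ONS’s actual BRES design – but it illustrates how the allocation depends on the current size and variability of each part of the business population:

```python
# Illustrative sketch only: hypothetical strata and figures, not the real
# BRES design. Neyman allocation assigns more sample to strata that are
# larger or more variable, so it needs revisiting as the economy changes.

# Each stratum: (name, number of businesses N_h, std dev of employment S_h)
strata = [
    ("Large established firms", 5_000, 40.0),
    ("Small established firms", 80_000, 6.0),
    ("New start-ups",           30_000, 9.0),  # grows as the economy changes
]
total_sample = 10_000

# Neyman allocation: n_h proportional to N_h * S_h
weights = [n_h * s_h for _, n_h, s_h in strata]
total_weight = sum(weights)
for (name, n_h, s_h), w in zip(strata, weights):
    allocation = round(total_sample * w / total_weight)
    print(f"{name}: sample of {allocation}")
```

If the ‘new start-ups’ stratum has grown or become more variable over time, an allocation based on decade-old figures will under-sample exactly the businesses that CA believes are being missed.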

What has been instructive about ONS opening its data to scrutiny and challenge is that the engagement progressed beyond an attempt merely to reconcile differences in estimates. The challenge posed by the alternative estimates has been the impetus to investigate the reasons for the differences and to question whether the methods and checks currently used to produce the BRES estimates are adequate. The engagement has been successful in that it has been a key input to some of the requirements that OSR has made in its assessment of ONS’s labour market statistics, leading to better employment statistics not just for one region but for all regions of the UK.

ONS is keen to continue to engage with CA to help better understand the labour market in Cambridge and Peterborough. This work between ONS and CA underscores the importance of covering all three of Sir Charlie Bean’s ingredients of curiosity: improving the appreciation of the statistics in use, raising staff knowledge and strengthening quality assurance. This was always an exercise in taking advantage of additional data to see whether official statistics could be improved. In this respect, curiosity has proved that it has a very important reason to exist.

“An investment in knowledge pays the best interest” – Steps towards transparent fiscal statistics

Iain Russell, OSR theme lead for Economics, blogs on his recent regulatory work looking at the public value of devolved public finance statistics. This work also looked at the extent to which devolved investment statistics are seen as a priority for development in the UK’s devolved countries.

***

“An investment in knowledge pays the best interest” – Benjamin Franklin

***

“The field of behavioral economics analyzes imperfections in market decision-making, but the biggest practical problems often involve our inaccurate perceptions of what the public sector is up to and how much it will affect us.” – Tyler Cowen, New York Times


What have we been doing and why?

Last year, the OSR started thinking about the public value of statistics on devolved public finances, focusing on transparency and coherence. These statistics play a central role in public debate, but getting accurate perceptions of what the public sector is up to and how much it affects us can be challenging. We have captured our thoughts about the public value of these statistics in different ways – for example, through two presentations we prepared about the two phases of our work, through a YouTube interview, and in a regulatory report on our assessment of some of the key source statistics from HM Treasury on public spending in the countries and regions of the UK. All of these have been published today.

Mention the word ‘statistics’ in social settings and the MEGO (my eyes glaze over) factor starts to set in. Follow that up by dropping terms such as ‘tax revenues’, ‘spending data’ and ‘budgets’ into conversation and people head for the hills. So, on the face of it, a blog about regulatory work looking at the prospects for improving the transparency and coherence of statistics on tax, spend and investment in the devolved countries and regions may not initially sound that inviting. But we have seen just how important it is that government statistics clearly tell people what the public sector is up to and how much it affects us. Investment in transport infrastructure, replacing EU spending post-Brexit, student loans – these are all areas that affect people’s lives and those of future generations.

Whilst the UK’s and the devolved countries’ public finance data are regarded as among the world’s most transparent, on occasion those publishing the data can be victims of their own success: there is so much data that it can be difficult to separate the signal from the noise.

It’s a great time to take considered steps forward in our devolved public finance statistics.

New fiscal frameworks in Scotland and Wales present new challenges in communicating transparently to different audiences, mindful of the sensitive nature of the subject matter. Those producing both statistics and other financial data are very receptive to presenting their data in ways that are more meaningful to citizens, set against the priorities of the respective countries. Producing statistics in this space is more than churning out data – it’s a contact sport, where engagement with different communities is crucial to gaining trust and moving public discourse from method to meaning. The timescales for this are medium term at least, so official bodies need to step up engagement beyond government policy users to a wider range of potential users in think tanks, academia, voluntary bodies, fiscal analysts, offices for data analytics and the media.

Minimising harms, championing high standards.

People need to be able to use statistics with confidence, probably even more so when the topic stirs up strong views. Policy makers and users outside government need reliable information so that their policy decisions are sound. Citizens increasingly expect to ‘see themselves in the statistics’ and to have confidence that the statistics describing their experience of society are produced in a trustworthy way. But these ambitions are not always fulfilled. The public value of statistics can be harmed – for example, through a lack of insight arising from increasing volumes of data, or through poor coherence between data from different sources leading to confusion. Value can also be undermined through misuse. Our role in the Office for Statistics Regulation (OSR), as regulator of government statistics, is to minimise these harms. By minimising harms and championing high standards, we uphold public confidence in statistics that serve the public good.

Devolved public finances – a case of not judging a book by its cover.

People in the UK’s devolved countries and in English regions interested in economic development can become quite animated by decisions around spending and taxation relating to their country or their region. Here are some examples:

  • “The Treasury swallows up our pocket money, and when reluctantly it passes on part of it we [the people of Wales] are told to be grateful for such generosity” – ‘The Welsh Budget’, Phil Williams, 1998
  • “Northern Ireland is officially one of UK’s costliest regions as fiscal deficit hits £9bn…Treasury bemused as Stormont continues to spend beyond its means…” – Belfast Telegraph, 2014
  • “The 2017 GERS report should be published shortly..stand by for the incoming GERS denial as the SNP and Snats go into a frenzy attempting to explain away the deficit and create fantasies why Barnett doesn’t matter” UK Union Voice blog, August 2017
  • Devolved public finance data gives rise to vigorous debate beyond the UK’s countries and regions too – for example, economic considerations are thought to play a central role in the case for Catalan independence.

Why are devolved public finance statistics important to people?

People can be distrustful, suspecting that the government of the day will try to hide uncomfortable data. Their awareness of this has been heightened not only by private sector accounting scandals but also by the many news stories about off-book accounting and about what liabilities the public sector holds.

Constitutional change is a live topic in some of the UK’s devolved countries, where funding and spending statistics often get used to further the arguments for or against change. Public discourse can centre on the source data and the methods used to calculate the statistics. Apportionments of spending and of revenues raised are sometimes described as arbitrary. The terms used to describe a country’s or region’s financial position (‘fiscal surpluses’ and ‘fiscal deficits’) can sometimes hook defensive headlines and rebuttals.

What issues did our work highlight?

People see evidence not only of significant differences between the UK’s countries and regions, but sometimes of growing divergence. The Northern Powerhouse initiative underscores the imperative to address all parts of the UK according to their needs. Funding and fairness make for one of the liveliest debates in public discourse. So, far from being a dry topic, devolved public finance statistics are worthy of our attention and of investment in improvement.

Statisticians understandably need to tread a careful path.

Even very carefully prepared devolved public finance statistics can generate volumes of parliamentary and assembly questions. Possibly partly because of the difficulty of treading this careful line, we found that statistics around devolved tax and spend are data-heavy. Statistical commentary focuses mainly on changes to methods or data. Insight is left to external information brokers in think tanks and academic institutions.

Moving public discourse forward.

The public at large can’t assess whether Northern Ireland’s £9bn ‘deficit’ is good, bad or indifferent. The term ‘deficit’ is value-laden. What would be more useful to know is the sources and applications of funds, with the application of funds going beyond current spending to examine the assets and liabilities of the UK’s countries in proportion to their respective GDP. We urge that consideration be given to presentations of these statistics which encourage discussion of sustainable funding and spending against needs, rather than of deficits and surpluses.

Might we learn lessons from other areas of statistics?

Public concern over hospital patient safety has moved hospital death statistics from crude counts to sophisticated methods of calculating and presenting standardised hospital mortality data. Different hospitals face different circumstances, which are accounted for by periodically comparing actual hospital deaths with the number of expected deaths. The resulting ratio contains valuable information. Might moving devolved public finance reporting to a similarly standardised basis assist our understanding of the efficiency of the fiscal transfers between different parts of the UK, which exist to meet the significantly different needs of the various countries and regions?
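As a purely illustrative sketch with made-up numbers (not real hospital data), the calculation behind such a standardised ratio is straightforward: the observed count is compared with the count we would expect if the hospital’s own patient mix experienced a set of reference rates.

```python
# Illustrative sketch only, with made-up numbers: an indirectly standardised
# ratio of the kind used for hospital mortality (observed vs expected deaths).

# Hypothetical casemix: (admissions in this hospital, reference death rate)
age_bands = [
    (2_000, 0.010),  # younger patients, low reference mortality
    (1_500, 0.030),
    (500,   0.120),  # older patients, high reference mortality
]
observed_deaths = 110

# Expected deaths if this hospital's patients had faced the reference rates
expected_deaths = sum(admissions * rate for admissions, rate in age_bands)

# A ratio near 1.0 means outcomes in line with expectation, given the casemix
ratio = observed_deaths / expected_deaths
print(f"Expected {expected_deaths:.0f}, observed {observed_deaths}, ratio {ratio:.2f}")
```

A ratio close to one signals outcomes in line with expectation given the circumstances; an analogous standardisation of fiscal data would set each country’s or region’s funding and spending against its needs rather than against a crude average.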

The statistics are now more worthy of people’s trust.

It’s difficult to move some public debates beyond arguments about measurement methods to the interpretation of the data. There are trust issues to overcome first. For many years after the first publication of the Government Expenditure and Revenue Scotland (GERS) data, as in Northern Ireland, much of the discussion following publication arose from suspicions over the methods used to calculate the figures. In 2018, more than 10 years after the first GERS publication, a whole-page spread in one of Scotland’s national daily newspapers was given over to discussing the figures and what they meant.

Building trustworthiness through engagement leads to developing new dissemination channels and greater insights.

Out of this work we recommend that, as statistics producers build the trustworthiness of the statistics, they focus on developing their public value by thinking about how useful insights based on the statistics can be disseminated. They might, for example, engage with analytical bodies who could publish think pieces using the devolved public finance data; this could attract debate and discussion around topics related to devolved public finances. We also recommend that statistics producers consider new platforms for disseminating and discussing the public finance data, inviting methodological challenge, encouraging new uses and marrying up the data so that there is consistency between statistical data and budget data.

We have been encouraged by the innovation that we are seeing particularly in presenting budget information and how funds are being spent. We wish all those producing these statistics well in rising to the challenges of reaching the next milestones.