The Code, The Key and (for fans of 90s dance music) The Secret

In our latest guest blog, Paul Matthews, Head of Profession for Statistics in Scottish Government, responsible for the capability and capacity of the statistics profession, talks about his passion for improvement and how the statistical system can make the statistics it produces better and more impactful. This blog coincides with the closing of our consultation on proposed changes to the Code of Practice for Statistics; we plan to present our findings in the coming months.

I hear a lot of myths about the Code of Practice for Statistics. I hear things like:

  • ‘I know [insert topic here] is very relevant at the moment, but we haven’t preannounced so we can’t publish for at least 4 weeks, because that’s what the Code says’, or
  • ‘We will have issues with the Code of Practice and trustworthiness if we break a time series’, or
  • ‘We need to publish this as a management information release because the Code won’t allow us to publish as official statistics due to quality’.

In these examples, we are thinking of the Code as telling us what we can’t do. I’m not sure why that is. Maybe we tend to think of it as the rule book that we must obey. Maybe it’s because having the ‘rules’ is comforting for us as statistics producers and can give us a defence if we are challenged.

A key, not a lock

Rather than seeing the Code as telling us what we can’t do, I see it as an enabler to tell us what we can. In other words, it is a key that facilitates the practical release of statistics that provide value for society rather than a lock that prevents us from being responsive and innovative. And this is equally true for the existing version of the Code of Practice and the draft Code 3.0.

Thinking of the Code as a key isn’t carte blanche for us to do whatever we want. There are still risks we need to work through. But in my experience, the Code tends to support sensible, pragmatic things for users that help build trust and transparency, rather than being about protocol for protocol’s sake.

Using the Code as a key

I spent a lot of time looking at the Code of Practice when I developed statistical strategic priorities for the Scottish Government Statistics Group. The priorities are about how we can improve statistical work to focus on what provides the greatest value in producing statistics for the public good. It means that there are things we will need to deprioritise given our finite resources.

Much of this is informed by the enabling nature of the Code. For example:

  • Using user engagement to understand what users want and what we can discontinue or deprioritise, and being transparent with analysis plans to convey what we’re doing.
  • Greater clarity and impact of communications to enable publications to be focused and streamlined.
  • Greater use of data sources where timeliness trades off against accuracy, or greater use of granular-level data where appropriate to provide useful new analysis that is fit for purpose for users’ needs.

We have had great support and advocacy for what we’re trying to do in Scotland from everyone in OSR, and it gives us confidence that how we’re innovating is in line with how the Code was designed. As Ed Humpherson said in his response to us on the priorities:

“We support your approach and there are several features that we regard as best practice, including the identification and communication of priorities for each analytical area; the involvement of users; and the openness about the potential for suspensions or changes to some of your current outputs… we certainly would not want to require you to keep all of your current range of statistical outputs if they were no longer aligning with user need”.

When Code 3.0 is finalised, all statistics producers should read it carefully and use it as a key to enable opportunities in the statistics they produce.

That’s the secret, after all!

Embracing Challenge for Change

Hear about the Office for Environmental Protection’s journey of applying Trustworthiness, Quality and Value (TQV) from Darren Watson, their Principal Environmental Analyst, in this guest blog on embracing the Code of Practice for Statistics.

How would a new organisation understand, and most importantly communicate, how – or indeed whether – the government is improving the natural environment? How does it make sense not only of the state of the environment but also the many policies and strategies across government that affect it?

These are the challenges that the Office for Environmental Protection (OEP) has had to confront since its creation a little over 3 years ago.

One way to examine progress is through statistics – in our case, using data such as the area of woodland or the number of water pollution incidents to present trends in key environmental indicators.

Deciding what to measure and what to present in a concise, understandable way that provides value to stakeholders, and ultimately contributes to improving the environment, is tricky. The difficulty lies in the range of policies and strategies in place, government targets and commitments, the numerous sources of pollution and their wide-ranging impacts, and the priorities and concerns of our stakeholders. Just listing some of our stakeholders – parliamentarians and policy makers, policy influencers, the media and the public – has been enough to make my head spin.

The challenge of measuring progress was evident in our first annual report on government progress in improving the environment, which we laid before Parliament in January 2023. At that stage we presented 32 indicators to measure the environment in England.

The work doesn’t stop there, however. Our progress reports need to evolve in response to the environment and policy landscapes, so we can never stand still. Our team therefore challenges itself to continually improve our assessment, to provide greater value for our users and the environment, and to respond to stakeholder feedback.

So, while we use others’ statistics (from bodies like the Environment Agency), rather than produce them ourselves, we are committed to applying the same high standards to our analyses.

As such, we decided to voluntarily adopt the Code of Practice for Statistics when developing our second progress report. The Code sets the standards to which producers of accredited official statistics (previously called ‘National Statistics’) are held. It is, in short, the best way to hold ourselves and our assessment to account, through striving to meet its three pillars: Trustworthiness, Quality and Value (TQV).

The most visible innovation in our application of TQV was our first Methodological Statement, published alongside our second progress report. At 95 pages, it was not lacking in detail. But producing a report is the easy part; how it is then used, and the value it provides, is what matters most.

So, a particularly proud moment for the team that produced the Methodological Statement came when our chair used it at the House of Lords Environment and Climate Change Committee to demonstrate the robustness of our recommendations. A tick in the box for trustworthiness and transparency.

But there is another example of our more fundamental use of TQV and its true value. And this goes back to our indicators.

Because the environment, the factors affecting it, our stakeholder needs and government are dynamic, our indicators must be too. They must adapt, and this is where using TQV – particularly the Q and the V – is key.

Our assessment process does not stop with the publication of our latest progress report every January. Following publication, we take stock of our progress and review and consult on our assessments. We also review those indicators, which is where the lenses of Quality and Value give us an ideal framework through which to challenge ourselves and be challenged by our stakeholders.

For Value, we ask ourselves:

Are our indicators still relevant to users? Do they, as a whole and individually, provide clarity and insight without making our assessment inaccessible? And do they improve and enhance our understanding, and our users’ understanding, of the environment?

For Quality, we consider:

Are we still using the most suitable data sources? Have the data producers changed anything or stopped updating their statistics? Are our methods still sound and reflective of best practice? And how can we improve our data handling and quality assurance?

This is quite a list. But practically, the QV challenge has enabled the evolution of our indicators, with 23 new indicators included and 16 amendments made to existing indicators through our second and third reports. This work is driven by our vision to better understand those numerous factors that determine progress in improving the environment and to provide greater value and quality for our users. We hope this effort is demonstrated through the increasing awareness and influence of our assessments and their recommendations.

Our commitment to TQV and continuous improvement goes beyond this work. We are using TQV to examine how we assess trends and whether more statistically robust methods are available. We are also building an improved data system to increase the accuracy, quality and speed of our trend assessments, whilst supporting our ambitions to present data more effectively, including using maps.
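For readers wondering what a more statistically robust trend assessment can look like in practice, here is a minimal, purely illustrative sketch – not the OEP’s actual method, and with invented indicator values. It pairs a Theil-Sen slope estimate (a median-based trend that is resistant to outliers) with Kendall’s tau as a non-parametric check for a monotonic trend:

```python
# Illustrative only: a robust trend check for an annual environmental indicator.
# The values below are invented for demonstration purposes.
import numpy as np
from scipy import stats

years = np.arange(2015, 2025)
incidents = np.array([512, 498, 505, 471, 466, 450, 462, 431, 425, 418])

# Theil-Sen estimator: the median of pairwise slopes, robust to outliers,
# with a 95% confidence interval on the slope.
slope, intercept, lo, hi = stats.theilslopes(incidents, years, alpha=0.95)

# Kendall's tau: a non-parametric test for a monotonic trend over time.
tau, p_value = stats.kendalltau(years, incidents)

print(f"Estimated change per year: {slope:.1f} (95% CI {lo:.1f} to {hi:.1f})")
print(f"Kendall's tau = {tau:.2f}, p = {p_value:.3f}")
```

The appeal of this kind of approach is that a single unusual year does not drag the estimated trend around, and the confidence interval makes the uncertainty in the trend explicit rather than leaving it implied.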

So, to conclude: the OEP aims to be a trusted voice, and we are committed to being an evidence-led organisation that holds government and other public authorities to account. What better way to support this than to adopt TQV and show the users of our reports the effort we take to meet those aims?

Oh, and please take a look at our reports – we really do value feedback.

Beyond GDP: Redefining Economic Progress

Gross domestic product (GDP) is invariably a focus of debate about the state of the economy and whether this country, or any other, is making progress.

Yet it has its critics. Some complain that the focus on GDP as the single measure of progress distorts our priorities. They argue that GDP blinds us to many other important ways in which society should flourish.

There are indeed good grounds for thinking GDP is an incomplete measure of societal progress. Case in point: it has up to now omitted the depletion of the natural world, though this will be addressed in the upcoming updates to the international standards for the compilation of GDP – the new System of National Accounts WS.6 covers depletion.

Moreover, GDP does not capture lots of types of worthwhile activity that are not paid for (caring for a relative, for example) but does capture activities that are not worthwhile (like trading in illicit drugs).

Over 2024, we saw the maturing of several endeavours to address this issue. They fall into two camps. The first involves using the framework of GDP (or national accounts, to be more precise) to create a more comprehensive measure of growth and wealth. The second looks to develop adjacent measures that focus more directly on well-being and prosperity.

Let’s begin with approaches that focus on GDP.

One solution to this problem is to enrich the idea of GDP – to measure more things within the concept of the economy, like income, capital and growth. For example, GDP could measure the value of work done in the house and could incorporate the depletion of the natural world. A recent speech by the UK Statistics Authority Chair, Sir Robert Chote, highlights international work to widen what is captured as capital in economic measurement, and in particular to include natural capital.

A good example of an attempt to enrich GDP is the ONS’s recent inclusive income release. It supplements the standard GDP measures of output with measures of unpaid work, the costs of depleting the natural world and some elements of cultural wealth. The ONS summarises it well:

“Inclusive income estimates provide a broader measure of the economic welfare of the UK population. They reflect the economic value of both paid activity, included in gross domestic product (GDP), and unpaid activity, which includes ecosystem services and unpaid household services. The result is measures of economic progress that include activity and assets beyond those currently included in GDP.”

It’s an interesting endeavour, and provides some notable insights. For example, unlike GDP per person, inclusive income per person has not yet returned to its pre-pandemic peak. In short, I applaud the ONS’s ambition in taking on this difficult and methodologically challenging work – though it has initiated a lot of debate within OSR, which my colleague Jonathan Price will highlight in a subsequent blog.
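To make the idea concrete, here is a minimal sketch of the arithmetic such a measure implies. The function, parameter names and figures are invented for illustration and simplify the ONS’s actual methodology considerably:

```python
# Illustrative only: the basic arithmetic of a broader, "inclusive income" style
# measure. All figures are invented; this is a simplification, not the ONS method.

def broader_income_per_person(
    gdp_bn: float,                        # conventional GDP, in £bn
    unpaid_household_services_bn: float,  # estimated value of unpaid work, £bn
    ecosystem_services_bn: float,         # estimated value of non-market ecosystem services, £bn
    natural_capital_depletion_bn: float,  # estimated cost of depleting natural assets, £bn
    population_m: float,                  # population, in millions
) -> float:
    """Add unpaid activity to GDP, net off depletion, and express per person in £."""
    total_bn = (gdp_bn
                + unpaid_household_services_bn
                + ecosystem_services_bn
                - natural_capital_depletion_bn)
    return total_bn * 1_000 / population_m  # £bn over millions of people -> £ per person

# Invented example: the broader measure versus GDP alone
print(f"Broader measure: £{broader_income_per_person(2500, 1100, 40, 25, 68.0):,.0f} per person")
print(f"GDP alone:       £{2500 * 1_000 / 68.0:,.0f} per person")
```

The point is not the numbers but the shape of the calculation: activity that GDP misses is added in, and the running-down of natural assets is netted off, which is why the two series can tell different stories about progress.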

The second approach suggests that we should keep our focus on GDP more or less as it is (subject to the usual improvements and use of better data sources, as well as the communication of uncertainty; see our GDP review on this). Instead of extending the framework, it proposes supplementing it with meaningful alternative measures, including personal well-being measures, which focus on the things that GDP does not capture well. The ONS in fact provides an important foundation for these alternatives with its Measures of National Well-being.

A great example of the personal well-being approach is provided by Pro Bono Economics (PBE)’s recent report on the state of well-being in the UK, which estimates the number of people in the UK with low well-being. The report highlights what can only be described as a crisis of low well-being in the UK. (And full disclosure: I am a trustee of PBE).

The PBE report is not the only work that focuses on non-GDP measures of well-being:

  • Along similar lines, the BeeWell project has proposed measuring children’s well-being, and has been implemented in Manchester, Hampshire and the Isle of Wight.
  • Carnegie UK’s Life in the UK provides a comprehensive overview of well-being. It extends the analysis from personal well-being to broader societal well-being, including perceptions of democratic health.
  • Complementing this UK-level perspective, the Global Prosperity Institute’s work is also noteworthy. It is more granular and micro, considering the prosperity of small areas using a citizen research approach. Its application to areas of East London is rich in insights into the experience of redevelopment.

What these various outputs show is that the “Beyond GDP” space is maturing. The ONS is doing some thoughtful, innovative things to extend the framework of national accounts. And a plethora of independent approaches are emerging.

So I begin this year optimistically.

Could 2025 be the year Beyond GDP moves from being a slogan to a reality for policymakers, Parliament, media, and, most importantly, citizens?

Analyse your potential: Internship and placement opportunities at OSR

There are many internship and student placement opportunities available throughout the Civil Service, and in the Office for Statistics Regulation (OSR) we try to give our interns a comprehensive, rounded experience to add to their skills and development, while ensuring they are treated as part of the team. In our latest blog, we talk to some recent placement students and interns about their experiences working at OSR.

Izzy – Government Statistical Service (GSS) Sandwich Year Placement, now working at HM Treasury

I did a sandwich year placement at OSR as part of my Philosophy, Politics and Economics degree at the University of Leeds. The world of statistics regulation was not one I had previously been acquainted with, so I was very much going in blind.

I think one of the most eye-opening things I learnt at OSR was just how easy it is for statistics to be misused and misinterpreted! I came to understand that there is so much more to a statistic than just the number in front of you, and how important it is to be critical in your interpretation of data and evidence. This was a really useful lesson to learn, and definitely something I’ve taken forward in my final year of university, and beyond.

My placement highlight was definitely playing a leading role in OSR’s systemic review of poverty statistics. I got to work with departments across Whitehall, as well as interview a range of stakeholders including leading charities, think tanks and researchers. As a university student, it was extremely cool to be speaking to people whose papers I had spent my degree reading! It was also fascinating to learn about the complexities of measuring poverty, and to make a meaningful contribution towards improving how this is done within a space I care a lot about.

Since finishing university, I’ve started working in HM Treasury as a Policy Advisor specialising in economic risks. In my short time there, I’ve already found my year at OSR to have come in incredibly handy. I’ve learnt to engage much more critically with statistics and data when I use them in my work, which, as a policy advisor, is a really vital skill. I’ve also managed to hold my own in quite a few heated debates on inflation measurement, which is definitely not something I could have said before…!

I think statistics are hugely important for allowing the public to hold government to account for the decisions it takes. They enable people to understand and assess the motivations behind policy decisions that have a direct impact on their lives – as well as how effective those decisions ultimately are. In that way, statistics are a really crucial link between politics and the public – which is why OSR has such an important role to play in making that process as transparent as possible.

Ewan – The Government Economic Service Sandwich Student Placement Scheme

I’m currently partway through my placement year at OSR, which sits between years of my Economics degree at the University of Bath. I’ve been working in the Economy and Business, Trade and International Development domains.

My highlight of this year had to be working on casework. No specific case comes to mind; rather, I compare it to being a statistics detective. Hunting around, trying to track down the sources behind a claim, never loses its appeal, and it is a great feeling to find the smoking gun – the source or figure behind the claim being made. Lastly, the work feels like it has a direct impact: by regulating the use of statistics, OSR builds confidence in the statistics used in the public domain.

I wasn’t aware of OSR before my placement started, but I quickly understood how important it was that both the production and the use of statistics are kept to a high standard. It’s important to consider how statistics are used and the potential damage from their misuse.

The world is complicated, and people aren’t omniscient. Statistics are a way of depicting vast amounts of information in a clear, understandable way. But statistics aren’t simply nice little descriptions of the world to be quoted at pubs. Statistics can make or break public policy proposals. The public should have as much information as possible to make informed decisions that best shape our future, and this requires accurate and timely statistics which the public can trust. Nor are statistics used only for public policy – they are intertwined with our daily lives. People use statistics to decide where their kids go to school or whether to stop smoking. It is therefore important that statistics are produced and presented to a high standard.

Following my placement, I will go back to university for the final year of my degree. From there, I plan to apply for the Government Economic Service fast track scheme. I’ve really enjoyed my time at OSR – everyone has made me feel so welcome, and every day has felt different. The work varies quite substantially, and I rarely repeat a task. There is always something new to do, and the work rarely feels monotonous.

OSR has helped me develop as a person, both professionally and academically. I have no doubt that the skills I have gained here will benefit me greatly in my academic and professional career. OSR has shown me that I want to work in the public sector and have a positive impact on the lives of the general public.

Martin – The Summer Diversity Internship Programme (SDIP) (now the Summer Internship Programme (SIP))

Working at OSR was a great way to see first-hand how statistics are regulated. I gained an appreciation for how useful statistics are for users and the importance that they are trustworthy, made with quality and provide public value. I was involved in many projects whilst at OSR and whilst the work was challenging, I found it incredibly rewarding.

At first, I was nervous about working in an office environment, especially one focussed on statistics, as I graduated in film and philosophy. However, I quickly found out that OSR also provides roles that are more analytical than number-heavy, which I was very happy about. On arrival at OSR I was assigned a great line manager, Grace, who inducted me into the organisation and introduced me to the team. I was very lucky, as Grace was glad to answer any questions I had. I was also assigned a buddy, Ewan, who helped me settle in and answered my questions.

I completed SMART (Specific, Measurable, Achievable, Relevant, and Time-Bound) targets during my time at OSR. For example, I compiled a data evidence list in which I assessed and recorded whether sources contained specific information. This was an interesting task as it allowed me to see what information each source contained and built my experience in using office tools. I also constructed a survey on development and wellbeing within the department, which allowed me to gain an understanding of the thoughts and feelings of those working in OSR and assess how this could be improved. These tasks were useful as they provided me with transferable skills such as ICT use, project management and analytical skills.

Working at OSR also offers opportunities to work with other government departments. After asking my line manager if I could do some work relating to the Ministry of Justice (MOJ), I was given the opportunity to work with Ben and Job from OSR’s crime and justice domain on a compliance check of MOJ’s statistics on Women and the Criminal Justice System and an assessment of Scottish prison population statistics. I found this work very interesting and am very grateful to have been given the opportunity to do it.

Even though SDIP is online-based, I was able to visit the Newport office and both London offices, which gave me the opportunity to meet some of my colleagues in person and get an immersive, day-in-the-life experience of the job.

Overall, I would say the experience was highly positive, as I feel I have improved my skillset since starting at OSR. Everyone being so kind and friendly has also made it an experience that I won’t forget.


OSR is always keen to hear people’s views on statistics and how they are used. To get in touch with us, or just to stay up to date with our work, follow us on Twitter and LinkedIn, and sign up for our monthly newsletter.

What 2023 means for OSR

In our latest blog, Director General Ed Humpherson reflects upon the past year and sets out OSR’s priorities for the coming year. We are inviting feedback on these in order to develop our final 2023/24 Business Plan. If you have views on what the key issues in statistics are, please email regulation@statistics.gov.uk 

As the days shorten and the end of the year looms, in OSR we start turning our attention to the next financial year, starting in April 2023: to what we will focus on, what we want to achieve, what our ambitions are.  

This is a reflective and deliberative process: we don’t finalise our business plan for the 2023-24 year until April itself. And it’s also a consultative process: we want to hear from stakeholders about where you think we should focus our attention. 

How we develop our priorities

There are a range of inputs into our thinking. We start from our five-year strategic plan, which sets out broadly where we want the statistical system to be by 2025, and how we will help the system get there. We form a judgement as to how close the system is to achieving this vision, and what we should focus on to help support it.

We also draw on our annual state of the statistical system report (most recently published in July), considering what it tells us about the positive things in statistics we want to nurture and the challenges we want to address. And we take stock of what we’ve achieved over the last 12 months: whether, in effect, we’ve successfully delivered last year’s priorities.

But there are two further aspects to our business plan that are crucial. First, we can’t do everything. We are a small organisation, and necessarily we have to make prioritisation choices about where we can most effectively support the public good of statistics. And second, we can’t plan for everything. We have to respond to those areas that are salient in public debate, where statistics are helping or hindering public understanding. And we can’t always predict very well what those issues might be. To take an obvious example, our planning for 2020-21, taking place around this time three years ago, did not anticipate that we’d spend most of 2020-21 looking at statistics and data related to the pandemic. 

To help us make these choices, and to be better at anticipating what might be just over the horizon, we would like the input, advice and creativity of our stakeholders: of people who care about statistics and data, and who want to help us be as effective as we can be. 

Our 23/24 priorities

Today I am pleased to share with you our draft priorities for 2023/24. These are deliberately high level. We have not chosen our menu, selected our ingredients, and already got the cake ready for the oven. We want to leave lots of room for people outside OSR to make suggestions and raise questions. 

Our plan for next year has four high level priorities, all focused on different aspects of supporting change and transformation: 

  • Support and challenge producers to innovate, collaborate and build resilience 
  • Champion the effective communication of statistics to support society’s key information needs 
  • Continue to widen our reach beyond official statistics and official statistics producers 
  • Increase our capability as a regulator 

The keen observers among you might note that these are an evolution of last year’s priorities, rather than a wholesale change. We have had a good year, for sure; but we will always strive to do more.

Please get in contact with us at regulation@statistics.gov.uk to let us know your thoughts and questions about these priorities, or if you would like a wider discussion with us.