Hear about the Office for Environmental Protection’s journey of applying Trustworthiness, Quality and Value (TQV) from Darren Watson, their Principal Environmental Analyst, in this guest blog on embracing the Code of Practice for Statistics
How would a new organisation understand, and most importantly communicate, how – or indeed whether – the government is improving the natural environment? How does it make sense not only of the state of the environment but also the many policies and strategies across government that affect it?
These are the challenges that the Office for Environmental Protection (OEP) has had to confront since its creation a little over three years ago.
One way to examine progress is through statistics – in our case, using data such as the area of woodland or the number of water pollution incidents, to present trends in key environmental indicators.
Deciding what to measure and what to present in a concise, understandable way that provides value to stakeholders, and ultimately contributes to improving the environment, is tricky. The difficulty lies in the range of policies and strategies in place, government targets and commitments, the numerous sources of pollution and their wide-ranging impacts, and the priorities and concerns of our stakeholders. Just listing some of our stakeholders – parliamentarians and policy makers, policy influencers, the media and the public – has been enough to make my head spin.
The challenge of measuring progress was evident in our first annual report on government progress in improving the environment, which we laid before Parliament in January 2023. At that stage we presented 32 indicators to measure the environment in England.
The work doesn’t stop there, however. Our progress reports need to evolve in response to the environment and policy landscapes, so we can never stand still. Our team therefore challenges itself to continually improve our assessment, to provide greater value for our users and the environment, and to respond to stakeholder feedback.
So, while we use others’ statistics (from bodies like the Environment Agency), rather than produce them ourselves, we are committed to applying the same high standards to our analyses.
As such, we decided to voluntarily adopt the Code of Practice for Statistics when developing our second progress report. The Code sets the standards to which producers of accredited official statistics (previously called ‘National Statistics’) are held. It is, in short, the best way to hold ourselves and our assessment to account, through striving to meet its three pillars: Trustworthiness, Quality and Value (TQV).
The most visible innovation in our application of TQV was our first Methodological Statement, published alongside our second progress report. At 95 pages, it was not lacking detail. But producing a report is the easy part; it is then how it is used, and the value it provides, that is most important.
So, a particularly proud moment for the team that produced the Methodological Statement came when our chair used it at the House of Lords Environment and Climate Change Committee to demonstrate the robustness of our recommendations. A tick in the box for trustworthiness and transparency.
But there is another example of our more fundamental use of TQV and its true value. And this goes back to our indicators.
Because the environment, the factors affecting it, the needs of our stakeholders and government itself are all dynamic, our indicators must be too. They must adapt, and this is where using TQV – particularly the Q and the V – is key.
Our assessment process does not stop with the publication of our latest progress report every January. Following publication, we take stock of our progress and review and consult on our assessments. We also review those indicators, which is where the lenses of Quality and Value give us an ideal framework through which to challenge ourselves and be challenged by our stakeholders.
For Value, we ask ourselves:
- Are our indicators still relevant to users?
- Do they, as a whole and individually, provide clarity and insight without making our assessment inaccessible?
- And do they improve and enhance our understanding, and our users’ understanding, of the environment?
For Quality, we consider:
- Are we still using the most suitable data sources?
- Have the data producers changed anything or stopped updating their statistics?
- Are our methods still sound and reflective of best practice?
- And how can we improve our data handling and quality assurance?
This is quite a list. But practically, the QV challenge has enabled the evolution of our indicators, with 23 new indicators included and 16 amendments made to existing indicators through our second and third reports. This work is driven by our vision to better understand those numerous factors that determine progress in improving the environment and to provide greater value and quality for our users. We hope this effort is demonstrated through the increasing awareness and influence of our assessments and their recommendations.
Our commitment to TQV and continuous improvement goes beyond this work. We are using TQV to examine how we assess trends and whether more statistically robust methods are available. We are also building an improved data system to increase the accuracy, quality and speed of our trend assessments, whilst supporting our ambitions to present data more effectively, including using maps.
So, to conclude: the OEP aims to be a trusted voice, and we are committed to being an evidence-led organisation that holds government and other public authorities to account. What better way to support this than to adopt TQV and show the users of our reports the effort we take to meet those aims?
Oh, and please take a look at our reports – we really do value feedback.