The trouble with measuring poverty

Since this blog was first published, we have released our Review of Income-based Poverty Statistics.

What does it mean to be in poverty? It’s a question that has been debated for a long time and is one of the reasons why measuring poverty is so difficult. There are many interest groups and think tanks who have covered this issue time and time again, such as the Joseph Rowntree Foundation and Full Fact.

The concept of poverty means different things to different people and, to some extent, requires a judgement call as to where to draw the poverty line. Generally speaking, being in poverty refers to when people lack the financial resources to meet their basic needs.

While it may be difficult to define, it is important for central and local governments to understand the prevalence and nature of poverty in the areas they serve so that they can put targeted support in place. This blog looks at what data is out there to measure poverty and highlights the work being done to improve the future evidence base on poverty.

So what is the best measure of poverty?

There is no right or wrong measure of poverty. Different measures of poverty capture different things, and trends in these measures can vary over time.

No single figure about poverty tells the whole story so context is really important when drawing comparisons of poverty over time.

There are four commonly used income-based measures of poverty produced annually by the Department for Work and Pensions (DWP) in its Households Below Average Income (HBAI) National Statistics publication:

  • Relative poverty (relative low income) – households with less than 60% of contemporary median income
  • Absolute poverty (absolute low income) – households with less than 60% of the 2010/11 median income, held constant in real terms

Both relative and absolute poverty can be measured on a before housing costs (BHC) or after housing costs (AHC) basis, giving four measures in total.
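The distinction between the relative and absolute thresholds can be illustrated with a minimal sketch. The incomes, base-year median and inflation factor below are made-up illustrative figures, not real HBAI data; the official statistics use equivalised household income from weighted survey data.

```python
# Illustrative sketch of the relative vs absolute low-income definitions.
# All numbers are invented weekly household incomes, for illustration only.

from statistics import median

def relative_poverty_line(incomes):
    """60% of the contemporary median income."""
    return 0.6 * median(incomes)

def absolute_poverty_line(base_year_median, inflation_factor):
    """60% of the base-year (2010/11) median, held constant in real terms
    by uprating it with inflation."""
    return 0.6 * base_year_median * inflation_factor

incomes_2019 = [180, 250, 310, 400, 450, 520, 610, 700, 850, 1200]

rel_line = relative_poverty_line(incomes_2019)   # 60% of this year's median
abs_line = absolute_poverty_line(420, 1.15)      # 60% of 2010/11 median, uprated

in_relative_poverty = [i for i in incomes_2019 if i < rel_line]
in_absolute_poverty = [i for i in incomes_2019 if i < abs_line]

print(f"Relative line: {rel_line}, households below: {len(in_relative_poverty)}")
print(f"Absolute line: {abs_line:.1f}, households below: {len(in_absolute_poverty)}")
```

The key point the sketch makes is that the relative line moves with the contemporary median, while the absolute line is anchored to 2010/11, so the two measures can move in different directions over time.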

These four measures are published for children, pensioners, working-age adults and all individuals. The charts below show the latest figures for children and for all individuals. Across all measures, the number of children in poverty has increased since 2010/11. For all individuals, the picture is more complicated: the number in absolute poverty has decreased over this period (by 100,000 individuals both before and after housing costs), whilst the number in relative poverty has increased (from 9.8 million to 11 million before housing costs, and from 13 million to 14.5 million after housing costs).

Chart showing the estimated number of children in relative and absolute poverty, before and after housing costs, UK

Source: DWP Households below average income, 1994/95 to 2018/19

Chart showing the estimated number of individuals in relative and absolute poverty, before and after housing costs, UK

Source: DWP Households below average income, 1994/95 to 2018/19

As well as these four measures, DWP produces statistics on material deprivation. This is where an individual or household cannot afford certain necessities and activities, measured against a basket of goods.

The DWP publishes estimates of the number of children falling below thresholds of low income and material deprivation in its HBAI statistics. The questions underpinning this measure were updated in 2010/11 and the DWP is clear that figures from the old and new suite of questions are not comparable. Since 2010/11, the number of children falling below thresholds of low income and material deprivation has fallen by 200,000.

Chart showing the estimated number of children falling below thresholds of low income and material deprivation, UK

Source: DWP Households below average income, 1994/95 to 2018/19

Material deprivation on its own is not widely used as a measure of poverty as it is not designed to measure low income. However, the combined measure of low income and material deprivation offers a wider measure of people’s living standards which can be used to look at elements of persistent poverty. This measure was the basis of one of the targets set in the Child Poverty Act 2010 aimed at reducing child poverty.

Outside the world of official statistics, there is another measure of poverty produced by the Social Metrics Commission (SMC). The SMC is an independent group of experts formed to develop a new approach to poverty measurement that both better reflects the nature and experiences of poverty that different families in the UK have, and can be used to build a consensus around poverty measurement and action in the UK.

The SMC has published its poverty measure since 2018. It is regarded by many as the most comprehensive measure of poverty available, as it covers the depth, persistence and lived experience of poverty.

What more can be done to improve the evidence base on poverty?

The SMC has been working with the DWP to publish experimental statistics in 2020 that will take the current SMC measure and assess whether and how it can be developed and improved further, to increase the value of these statistics to the public.

These experimental statistics will be published in addition to the HBAI publication, which will continue to produce the four recognised income-based measures of poverty highlighted earlier. The work on developing these statistics has been paused due to the Covid-19 pandemic but the DWP remains committed to carrying out this work.

Poverty remains a significant issue for the UK and has the potential to be of greater importance as we adjust to life following Covid-19. This is why we are launching a systemic review on the coherence of poverty statistics in Autumn 2020.

We will provide more information on the scope of the systemic review on our website later this year and we look forward to engaging with the public to understand how the quality and public value of official statistics on poverty can be improved, to help facilitate open and fair public debate.

The fact that there are different ways of measuring poverty should help build the bigger picture on poverty in the UK and should not be used as an excuse to be selective with data to support only part of the story. This is something the Chair of the UK Statistics Authority commented on back in 2017, when referring to the then Prime Minister’s comments on child poverty:

“We do, however, feel that public debate would be enhanced if the Government indicated more clearly which measure or measures it places greatest weight on and that it was consistent in reporting progress against this measure. It is unhelpful if there is regular switching between what constitutes the key measure.”

Measuring poverty is complicated. There is no wrong measure but there is a wrong way of using the available measures – and that is to pick and choose which statistics to use based on what best suits the argument you happen to be making. It is important to look at all the data available and set the context when referring to statistics on poverty.

Joining Up Data for Better Statistics

To speak to people involved in linking Government datasets is to enter a world that at times seems so ludicrous as to be Kafkaesque. Stories abound of Departments putting up arcane barriers to sharing their data with other parts of Government; of a request from one public sector body being treated as a Freedom of Information request by another; and of researchers who have to wait so long to get access to data that their research funding runs out before they can even start work.

Our report, Joining Up Data for Better Statistics, published today, was informed by these experiences and more.

The tragedy is that it doesn’t have to be this way. We encountered excellent cases where data are shared to provide new and powerful insights – for example, on where to put defibrillators to save most lives; how to target energy efficiency programmes to reduce fuel poverty; which university courses lead to higher earnings after graduation. These sorts of insight are only possible through joining up data from different sources. The examples show the value that comes from linking up data sets.

This points to a gap between what’s possible in terms of valuable insights, especially now the Digital Economy Act creates new legal gateways for sharing and linking data, and the patchy results on the ground.

It leads us to conclude that value is being squandered because data linkage is too hard and too rare.

We want to turn this on its head, and make data linkage much less frustrating. We point to six outcomes that we see as essential to support high quality linkage and analysis, with robust safeguards to maintain privacy, carried out by trustworthy organisations including the Office for National Statistics (ONS) and government Departments. The six outcomes are that:

  • Government demonstrates its trustworthiness to share and link data through robust data safeguarding and clear public communication
  • Data sharing and linkage help to answer society’s important questions
  • Data sharing decisions are ethical, timely, proportionate and transparent
  • Project proposal assessments are robust, efficient and transparent
  • Data are documented adequately, quality assessed and continuously improved
  • Analysts have the skills and resources needed to carry out high-quality data linkage and analysis

The report seeks to make things better. The six outcomes are the underpinnings of this. The report supports them with recommendations designed to help foster this new, better environment for trustworthy data linkage. The good news is that there is a strong coalition of organisations and leaders wanting to take this forward both inside and outside Government. This includes the National Statistician and his team at ONS, strong data linkage networks in Scotland, Wales and Northern Ireland, and new bodies like the Centre for Data Ethics and Innovation, UK Research and Innovation and the Ada Lovelace Institute. Alongside this blog we’re publishing a blog from Jeni Tennison, CEO of the Open Data Institute, which shows the strong support for this agenda outside Government.

We want statistical experts in Government, and those who lead their organisations, to achieve the six outcomes. When they do so, they will ensure that opportunities are no longer squandered. And the brilliant and valuable examples we highlight will no longer be the exception: analysts will be empowered to see data linkage as a core part of their toolkit for delivering insights.

Overcoming barriers to change

I work in the Children, Education and Skills domain in the Office for Statistics Regulation (OSR). Not long after I joined OSR, a colleague working with me in this area suggested we carry out a review of innovation and improvement, prompted by the interesting developments we were starting to hear about in these statistics. Our aim was to find out more about the great work going on in producer teams across the UK to comply with the new innovation principle in the refreshed Code of Practice for Statistics. Ultimately, we thought this would be a practical way to encourage others along the journey of improving their statistics to meet user needs.

I have personally learned a lot along the way, both about the statistics in this area and about being more innovative. In our previous posts we have explored the importance of collaboration, and shared some ideas to think about when producing alternatives to traditional PDF statistics releases.

However, my previous experience working in the Government Statistics Service (GSS) meant that I wasn’t surprised to hear that it hasn’t all been plain sailing. I hope some of the suggestions we’ve picked up during this review might help you to overcome some of the challenges you face when innovating in the development and presentation of statistics.

Making the time

Producers told us that time and resource were barriers to implementing change, and yet we heard some great examples of how this is being tackled. We heard about how teams have broken down larger scale improvement projects into manageable chunks to tackle at quieter times of the production process, and about how producers are working across Departmental barriers to share resource. We heard about statisticians working with operational researchers, IT professionals and others in their organisation with an interest in coding to develop outputs together. We also heard examples of different departments sharing coding skills via the Government Data Science Slack group, and about teams getting together with those in other Departments to share lessons learned. All of this helps build capability and share knowledge, as well as saving time.

Getting your innovations out there

We heard that once producers have freed up the resource and developed the skills to produce alternative outputs such as dashboards and interactive displays of statistics, there can be big hurdles to getting these outputs into the public domain. This is often because of the Government Digital Service’s (GDS) restrictions on the type of content that can be hosted on GOV.UK, and on the development of alternative platforms to GOV.UK. At OSR, we recognise that while GDS restrictions are in place for good reasons, they can be a barrier to innovation in the presentation and dissemination of statistics.

We’ve found that some teams have worked with GDS to publish experimental tools, for example to collect evidence of user needs for their new outputs. If, like many teams we heard from, you need support or advice on developing alternative dissemination platforms to host interactive content, you could contact the Chair of the Presentation and Dissemination Committee’s Web Dissemination sub-group. We will also look at ways OSR can further support this work.

Meeting user needs

As part of the review, we spoke to users of some of the statistics to find out how the changes had impacted them. When users had been involved in the developments, they were usually full of praise for the changes. However, in some cases the users weren’t aware of the projects, and didn’t understand why outputs had changed. We noticed there was often no public statement or easily accessible information about the plans, the aims or timescale of the innovation project.

A transparent approach would keep users up to date with planned changes and allow producers to share information on any likely impact on the statistics – such as how a change in methodology or data collection may affect their quality or consistency – and the steps being taken to address this. It would also open the door to wider user feedback and input during the development stages of an improvement project. Thinking more about how you tell users about changes and developments may help with Code compliance, but more importantly, it is likely to help you make the most of the opportunity to improve your statistics.

Get in touch

At OSR we are keen to support producers in understanding the requirements of the Code, and we hope this series of blogs and articles will go some way towards showing what the new Innovation and Improvement principle means in practice. We also hope it has given you an idea of the types of good practice we might hope to see when assessing statistics. While this is the last planned web piece for this review, we are keen to explore with the GSS Good Practice Team other ways to share the things we learned. We are also keen to hear about other innovations going on in the GSS to incorporate as case studies in the interactive version of the Code of Practice, so please do get in touch if you are doing anything interesting.

I’d like to thank all the producers we spoke to during this review. It was exciting to see the range of activities they were undertaking, and their commitment to improvement was admirable.


Improving and innovating: enhancing the value of statistics and data

Lessons from statisticians producing Children, Education and Skills statistics

Statistics are of value when they support society’s need for information; they should be useful, easy to access, remain relevant, and support understanding of important issues. To help deliver this, producers of statistics should commit to continuously improve their service to users.

I have been part of the team working on the refresh of the Code of Practice for Statistics. There have been various changes within the Code, but without a doubt the area I am most excited to see enhanced is the new Innovation and improvement principle. At the Office for Statistics Regulation we have always expected producers of statistics to adapt so that statistics can better serve the public, but now this expectation is crystallised in the Code.

During conversations about the development of the Code I received several questions about this area, and I sensed some nervousness about how it might be applied; this is understandable with anything new. The new principle is about having a positive mindset towards change and improvement across all statistics production and dissemination processes. However, the practices which sit beneath the principle are not a prescriptive list of what must be done; instead, they should be applied proportionately depending on the statistics in question. As a result, how producers respond to this principle will differ in scale and approach. What matters most is producers’ motivation to improve their statistics.

I was keen to undertake a small project to help producers of statistics get a better handle on what the Innovation and improvement principle meant for them. My colleague Louisa and I both focus on Children, Education and Skills (CES) statistics. This thematic way of working gives us the opportunity to better understand policy and statistics issues, and to develop relationships with a range of users and producers of CES statistics. From our ongoing conversations we were aware of several innovations in this area, such as the relatively well-known work to develop the Longitudinal Education Outcomes data. However, we wanted to find out more about other projects: less well publicised developments or smaller scale projects which nonetheless reflect an ambition to improve the value of the statistics.

We started by asking producers of CES data and statistics across the UK to send us information on the projects they had been working on. We were pleased by the range of responses we received. The projects, whether completed or still in development, varied in scale and covered everything from producing statistics using Reproducible Analytical Pipelines to improving the accessibility of data. It was clear to us that improvement, whether big or small, was embedded in much of producers’ activity, and it was great to hear just how enthusiastic they were about their projects. We also spoke with users to get their feedback on some of the development work, and to find out how they have benefited from the improvements being made. Here is a link to a summary of the innovation and improvement projects producers told us about.

Over the coming weeks we want to share with you some of the common themes that became apparent from talking with producers and users linked to these projects. Firstly, we want to look at the importance of collaborative working when developing statistics, then at the development of alternative statistical outputs, and finally at some of the common challenges producers face when improving their statistics. While these themes come from examples in Children, Education and Skills statistics, they are intended to give all producers of statistics a better sense of what the new Innovation and improvement principle might mean for them, and to highlight elements of good practice we might expect to see when assessing statistics.

As this review is considering innovation in statistics, we ourselves wanted to be more creative in thinking about how we would share our findings. Instead of a more traditional report, we are going to publish our work across a series of web posts. We will also be exploring, with the Government Statistical Service’s Good Practice Team, how else we might support producers undertaking innovation and improvement work.

For now, keep an eye out for our forthcoming posts, and if you want to get in touch about this review, or about CES statistics more generally, please do email.