Getting the data to support decisions on adult social care

The Coronavirus (COVID-19) pandemic has put the adult social care sector under the spotlight, and the Office for National Statistics has responded to the demand for trustworthy, high-quality insight on the impacts of COVID-19 by providing analysis using new data sources. To further improve data sharing and fill gaps in evidence for this sector, the ONS is taking steps to improve social care statistics. In this guest blog, Sophie John explains more.


Recently the Office for Statistics Regulation (OSR) published papers outlining gaps in evidence in social care across the four nations of the UK. Then the COVID-19 pandemic hit, which had a significant impact on care home residents and recipients of domiciliary care. The impact of the pandemic highlighted, in line with the OSR report, the need for increased information on adult social care as the demands on services continue to rise.

The OSR report also highlighted that, historically, social care has not been measured with the same depth of data and analysis as healthcare due to a scarcity of funding. This is problematic for researchers, academics and policy makers who require sufficient evidence upon which to make informed decisions.

Today, we have taken our first step towards improving accessibility by releasing a new interactive tool where users can easily explore the landscape of adult social care data in one place.

The new interactive tool compiles official statistics relating to adult social care across England, Scotland, Wales and Northern Ireland. Each month it will be updated with new publications for users to browse, including fresh insight on the effect of COVID-19 on the care sector.

Next steps…

While we have met one of the OSR's goals by improving the accessibility of official social care statistics, this is only the first of many steps required to continue improving the evidence around this sector.

The OSR report also highlights the need for improved leadership and collaboration. Our aim is to engage with stakeholders across the four nations, whilst working with the Government Statistical Service Harmonisation team, to help make statistics more comparable, consistent and coherent. Engaging with organisations working in similar areas, we will endeavour to ensure that work is joined up and well informed by other experts.

Further, we are working to identify the gaps in evidence in adult social care data. Areas of interest include investigating data availability on unpaid carers and self-funders to seek to improve knowledge of individual care journeys and outcomes.

Following our releases on Deaths involving COVID-19 in the care sector, England and Wales, we plan to produce a new annual ONS release reporting deaths of care home residents, which will be published later this year. This will help us understand more about the causes of death among care home residents, including their characteristics, to inform policy.

The ONS, working with partners across the sector, can play an important leadership and coordination role in adult social care statistics, and our interactive landscape tool is the first step towards achieving this.

This is a guest blog from Sophie John (Head of Adult Social Care Analysis, ONS)

Adult social care statistics across Great Britain: the power and potential for change

Social care is in crisis across Great Britain. With responsibility for social care being a devolved matter, each country has its own system in place but all share common problems. Given the urgency of the crisis, which is exacerbated by the current Covid-19 pandemic, there is growing interest in what others are doing and what works and what doesn’t. While a fair bit of work has been done looking at the international picture, we know less about what is happening closer to home. What are the other British countries doing, what do they have in common and where do they diverge? What can we learn from each other’s successes and failures, and what solutions can we adopt?

At the Nuffield Trust, we have spent the last couple of months immersing ourselves in the different social care systems across the UK, trying to get underneath their peculiarities and identify common ground and points of learning. But what we found was a frustrating lack of high-quality, comparable data. Delving into the data landscape across the UK countries, we quickly came to realise the limitations of our undertaking and the urgent need for data improvements.

Within this context, we welcome the publication of the review of adult social care data across Great Britain by the Office for Statistics Regulation (OSR). The call for better data and greater collaboration across the countries is timely, and the individual country reports are an invaluable resource for understanding the strengths and weaknesses of social care data within each country and the underlying policy differences.

Why data matters

Lack of funding in social care does not only mean that services are unable to meet demand. Equally important is the under-investment in data and analytics. This is not just a frustrating problem for the small group of social care researchers and data enthusiasts. Poor data matters in very practical terms. It hinders the development of a strong evidence base to solve problems and guide effective policy making aimed at improving care delivery and the quality of life of service users and their families. It means that we cannot understand whether one country's approach has been successful and how it might work in another place.

For instance, during the run-up to the 2019 general election, some proposed the introduction of free personal care in England akin to the system established in Scotland. To assess whether this or other proposals would be a viable or desirable option for England, we need to look at data. It sheds light on the distinct contexts, the nature of the different social care systems and the impact of different policies and initiatives. It allows for comparisons and evaluation of best practice, and it lends concrete evidence to lived experience and anecdotes. It sounds simple, but such a comparison proved far from straightforward.

So what would help?

  1. Ease of access

Easy and straightforward access to relevant data is crucial to aid research, policy making and the public in making informed decisions. However, similar to the data users the OSR spoke to, we struggled to find relevant statistics, often having to navigate different websites and sources, in some cases with data deeply hidden. Of help were portals aiming to collate all available data, such as the National Social Care Data Portal for Wales. Seeing this realised across the countries will help to increase the use and profile of social care statistics.

  2. Comparability

Once data has been accessed, we need clear definitions and improved guidance on what the statistics entail, as well as detailed statistical commentary. Across all countries, data quality issues and missing returns are a problem, and their impact was not always clear.

Another core issue identified by the OSR report, and in our work, is the comparability of data across social care activities, time and regions. Wanting to compare key social care indicators across the countries, we often struggled to understand what exactly is captured. Even comparing headline figures such as the number of people receiving care in each country was far from straightforward. For instance, does access data include signposting to services as opposed to actual delivery of some form of social care activity? Does it include meals on wheels or are those reported separately?

  3. Consistency

In addition, to evaluate the impact of different policies and approaches, we require consistent data collection and reporting over time, but are faced with breaks in time series and changing definitions and processes. As highlighted by the OSR, where possible, data producers in all countries should follow harmonised principles, such as on informal care, which would aid comparability across regions and settings.

  4. Comprehensiveness

Linked to this, the data did not always meet our needs. We would have liked a better understanding of the characteristics of those receiving care, such as more granular age breakdowns, their level and nature of needs, and the type of care and care setting. And we identified many gaps in the data that limited our ability to understand the intricacies of the different systems. Across the board, robust data on informal caregiving and self-funders are lacking, with estimates at best. Likewise, we have little understanding of under-met and unmet need, making it difficult to appreciate how well the different systems meet demand and to plan for the challenges ahead.

The way forward

The publication of the review of adult social care data across GB is very timely and welcome. It is an important advocate for change, calling for improvements to existing statistics, including data coherence, standardisation and quality, filling data gaps, and sharing of best practice. Given the Covid-19 crisis, the need for good data has become more apparent than ever, as has the crucial role of social care in our society. Implementing the report's recommendations will aid researchers in developing solutions, by providing a robust data landscape and facilitating cross-border cooperation and learning. This will ultimately improve care for all citizens across the UK.

We hope that the respective bodies will take the findings to heart, seeking to improve ease of access and data quality to facilitate greater use of high-quality data, standardising processes to ensure comparability across settings and countries, and filling the gaps in the data. Where resources are low and problems are shared, this is best done in collaboration.

This is a guest blog from Laura Schlepper (Research Analyst, Nuffield Trust)

Health and Social Care Statistics

In February, I asked NHS England and NHS Digital to consider publishing more timely statistics on accident and emergency performance. In particular, I asked them to look at the time lag between collection and publication of the statistics.

I’m impressed by their response, which we’ve put on our website today.

They propose to reduce the time lag from 6 weeks to 2 weeks, along with changes to provide more timely data in other important areas, such as ambulance services.

This is great news for users of these statistics, and it fits with the Office for Statistics Regulation’s broader drive to improve the coherence, accessibility and value of health and social care statistics.

Over the past two years I have focused a lot on health and social care statistics. This reflects the underlying value of these statistics. They have the power to help save lives and to help make life more comfortable for those who need it – and of course it has ever been thus: iconic figures in the history of statistics like John Snow and Florence Nightingale crop up frequently (perhaps to the point of cliché…) in speeches and presentations about the power of statistics to change lives.

The Office for Statistics Regulation identified the need for improvements in the accessibility and coherence of, and insight provided by, health and social care statistics in England in 2015. Since then we've proceeded by dialogue: with producers, through a series of round tables, and with users, through our health summit in July 2016.

After our most recent round table, we’re more confident that producers of these statistics are taking ownership of the project to improve health and care statistics. We decided that this work no longer needed us to organise or chair discussions. Producers themselves will carry on this cross-departmental dialogue.

The changes to accident and emergency statistics announced today further reflect progress that producers are making.

My team is continuing to review health and care statistics to ensure user needs are being met and we are also going to be investing time in holding a second ‘Health and Social Care Statistics Summit’ in the late Autumn. If you would like to participate in this or have any thoughts on areas we should be looking to do more work – whether commending progress or seeking improvements – please let me know.

And of course it’s far too early to declare victory. Health and social care statistics, with their power to save and improve lives, will remain a key priority for me. I look forward to and expect many more cases of reform and change.

Health statistics

In the last few weeks, we’ve made three comments on health statistics – one in England, about leaks of accident and emergency data; one in Scotland, on statistics on delayed discharges; and one on analysis at the UK level. They all show the importance of improving the public value of statistics.

On accident and emergency statistics, I wrote to the heads of key NHS bodies in England to express concern about recent leaks of data on performance.

Leaks of management information are the antithesis of what the Office for Statistics Regulation stands for: public confidence in trustworthy, high quality and high value information.

It’s really hard to be confident about the quality of leaked information because it almost always lacks context, description, or any guidance to users. On value, leaked information usually relates to a question of public interest, but it’s not in itself valuable, in the sense it’s not clear how it relates to other information on the same topic. Its separated, isolated nature undermines its value. And it’s hard for leaked information to demonstrate that it is trustworthy, because the anonymous nature of the “producer” of the information (the person who leaked it) means that motives can be ambiguous.

But leaks can highlight areas where there is concern about the public availability of information. And that was the constructive point of my letter: the NHS bodies could look into reducing the risk of leaks. One way of doing this would be to reduce the time lag between the collection of the information on accident and emergency performance, and its publication as official statistics. This lag is currently around 6 weeks – 6 weeks during which the performance information circulates around the health system but is not available publicly. Shorten this lag, I argue, and the risk of disorderly release of information may also reduce.

The comments on Scotland relate to the comparability of statistics across the UK. When NHS Scotland’s Information Services Division published its statistics on delayed discharge from NHS hospitals for February, the Cabinet Secretary for Health and Sport in the Scottish Government noted that these figures compared positively to the equivalent statistics in England.

This is of course an entirely reasonable thing for an elected representative to do – to comment on comparative performance. The problem was that ISD's publication did not provide users with information on how to interpret the Scottish statistics in the UK context – it wasn't clear that the Scotland figures are compiled on a different basis to the England figures, so the comparison is not on a like-for-like basis. The difference wasn't stated alongside the equivalent statistics for England either. This clarification has now been provided by ISD, and NHS England have agreed to make the differences between the figures clearer in their own publication.

For us, it’s really important that there is better comparability of statistics across the UK. While there are differences in health policy that will lead to different metrics and areas of focus, it’s quite clear that there is public interest in looking at some issues – like delayed discharge – across the four UK health systems.

In this situation, good statistics should help people make sound comparisons. Yet, with health and care being a devolved matter, there are some constraints on the comparability of statistics across England, Wales, Scotland and Northern Ireland. And it is difficult for users to know what is or is not comparable – with delayed discharge data as a prime example. This is why we really welcome the recently published comparative work, led by the Scottish Government, where statisticians have created a much more accessible picture of health care quality across the UK, pulling together data on acute care, avoidable hospital admissions, patient safety, and life expectancy/healthy life expectancy across all four UK countries.

Both these cases – the leaks and comparability – illustrate a broader point.

Health statistics in the UK should be much better. They should be more valuable; more coherent; in some cases more timely; and more comparable. If statistics do not allow society to get a clear picture in good time of what is going on, then they are failing to provide public value.