Improving health and social care statistics: lessons learned from the COVID-19 pandemic

Published:
7 October 2021
Last updated:
7 October 2021

Responsive and proactive

Lesson 3

The pandemic reinforced the need for statistics to inform society about public health and provide an understanding of how public health programmes are working.

Statistics producers should continue to develop outputs which go beyond operational data in order to support policy evaluation and a better understanding of public health.


Official statistics on health and social care are often a by-product of the delivery of services. This means that they often focus on operational delivery, such as waiting times or hospital activity, rather than on emerging population health issues and how these relate to other societal and economic issues.

The pandemic demonstrated that good quality and timely data on operational delivery are essential. They helped governments and the public understand the ability of hospitals and care homes to cope with the increased pressures they were facing. It is also important that the public understand how government money is spent on delivering health services.

But the pandemic also showed that, as a society, we need a good understanding of population health issues – for example, to answer questions about the spread of coronavirus infections or the long-term impact of coronavirus infection on an individual’s health and wellbeing. It is important that producers continue to collect, publish and develop information which goes beyond operational data and supports a broad understanding of public health. This may be done as part of public health surveillance carried out by public health bodies, or through dedicated outputs such as the Office for National Statistics (ONS) COVID-19 Infection Survey and Public Health England’s analyses of vaccine effectiveness. These data contribute to the evidence base required to understand public health, which also includes academic research and other types of government analysis, such as social research.

The pandemic highlighted the importance of evaluating public health programmes. When we reviewed NHS England Test and Trace statistics in July 2020, we were clear that there were key questions which the publication could not answer, such as what proportion of people asked to self-isolate go on to complete self-isolation. We asked the UK Government to find ways to better understand the programme and its impact on the pandemic, noting that such questions were unlikely to be answered solely through management information from the Test and Trace programme. We saw many significant improvements in Test and Trace statistics over the following year, as well as the publication of some information about the effectiveness of the programme based on modelling. We welcome the Department of Health and Social Care’s recent publication on the impact of NHS Test and Trace on transmission.

However, many important questions about the programme have remained unanswered – both as the UK went into the winter 2020 wave of the pandemic, and when we carried out a second review of the statistics prior to changes to self-isolation rules in England in August 2021. These include questions about the vaccination status of those who test positive, and about how many close contacts identified through the tracing system go on to test positive. In future, governments must ensure the early collection and publication of measures which allow public health programmes to be scrutinised and evaluated.

Recommendations

  • Governments must evaluate public health programmes and be transparent in sharing information about them with the public.
  • ONS should consider how the COVID-19 Infection Survey can be adapted to play a role in understanding public health in future. This should be done in collaboration with other producers, including those in devolved administrations, and should consider the sustainability of representative population sampling.
  • For the statistical system to be able to respond quickly to future health and social care issues, producers should collaborate on horizon scanning and share their plans with each other. It is also important that analytical resource and data infrastructures are flexible enough to adapt to new demands.

Lesson 4

The pandemic exposed gaps in available data.

To ensure that statistics best serve the public good, these gaps must now be filled. Statistics producers should be proactive in meeting user needs to minimise gaps in future.


The pandemic exposed and reinforced existing data gaps, for example in social care, ethnicity, and mental health. This made it challenging for producers to meet user needs. For example, prior to the pandemic there was not a clear understanding of the number and personal characteristics of people in care homes. As a result, in the early days of the pandemic, there was a lack of understanding about the impact of COVID-19 on people in care homes, and initial data focused only on deaths in hospitals.

It is positive to see producers working hard to address these gaps. Recently developed statistics on adult social care in England now provide monthly information about vaccinations, infections, testing and the availability of personal protective equipment in care homes. Collaboration between NHS Digital, ONS, NatCen Social Research and academics in England resulted in the large-scale Mental Health of Children and Young People survey. This survey provides a unique insight into the mental health of children and young people during the pandemic. Our recent review of statistics about children and young people found that key statistics about COVID-19 have made this part of the population more visible. However, there are notable gaps in the information available, such as separate analyses for vulnerable children and data on social outcomes which give children themselves a voice. Overall, data gaps have often taken considerable time to address, not least because of the pressures the pandemic placed on producers, and in some cases gaps remain.

It is tragic that it has taken the pandemic to focus attention on issues such as gaps in social care data. However, we are now encouraged to see greater attention to this issue from governments across the UK and strong commitments to making improvements. In Scotland, data on social care are a focus of the Independent Review of Adult Social Care in Scotland and ‘The Future of Social Care and Support in Scotland’ Parliamentary Inquiry. The Chief Statistician in Scotland also recently outlined the ambition to improve the completeness of data on protected characteristics. In Wales, there is a focus on improving the way social care data are used, and we are pleased to see a commitment to create an Equality Data Unit in the Welsh Government’s Programme for 2021 to 2026. There is also the ongoing programme of work by the Race Disparity Unit to improve the quality of ethnicity data on health records in England.

Recommendation

  • Producers must work with users to understand and address existing data gaps, such as those on social care, mental health, and ethnicity and other protected characteristics. The long-term and indirect impacts of the pandemic on society’s health and wellbeing will also need to be understood.

Lesson 5

Data infrastructure impacted the ability of some statistics producers to respond to the demands of the pandemic.

Flexible and joined-up data infrastructures are needed so producers can respond quickly to new data needs.


The speed of response of producers during the pandemic was highly impressive. As we highlighted in our State of the UK Statistical System report earlier this year, producers were quick to set up new data collections and data sharing processes, and to publish new statistical releases. A sense of contributing to a national effort, reduced bureaucracy, and a focus on ‘good enough’ rather than waiting for perfection all helped producers to be responsive and agile.

Where good data infrastructures were already in place, producers found it easier to adapt them and respond quickly. However, new operational systems created in response to the pandemic did not always consider future data and publishing needs – such as the issue of understanding the NHS Test and Trace programme in England discussed in lesson 3 above. We recognise that challenges with new operational systems will sometimes result in delays to publishing data. When this happens, producers should provide a clear explanation to users.

Existing challenges relating to data infrastructure also presented problems for producers – for example, fragmented data owned by multiple bodies, legal barriers, inconsistent formats, legacy software, and non-digital data. These issues resulted in challenges for data sharing and linking, and an over-reliance on burdensome manual processes – though we welcome the fact that some producers have since automated these processes. The UK coronavirus dashboard and the NHS COVID-19 Data Store are two examples of outputs produced using automated pipelines. We encourage producers to build on these developments, automating processes using code where possible to reduce the risk of error and create processes which are sustainable and flexible – as recommended in our report on Reproducible Analytical Pipelines. Producers should continue to consider what analytical code can be published or shared with other organisations to enhance transparency, reduce duplication of effort, and improve consistency – for all analysis, not just that relating to COVID-19.
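The contrast between a manual process and an automated one can be sketched in a few lines of code. This is a minimal, purely illustrative example in the spirit of a reproducible analytical pipeline – the data, column names and function are hypothetical, not any producer’s actual process:

```python
# Minimal sketch of one automated pipeline step: parse a CSV extract and
# produce summary figures. Hypothetical data and column names throughout.
import csv
import io
import statistics

def summarise_cases(csv_text: str) -> dict:
    """Read a small CSV of daily case counts and return summary figures.

    Because the logic is code rather than a manual spreadsheet step,
    re-running it on the same input always yields the same output, and
    the step can be version-controlled, reviewed and shared.
    """
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    counts = [int(row["cases"]) for row in rows]
    return {
        "days": len(counts),
        "total": sum(counts),
        "daily_mean": statistics.mean(counts),
    }

# Illustrative input only
raw = "date,cases\n2021-01-01,120\n2021-01-02,150\n2021-01-03,90\n"
summary = summarise_cases(raw)
print(summary)
```

Publishing a function like this alongside the statistics would let other organisations check the calculation and reuse it, which is the transparency and consistency benefit described above.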

We welcome the focus on improving technical infrastructure and data architecture in the draft data strategy for NHS England. The Department of Health and Social Care should ensure that those working on official statistics are included as key stakeholders in these efforts. Producers will also benefit from collaboration in this area – to learn from each other’s successes, such as the National Clinical Data Store in Scotland and the National Data Resource in Wales.

Recommendations

  • There is a need for improved, system-wide data infrastructures within each country. This will be challenging, but producers should learn from each other’s successes to make progress.
  • Governments must involve statisticians and analysts when setting up new operational systems so that analytical requirements and publishing plans are built in from the start, and so that there is a good understanding of the expectations of the Code of Practice for Statistics.

Lesson 6

Flexible use of analytical resource supported the impressive work by statistics producers.

Sufficient investment in the recruitment and retention of skilled statisticians is required so that statistics remain sustainable and responsive.


The impressive response by producers required a large amount of resource and dedicated effort. Producers were quick to review existing work, deciding what could be paused to allow extra resource to be dedicated to the pandemic response. This included moving analytical staff between departments and organisations through a programme of loans and secondments, often supported by the Analysis Function’s analytical resourcing hub. Directors of Analysis played an important role in quickly redeploying staff to work on the pandemic response. However, in hindsight some producers reflected that this flexible sharing of resource could have happened more extensively, and more quickly. The Analysis Function should consider the lessons from the approach taken during the pandemic and develop a model of resource-sharing which most effectively supports new demands in future.

Many producers feel that the past 18 months have been relentless, with staff working round the clock and through weekends and holidays – this level of work has continued for far longer than anticipated at the start of the pandemic. Maintaining staff wellbeing and morale has been challenging. Many producers feel they are still responding to the pandemic while trying to reinstate previous work which was paused and meet new demands. A significant challenge for some producers is recruiting and retaining people with the right skills. In the past we have heard anecdotally that this was a challenge for the whole statistical system. The increased burden on producers during the pandemic appears to have exacerbated it further. However, relatively little is known or published about new starters, leavers and vacancies in the Government Statistical Service (GSS) – a data gap we highlighted in our Statistical Leadership report earlier this year. It is important to collect and share this information so that the scale of the problem can be assessed and effectively resolved.

Recommendations

  • To respond effectively to new priorities, analytical resource should continue to be shared flexibly across governments and non-government bodies. The Analysis Function should consider a longer-term resourcing approach across departments and professions, and also consider how the model used during the pandemic can be even more agile in future.
  • Producers should continue to review their existing statistical releases. They should decide, based on balancing user needs with resource and the ongoing burden on staff, which statistics should be continued, restarted, or stopped, and how or with what frequency.
  • The GSS People Committee should regularly monitor the membership of the Government Statistician Group, including information about central GSS recruitment and professional membership by organisation. The Analysis Function should regularly collate and publish data on the Analysis Function, including breakdowns by profession to better understand similarities and differences across analytical professions.