Mark Pont to Ken Roy (Defra), Ingrid Baber (SEPA), Stephanie Howarth (Welsh Government) and Conor McCormack (DAERA): Local Authority Collected Waste Management statistics

Dear all,

Local Authority Collected Waste Management statistics

We have reviewed compliance with the Code of Practice for Statistics of the four sets of waste statistics published by the Department for Environment, Food and Rural Affairs (Defra), the Scottish Environment Protection Agency (SEPA), the Welsh Government and the Northern Ireland Department of Agriculture, Environment and Rural Affairs (DAERA).

We reviewed the trustworthiness, quality and value of the statistics, including the coherence of the data source, methods and quality assurance (QA) arrangements, and the presentation of the statistics.

I am pleased to confirm that the statistics for England, Wales and Northern Ireland can continue to be designated as National Statistics. The statistics for Scotland are not currently designated as National Statistics; they are official statistics. We carried out a more comprehensive review of the Scottish statistics and gathered feedback from a small number of users to support their continued development towards National Statistics status.

The waste statistics are one of the key sets of environment statistics. They provide high-quality information about the volume of waste generated and recycled at the local authority level. As waste and recycling are primarily local issues, this level of granularity is essential. The statistics and data are also important at the national level, as they are used to monitor progress against waste and recycling targets. Issues relating to resources and waste, in particular, plastic pollution, continue to be a focus of public interest and debate on the environment and these waste statistics contribute to public understanding of these issues.

Our key findings and recommendations across all sets of statistics are presented below. Detailed findings by country are presented in Annex A. We encourage you, where relevant, to reflect on the findings for other countries, to learn from their approaches and practices.

We identified several shared strengths:

  • All local authorities submit waste data through the WasteDataFlow (WDF) system. Data are entered and processed in a standardised way, generating a robust data series on household waste for each country, although countries define and categorise waste slightly differently. Most countries have introduced a more flexible question format which allows local authorities to report more accurate and complete information on waste treatment and disposal.
  • All countries maintain strong, constructive relationships with data suppliers (local authorities). For instance, Defra and SEPA hold regular WDF user group meetings, which provide a forum for discussing data quality issues and developments and gathering feedback on the statistics from local authorities. It is good that you each took into consideration the resource pressures that local authorities have faced during the COVID-19 pandemic, supporting them where necessary (for example, by extending data submission deadlines), and that you are adapting your QA checks to ensure that data quality remains high. We welcome this ongoing, proactive engagement with data suppliers, who are also key users of the statistics.
  • In general, the four countries collaborate closely when collecting waste data and producing waste statistics. For instance, all countries are represented on the WDF Operational Group, which meets annually to discuss methodology and processes as well as ad hoc WDF and local authority statistics issues. We also heard how statisticians worked together to develop the harmonised UK household waste measure (‘waste from households’) and are currently collaborating on developing a new waste tracking system (see below).
  • The ‘waste from households’ measure, which is used by UK and Devolved Governments to monitor and report compliance with the EU’s Waste Framework Directive recycling targets, allows users to compare recycling rates between UK countries. Defra presents a summary of this measure in a separate bulletin, UK Statistics on Waste, but some countries also comment on trends in the ‘waste from households’ measure in their bulletins and explain how it differs from country-specific household waste measures.
  • Defra, with the support of governments and regulators in Scotland, Wales and Northern Ireland, is leading the development of a new, innovative electronic waste tracking system. It aims to create a single point of reference for a waste transaction that can be used by all four countries. The new system will likely replace existing waste data collections, including WDF, and is expected to fill known information gaps (such as on what happens to waste when it moves from production to recovery or disposal, and flows between recycling facilities) and improve the coherence of waste definitions across the UK.

We also have some general recommendations and suggestions for improvements across all sets of statistics:

  • The credibility of recycling depends on final overseas destinations. There is still confusion among the public about where the UK’s recycling goes, with a significant quantity of some types of material being shipped abroad for recycling. To support public understanding of recycling, we recommend that waste statisticians in all four countries work together to produce an accompanying “explainer” on how waste is defined as having been recycled and how it counts towards the headline recycling rate. We think it would be helpful if it also summarised the main definitional differences in recycling rates between countries and explained why figures based on the harmonised measure do not always match up across different outputs, for instance, due to data revisions.
  • To enhance transparency around user engagement, and encourage further engagement from users, we recommend publishing a summary of existing and planned user engagement activities. It should explain how you are listening to users and acting on their views to develop the statistics. You may like to consult our review of user engagement in the Defra Group to inform your thinking in this area. We encourage you to share learning and insight from user engagement with waste statisticians in the other countries.
  • COVID-19 has affected many local authorities’ waste collections and waste data submissions. This is expected to have knock-on effects on the statistics, especially the volume of waste generated and recycled. It is important that you understand the effects of COVID-19 on data quality and trends in the statistics and explain these to users.
  • The new accessibility regulations for public sector bodies came into force on 23 September 2020. You should ensure that your outputs, including the statistics bulletins and statistics landing pages, meet these requirements and reflect on any additional steps you could take to make your statistics more accessible to users, in line with the Code of Practice.

Our Agriculture, Energy and Environment team will continue to engage with you in the coming months to follow up on the highlighted areas for improvement. We thank you and your teams for your positive engagement throughout the review process.

I am copying this letter to Alex Clothier, Katherine Merrett and Andrew Woodend (Defra), Peter Ferrett (SEPA), Stuart Neil (Welsh Government), and Siobhan Carey (Northern Ireland Statistics and Research Agency).

Yours sincerely

Mark Pont

Assessment Programme Lead

 

Annex A: Key findings and recommendations by country

Local Authority Collected Waste Management Statistics for England (Defra)

Strengths

  • The team has a good understanding of the main users and uses of the statistics and interacts with users and other stakeholders in a range of ways. It has a strong working relationship with Defra policy colleagues, as evidenced by close collaboration during the response to the pandemic. It engages with industry bodies, waste management consultancies and local authorities through a resources and waste data group, which meets regularly to discuss use of waste data for infrastructure planning. The team also engages with organisations such as the Waste and Resources Action Programme (WRAP) on specific data issues. We welcome that different types of users are involved in the development of the data and statistics.
  • The bulletin is well-structured and engaging, with impartial commentary that explains the main differences and trends in the statistics. Relevant new content continues to be added to the bulletin, for example, on final waste destinations. The visualisations are effective and aid interpretation of the statistics. The maps are particularly informative, illustrating the variation in recycling rate, and how it has changed since the previous year, across England.
  • The methodology summary published alongside the statistics contains a good overview of the WDF system and highlights its main limitations. It explains the nature of changes in methods and their impact on the statistics. Definitions of the three reported measures of household waste are clear and highlighted prominently throughout the bulletin, supporting understanding for non-expert users. We are pleased that the team has been using our Quality Assurance of Administrative Data (QAAD) framework to review QA arrangements and welcome the level of detail published about the QA process.

Areas for improvement

  • To gain further insight into user needs and uses of the statistics, the team could be more proactive in its user engagement. For instance, it could attempt to identify and engage with potential users of the statistics, including academics.
  • To ensure that the information on the statistics landing pages is current and helpful, we recommend refreshing all landing pages. The pages would benefit from more information on what the waste data are and why they are collected. For instance, to highlight their wider relevance and importance, we suggest adding an overview of or links to the key waste policies in England, including the Resources and Waste Strategy and the 25 Year Environment Plan. The Defra air quality and emissions statistics landing page provides a good example of how to do this.
  • To enable users to easily explore the rich dataset and facilitate re-use of the data, the team may like to consider developing an interactive dashboard or data tool like those DAERA or SEPA have developed.
  • The UK Statistics on Waste bulletin highlights the extent to which England is meeting its waste and recycling targets. We think it would be helpful if the local authority collected waste management bulletin also commented on this aspect of the statistics.
  • The waste data collection process in England is more complex than in other UK countries due to the large number of local authorities and the involvement of a contractor (Jacobs). To help users understand the flow of data through the system, the team could add a process map that illustrates the different stages and QA arrangements.
  • We encourage the team to explore the feasibility of developing a metric like SEPA’s carbon metric, which measures the whole-life carbon impact of waste. Such a measure would add insight on waste’s contribution to climate change in England.

Household waste statistics for Scotland (Scottish Environment Protection Agency)

Strengths

  • The team engages effectively with users inside and outside government and employs a range of approaches to understand use and listen to users. It works closely with policy teams in SEPA and analysts in the Scottish Government and Zero Waste Scotland. It also engages regularly with academics and industry bodies through Scotland Waste Data Strategy activities. In 2019, the team conducted a user survey to better understand the users of the waste statistics, their views on the presentation and content of the statistics bulletins, and to identify gaps in their needs. We encourage the team to continue this proactive engagement to ensure that the statistics are relevant and insightful for all types of users.
  • The Waste Data Strategy hub on Scotland’s Environment web, maintained by the statistics team, is an excellent repository of waste-related information in Scotland. It brings together news and updates, documents from user events, guidance, case studies and data. The two interactive data tools on the website aid interpretation of the statistics by allowing users to produce customisable charts and data sets.
  • The users we spoke to told us they valued the statistics bulletins and datasets as they provide fixed points of reference which reduce the risk of misinterpretation. The bulletins give a good overview of the short- and medium-term trends in the statistics. The commentary is impartial and reasons for changes are discussed. Information about waste policies, regulations and targets in the quality report and on the SEPA website is clear and helpful.
  • The carbon metric, which measures the whole-life carbon impact of waste, is innovative, insightful and well-established, giving an indication of waste’s contribution to climate change in Scotland. We look forward to seeing the continued refinement of this valuable metric.
  • The QA process for the data is rigorous and continually improving. The users we spoke to recognise the effort the team invests in maintaining data quality. The team recently conducted a survey to better understand the QA principles, standards and checks undertaken by local authorities, which provided useful insight. It is also good that the team developed an automated data validation tool for local authorities to improve and standardise the level of QA.

Areas for improvement

  • It is important to be transparent about the outcomes of individual user engagement activities. We think it would be helpful to reinforce user engagement by publishing a summary of responses to the 2019 user survey, explaining how the team intends to respond to feedback and improve the statistics.
  • The accessibility of the waste statistics landing pages and other waste pages on the SEPA website needs to be enhanced. These pages are not as user-friendly as they could be for non-expert users and information across pages is often inconsistent or out-of-date. For example, the publication calendar on the household waste data page confuses the ‘date when published’ with the reporting period and the links on the waste statistics regulations page are old or broken. All pages would benefit from a content review and refresh. Also, the users we spoke to told us the team may assume a higher level of user knowledge than is realistic. The team should consider the needs of different types of users when producing the bulletins, data tables and quality report to ensure that the information is accessible to non-expert users.
  • To make users aware of the coherence and comparability of the Scottish statistics with those produced by the other UK countries, we think it would be helpful to report and comment on trends in waste and recycling calculated using the harmonised UK ‘waste from households’ measure. The insight and relevance of the household waste bulletin could be enhanced by commenting on the extent to which the Scottish Government is meeting its waste and recycling targets.
  • The most recent edition of the quality report (covering 2017 data) was published in July 2019, a year and a half after the end of the reporting period. To ensure that quality and methods information is timely and supports interpretation of the statistics at the time they are published, it should be published alongside the statistics. The quality report should also explain how the carbon metric is calculated and describe the strengths and limitations of the approach used.
  • Published information about QA arrangements is limited. We recommend that the team apply our Quality Assurance of Administrative Data (QAAD) framework and publish a summary of its findings to assure users of the comprehensive QA process. It should map the flow of data through the system to help users understand the quality at all stages of the production process. We encourage the team to publish a summary of findings from the QA survey of local authorities as part of this documentation, to highlight the variation in approaches and data quality.
  • The list of recipients with pre-release access (PRA) should be published on the SEPA website and be reviewed regularly.

Local Authority Municipal Waste Management Statistics for Wales (Welsh Government)

Strengths

  • The statistics landing page is user-friendly and explains why waste data are collected and where they come from. All data are published on the StatsWales website, which allows users to produce and download customised data, tables and charts. Metadata, summarising the main limitations and caveats, are published alongside the data tables and charts to help users interpret the statistics.
  • The annual bulletin is well-structured and captures the main trends in the statistics. It highlights the extent to which the Welsh Government is meeting its waste and recycling targets and contains links to related Welsh statistics and waste statistics from the rest of the UK. The information boxes spread throughout the bulletin define the key terms and measures and explain potential inconsistencies in the data.
  • Changes in data quality, such as the improvement in data accuracy which led to a recent revision in the recycling rate, are explained in the quality report. The quality report also highlights potential sources of bias in the data due to, for example, the splitting out of household and non-household waste (which some local authorities collect together).

Areas for improvement

  • Apart from a 2017 user consultation on changes to outputs, we found little evidence of proactive engagement with users, particularly those outside government. To ensure that the statistics are relevant and insightful for all types of users, the team should aim to establish an ongoing dialogue with a range of users and involve them in the development of the statistics.
  • The insight of the bulletin commentary could be enhanced by discussing reasons for changes over time. The bulletin, quality report and statistics landing page would benefit from more-detailed information or links on waste policy in Wales and the rest of the UK, to help users contextualise the statistics.
  • We encourage the team to explore the feasibility of developing a metric like SEPA’s carbon metric, which measures the whole-life carbon impact of waste. Such a measure would add insight on waste’s contribution to climate change in Wales.
  • Three organisations are responsible for the management of waste data and engagement with data suppliers in Wales – the Welsh Government, Natural Resources Wales, and the Waste and Resources Action Programme (WRAP). The roles and responsibilities of each organisation, and how they work together to deliver high-quality data, should be explained.
  • To support user understanding of the data source (WDF) and methods, we think it would be helpful to explain how the statistics are calculated from the data submitted by local authorities. This should cover the question format and how it has changed over time to enable collection of more granular information on end destinations of waste.
  • The quality report contains a basic description of QA arrangements, but the level of detail is not proportionate to the complexity of the data. For example, it does not cover the checks and validation carried out by local authorities. To reassure users about data quality, the team should produce more-thorough documentation that maps the flow of data through the system. It may like to consult DAERA’s Administrative Data Source Quality Report for an example of this.

Northern Ireland Local Authority Collected Municipal Waste Management Statistics (DAERA)

Strengths

  • We welcome that the team has applied the learning from our compliance check of the Northern Ireland June Agricultural Census statistics to other DAERA statistics, including the waste management statistics, to enhance their quality and value. Recent improvements have focused on making the statistics and data more accessible and reusable for a wide range of users through the development of new outputs (such as infographics, an interactive dashboard and a time series dataset), and improving and publishing more-detailed information about QA arrangements (see below). This demonstrates a commitment to continuous improvement.
  • We are pleased to see recent proactive engagement with external users of the statistics, for example, through a workshop in early 2020. This provided valuable information on how the statistics are used and feedback on the presentation of the statistics, which is being used to drive improvements. The team has also promoted the statistics via the DAERA statistics user group newsletter, created to inform users during the pandemic. We encourage the team to continue building its network of external users and involve them in the development of the statistics.
  • The annual bulletin is informative and engaging, providing a coherent overview of waste and recycling in Northern Ireland. It presents estimates of the recycling rate using both the Northern Ireland household waste definition and the harmonised UK measure (‘Waste from Households’) and helpfully compares the recycling rates between UK countries. It contains clear descriptions of relevant policies and detailed and impartial commentary on progress against waste and recycling targets.
  • The published quality information is comprehensive. In addition to the clear descriptions of data sources and methods in the annual bulletin, an Administrative Data Source Quality Report is published which discusses in depth the WDF system and the quality assurance arrangements at all stages of the production process, including the checks and validation carried out by local authorities, the Northern Ireland Environment Agency data control team and the statistics team.

Areas for improvement

  • To enhance the usefulness of the Administrative Data Source Quality Report, a process map illustrating the flow of data through the system could be added. To help users contextualise the data, key quality and methods information, including limitations and caveats, could be added to the interactive dashboard and datasets.
  • We encourage the team to explore the feasibility of developing a metric like SEPA’s carbon metric, which measures the whole-life carbon impact of waste. Such a measure would add insight on waste’s contribution to climate change in Northern Ireland.

Ed Humpherson to Stephanie Howarth: COVID-19 Infection Survey Results Publication Time

Dear Stephanie,

COVID-19 Infection Survey Results Publication Time

Earlier in the year we granted ONS an exemption from the Code of Practice for Statistics’ standard publication time of 9.30am to permit a later release time of noon each Friday for statistics from the COVID-19 infection survey.

I am happy to confirm that the same exemption applies to the statistics for Wales from this survey that you publish.

Yours sincerely

 

Ed Humpherson
Director General for Regulation

Related links:

Ed Humpherson to Eugene Mooney: COVID-19 Infection Survey Results Publication Time

Mark Pont to Glyn Jones: Welsh Index of Multiple Deprivation 2019 statistics

Dear Glyn 

 WELSH INDEX OF MULTIPLE DEPRIVATION 2019 STATISTICS  

I am writing to you following our review of the Welsh Government’s Welsh Index of Multiple Deprivation 2019 (WIMD) statistics against the Code of Practice for Statistics. The statistics were reviewed against the three pillars of the Code: trustworthiness, quality and value.

WIMD statistics are an important tool for identifying the most disadvantaged areas and for supporting decisions about addressing local needs. They are widely used by central and local government and community organisations to target their services. The statistics have been considered as part of a wider review of the indices of deprivation statistics in Great Britain, alongside our compliance checks of the statistics produced by the Scottish Government and the Ministry of Housing, Communities and Local Government (MHCLG).

I am pleased to confirm that these statistics should continue to be designated as National Statistics. We found several positive examples in the way that Welsh Government produces and presents these statistics: 

  • The team has a clear understanding of the uses and users of the statistics. Welsh Government ran a user consultation and timing survey ahead of WIMD 2019 and published its planned developments in response, which generates trust in the statistics. The frequency of WIMD is tailored to user need: the team told us that it can be a burden on local authorities and third sector organisations who use WIMD in their own analyses if WIMD is updated too regularly. Any methodological changes to the construction of WIMD between iterations are informed by an advisory group and other domain experts, with which the team said it has good relationships. This further enhances the trustworthiness of the statistics.
  • The statistical bulletin is engaging, clear and easy to follow. The pen pictures of areas in Wales are useful in drawing out insight from the statistics and highlighting their relevance to users. The technical report is comprehensive and sets out the process for constructing the WIMD in a clear and accessible way. The guidance document has been tailored to less-technical users and effectively summarises the key information from the technical report.
  • The bulletin contains analysis of deep-rooted deprivation, which looks at areas that have remained in the top 50 most deprived for all WIMD iterations in the past 15 years. The team told us that it took inspiration from the Scottish Government in adopting this concept. The analysis of deep-rooted deprivation draws out valuable insight from the statistics and addresses the limitation of not being able to compare small areas and ranks between iterations of WIMD.
  • The team spoke highly of its relationship with the other nations. The ‘four nations group’ meets regularly and works collaboratively to make guidance and presentation across the deprivation statistics more consistent. Welsh Government has been working with MHCLG, which extended its coverage for income and employment data to cover Wales in 2019, to develop new comparable analysis for these two domains.  

We have identified one area where we consider that improvements could further enhance the public value of the statistics:  

  • The ability to combine and compare indices of deprivation across the devolved nations continues to be an area of interest for some users. Each of the producers we spoke to said they deal with queries relating to this on a regular basis, despite their joint effort to set out in the statistical releases how the statistics can and can’t be used. Welsh Government, as part of the ‘four nations group’, should look to ensure that appropriate resource is devoted to developing updated UK-wide guidance and insight. 

Our Labour Market and Welfare team will continue to engage with you and your team in the coming months to follow up on the area that has been highlighted for improvement. We would like to thank the team for its engagement and cooperation throughout the review process.  

I am copying this letter to the responsible team: Nia Jones, Sue Leake and Samantha Collins.    

Yours sincerely 

 

Mark Pont 

Assessment Programme Lead  

 

Related Links

Mark Pont to Siobhan Carey: Northern Ireland Multiple Deprivation Measure

Mark Pont to Sandra Tudor: English Indices of Deprivation 2019 statistics

Mark Pont to Roger Halliday: Scottish Index of Multiple Deprivation 2020 statistics

 

Mark Pont to Glyn Jones: Compliance Check of the National Rough Sleeper Count in Wales

Dear Glyn

THE NATIONAL ROUGH SLEEPER COUNT IN WALES

We have recently conducted our review of the compliance of Welsh Government’s (WG) Rough Sleeper Count official statistics against the Code of Practice for Statistics.

While these statistics are not National Statistics, they are important official statistics valued by users and so we have made a number of recommendations to support your continued development of these statistics. We considered the Trustworthiness, Quality and Value of these statistics in relation to the Code and have appreciated the positive and constructive way that the team has engaged with us during our review, especially at this particularly challenging time.

We welcome the news that new management information is being collected on the numbers of homeless people and rough sleepers in Wales assisted into emergency accommodation since the start of the COVID-19 pandemic. We would like to recognise the positive steps being taken by your statisticians, working with other Welsh Government officials, to determine how this management information can best be used to complement the existing statistics. The plans for how this may develop over the coming months demonstrate real innovation in this area. We commend your team on the timeliness and transparency shown through the recent release of this management information, which provides further insight into the homelessness landscape, and we feel that this sets a good example of the processes needed to release, accessibly and promptly, data that is used publicly in ministerial statements.

Within the National Rough Sleeper count we found a range of positive features that demonstrate the trustworthiness, quality and value of the statistics:

  • Providing upfront guidance about the limitations of the single-night count, including the range of factors that can influence the accuracy of the information and the steps taken to reduce their impact;
  • Having strong internal quality assurance processes within the team, some of which are documented within the release, gaining insights from policy colleagues during the quality assurance process, as well as having built-in validation checks within the data collection form itself;
  • Providing clarity and insight through good use of maps, tables and charts in the release to present the data, as well as providing some context around the data;
  • The team’s active engagement with topic experts to review and further develop the methods for measuring the rough sleeping population in Wales.

We identified some areas for improvement that would enhance the quality and value of the statistics:

  • Having greater oversight of the data collection methods used and information collected by Local Authorities (LAs), as well as seeking further assurances around how LAs ensure the accuracy of their data returns, will help enhance the understanding of the comparability of the data between the different areas and over time;
  • It is unclear what level of assurance is given on the quality of the data sources. To assure users of the quality of the data, information should be provided on the data sources and how their quality was assessed. This will also help to further demonstrate your application of our required standards for the quality assurance of administrative data;
  • As the statistics are developed further, we welcome your ambitions to explore collecting case-level data and recommend exploring the collection of demographic characteristics of the rough sleeper population to add to the value for users and those working in WG policy areas. We are aware that the Welsh Local Government Association (WLGA) publishes its own analysis of the information collected during the two-week national rough sleeper count and so we would suggest considering building links with organisations such as WLGA with an interest in this area, to help deliver better insights into the rough sleeper landscape;
  • We recommend engaging with the Government Statistical Service (GSS) Good Practice Team to explore the potential for further improvements to these statistics, particularly in light of the recent improvements that the team supported the Ministry of Housing, Communities and Local Government’s (MHCLG) analysts in making to their Rough Sleeping Snapshot in England statistics, including innovative new ways of releasing these data.

Thank you for engaging effectively with us during this review. We welcome the updates on your continued collaborative work as part of the Cross-Government Homelessness Statistics Working Group. With management information now being collected across the UK on the numbers of rough sleepers and homeless helped into emergency accommodation since the start of the pandemic, alongside other administrative sources, we look forward to seeing GSS statisticians work together to more fully illustrate the complexity of the overall UK rough sleeping and homelessness picture.

Our Housing, Planning and Local Services Domain Lead will continue to engage with your team on progress in the coming months and we would welcome a progress update from you upon the next publication of these statistics.

Please do not hesitate to get in touch if you would like to discuss any aspects of this letter further.

I am copying this letter to Sue Leake (Head of Education and Public Services Statistics); Luned Jones from the Housing Statistics team; and Lee Thomas from the Data Collection team.

 

Yours sincerely

Mark Pont

Assessment Programme Lead

Ed Humpherson to Glyn Jones: Monthly Indicators from the National Survey for Wales

Dear Glyn

Monthly Indicators from the National Survey for Wales

I am writing to endorse the approach you have taken to develop the National Survey for Wales given the current COVID-19 situation. The primary purpose of the survey is to provide the views of adults in Wales on a wide range of issues affecting them, which is key to monitoring changes and making decisions. Your team has responded effectively to the COVID-19 pandemic by adapting the survey to produce monthly results using a different data collection method. I would like to congratulate everyone involved for their work to produce these valuable statistics in challenging circumstances and to confirm that these statistics can be labelled as National Statistics as part of the National Survey suite of National Statistics outputs.

My team has conducted a rapid regulatory review of the published information. We have reviewed the extent to which they have been produced in accordance with the Code of Practice’s Trustworthiness, Quality and Value pillars, while taking account of the pressures you and your teams have faced to deliver timely statistics about an important topic. A summary of our findings is set out below and more detailed feedback has been provided to your team.

Value

  • The primary purpose of the statistics is to provide views of adults in Wales on a wide range of issues affecting them. The statistics allow changes to be monitored and meet policy needs.
  • Your team worked quickly to implement a different data collection method (from face-to-face to telephone) to produce these statistics.
  • The survey questionnaire has been adapted to include new questions on coronavirus across the different topic areas: shopping habits; mortgage payments; getting GP appointments; employment; hours worked; and childcare impacts.
  • You informed users of the change to the statistics through publishing an update on your website; users would benefit from this update forming part of the National Survey landing page to ensure it has a high level of visibility and the change is well understood.
  • We welcome the ongoing dialogue the survey team has had with the sponsored bodies that co-fund the survey (Sport Wales, Arts Council of Wales and Natural Resources Wales) and policy teams within Welsh Government to ensure that the survey results could feed into making important decisions. Your plans to extend this engagement will help develop the survey over the coming months, and ensure appropriate insights are delivered to a wider range of users.
  • The initial statistical report is clear, informative and easy for users to understand; we look forward to it evolving over the coming months as topics change and user needs are more fully understood.

Quality

  • The new questions have been developed more quickly than usual. Welsh Government experts in questionnaire design and testing are leading the development of the survey questions to overcome the fact that questions can’t be tested as extensively as would usually happen. Referring to other research – such as the COVID-related questions on the Office for National Statistics’ (ONS) opinions survey – is also helpful to build in quality. The further cognitive testing that you have planned will in time offer further reassurances to users about quality.
  • The survey contractor (ONS) has carried out a set of quality checks on the data. Alongside your own checks, this means the statistics are produced to a level of quality that meets users’ needs.
  • It is exceptional that the survey has achieved a high response rate in its first month. We consider that your plan to reduce the contacted sample for June and July is sensible, and it is good that you have committed to monitor the response rate over time in order to adjust the sampling arrangements as needed during the constantly changing situation.
  • The quality report is thorough about methods, including explanations of strengths and limitations, confidence intervals and accuracy of the data that aid interpretation of the statistics.

Trustworthiness

We would like to thank the team for engaging with us during this review and we look forward to seeing these statistics develop as more data are collected on the impact of COVID-19.

Yours sincerely

Ed Humpherson

Director General for Regulation

Ed Humpherson to Glyn Jones: Provision in local authority settings during the Coronavirus (COVID-19) pandemic

Dear Glyn

Provision in local authority settings during the Coronavirus (COVID-19) pandemic

I am writing to endorse the new official statistics Welsh Government is about to publish. I would like to congratulate everyone involved for their work to produce these timely and valuable statistics in challenging circumstances.

As discussed, I am also happy to confirm the exemption from the Code of Practice for Statistics’ standard publication time of 9.30am to permit a later release time of noon each Monday for these statistics. This will enable the statistics to be as timely as possible under the circumstances, and I welcome your endeavours to publish such timely information.

My team has conducted a rapid regulatory review of these statistics. We have reviewed the extent to which they have been produced in accordance with the Code of Practice’s Trustworthiness, Quality and Value pillars, while taking account of the pressures you and your teams have faced to deliver timely statistics about a rapidly evolving situation. A summary of our findings is set out below. More detailed feedback has been provided to your team.

Value

  • We welcome the rapid development and publication of these statistics so they can support understanding of the impact of COVID-19 on schools and pupils. We note that this is an evolving release and acknowledge your commitment to keep the statistics under review as the demand for new insights changes over coming weeks and months.
  • We support your decision to bring forward publication of headline figures from the January 2020 School Census to enable the numbers of pupils and staff to be put in context, and your assurances that you have been able to deliver robust headline estimates to a compressed timetable. We will keep the National Statistics designation of these statistics under review while the impact on users of the changes to the level of insight provided, and of the restricted quality assurance, becomes better understood.

Quality

  • Using the expertise of Data Cymru to quickly set up a system for collecting daily data from local authorities has put the data collection on a firm footing. The robust mechanisms that you have established for assuring the quality of the data, along with clear, prominent statements within the statistical bulletin about the source of the data and the quality assurance that has taken place, provide appropriate assurances about quality.

Trustworthiness

  • The decision to publish these statistics as a full statistical bulletin and your efforts to preannounce as soon as possible both offer transparency to users.
  • You have reprioritised resources quickly to enable people to work on this. It has required a considerable effort from many different teams, which is to be commended.

We look forward to seeing these statistics develop as circumstances change. As set out in the guidance on changes to statistical outputs you can include a statement in your release such as “These statistics have been produced quickly in response to developing world events. The Office for Statistics Regulation, on behalf of the UK Statistics Authority, has reviewed them against several key aspects of the Code of Practice for Statistics and regards them as consistent with the Code’s pillars of Trustworthiness, Quality and Value.”

I am copying this to Steve Hughes, head of schools statistics at the Welsh Government.

Yours sincerely

Ed Humpherson

Director General for Regulation

Response from Ed Humpherson to Glyn Jones, Welsh Government, on school expenditure

Dear Glyn,

As agreed, I am responding on behalf of Sir David Norgrove following the letter he received from Kirsty Williams AC/AM regarding use of estimates from NASUWT of the per pupil funding gap between England and Wales in debate within the Welsh Assembly.

We are aware that since 2012 the Welsh Government has not produced official statistics on the funding gap between England and Wales. You have explained this is largely due to the added complexity resulting from the roll-out of academies in England. However, you have worked with the Institute for Fiscal Studies (IFS) to develop its estimates of the funding gap.

We have considered the analysis carried out by the IFS and the way it has been reported in your official statistics. In your Local Authority Budgeted Expenditure on Schools: 2019-20 publication you comment on the IFS report, stating:

Whilst there still may be some issues in the comparability of the data at a detailed level, we worked with the IFS researchers to consider their methods and believe it is the most robust comparison that currently exists of trends in recent years.

We agree with your conclusion and found that the IFS analysis is clear on how the figures have been calculated.

We have been unable to comment on the approach taken by NASUWT. However, we are aware that you are working with NASUWT as it looks to revise its calculations. We encourage all organisations to consider the principles set out in the Code of Practice for Statistics and would therefore encourage NASUWT to publish details of its approach to producing estimates and to consider voluntarily applying the Code of Practice for Statistics to its revised calculations.

We consider it important that figures used in public debate are based on data which have transparent sources and can be verified. This is not currently the case for the NASUWT estimates.

Yours sincerely

Ed Humpherson

Director General for Regulation

Related links:

Letter from Sir David Norgrove to Kirsty Williams AM

Letter from Kirsty Williams AM to Sir David Norgrove

Temporary exemption from Code of Practice for Welsh Government market sensitive publications

Dear Glyn,

Thank you for your letter of 26 March 2020. I am happy to confirm the exemption from the Code of Practice for Statistics to permit an earlier release time for the Welsh Government’s market sensitive publications.

Your decision is sensible and proportionate in the circumstances.

Yours sincerely
Ed Humpherson
Director General for Regulation

Related Links:

Welsh Government to Ed Humpherson (March 2020)

Ed Humpherson to Scottish Government (March 2020)

Ed Humpherson to NISRA (March 2020)

Welsh Government request for a temporary change to the timing of key Welsh economic statistics

Dear Ed,

You will be aware of developments at the ONS in respect of their plans to publish market sensitive statistics at 7.00am.

Historically, the Welsh Government has co-ordinated the publication of our labour market statistics with the corresponding release from the ONS and, if you are content, it would be our intention for that to continue.

Accordingly, I would be grateful if you could indicate your support for the Welsh Government moving the release time for our labour market publication Key Economic Statistics to 7.00am in line with contingency arrangements being implemented by the ONS. I appreciate that this represents a departure from the requirements of the Code, but consider it prudent as the releases complement one another and it would help ensure equal access for all.

We will of course keep this interim arrangement under review, in conjunction with colleagues at the ONS and in the other devolved authorities, and involve you as appropriate in any further developments.

Regards,

Glyn Jones

Chief Statistician

Ed Humpherson to Welsh Government (March 2020)

Scottish Government to Ed Humpherson (March 2020)

NISRA to Ed Humpherson (March 2020)

Devolved Labour Market Compliance Check

Dear all

STATISTICS ON THE LABOUR MARKET

I am writing to you following our recent review of the key Labour Market statistical reports for the devolved nations against the Code of Practice for Statistics. We reviewed the statistics published by the Welsh Government, the Scottish Government and the Northern Ireland Statistics and Research Agency (NISRA).

The statistics have been considered as part of a wider review of labour market statistics, along with our assessment of UK employment and jobs statistics produced by the Office for National Statistics (ONS). I am pleased to confirm that all three labour market reports should continue to be designated as National Statistics.

Labour market statistics are key economic indicators which are used by a wide range of users and are subject to high user interest. This review focussed mainly on the quality and public value of the data, statistics and supporting information. We recognise that the outputs we have reviewed differ between the three countries in terms of purpose, each team’s access to the underlying data and the time window available to produce them. These factors have been considered as part of our recommendations.

In reviewing the labour market reports, we found examples of clear supporting methodology information, effective sign-posting and presentation of uncertainty, which we detail separately for each country later in this letter. We have identified some common areas for improvement across the three producer teams, which also correspond to areas for improvement we have highlighted in our assessment report of ONS’s employment and jobs statistics. The recommendations in this letter build on those we have made to ONS and we encourage all four countries to continue to work together to ensure that labour market statistics across the UK continue to provide the necessary insights. To improve the quality and public value of these statistics, the teams should:

  • Consider how the statistics can be better presented to help improve users’ understanding of how the labour market is changing over time. We found some examples where the key labour market measures are defined but the relationship between these groups of people (for example, the unemployed and the economically inactive) could be more clearly explained. We encourage the three producer teams to also work with ONS to develop a way to understand the flows of people into, out of and within the labour market.
  • Build on existing collaboration between all the producer teams, including ONS, to enhance the coherence of labour market statistics. We found strong evidence of effective cross-producer collaboration through regular meetings and steering groups. However, discrepancies between the Labour Force Survey (LFS) and the Annual Population Survey (APS) data currently present issues with coherence of data sources. Greater collaboration could support a consistent approach in presenting data from the LFS or APS respectively and in turn, lead to a better read-across between the different countries’ statistics. This will require leadership and coordination from ONS and is highlighted in our assessment report of employment and jobs statistics (para 2.5). It could also prove an effective part of finding a solution to address the concerns raised by Scottish Government and Welsh Government about the future funding for APS which is explained in the ONS assessment report (para 1.9).

Welsh Government

  • Key Economic Statistics is well presented and the narrative provides relevant context to the statistics. The section on ‘Key quality information’ is appropriately detailed and provides useful information on data sources and methods. The bulletin includes links to supporting documentation and StatsWales data tables throughout. This could be further improved by signposting relevant sections of the ‘Key quality information’ from within the main body of the bulletin to aid understanding.
  • The statistics team has presented confidence intervals for the LFS estimates which provides some context for the level of uncertainty associated with the data. However, these are relatively inaccessible, and the language used in the narrative presents the latest figures as absolute, for example “The employment rate in Wales was x%”. This is particularly important when comparing data across the four countries, where estimated differences are not always statistically significant. Welsh Government should improve the way uncertainty is reflected in the narrative, following the lead of ONS as recommended in our assessment report, for example referring to the latest figures as estimates.
  • We were pleased to hear from the statistics team about its plans to potentially introduce a new bulletin covering protected characteristics in the labour market, which is an area of interest identified in its 2012 user consultation. We encourage Welsh Government to keep published statistical development plans up to date and to ensure users are aware of progress being made against these developments.
  • The statistics team told us that some of the main users of Key Economic Statistics go straight to the data tables, rather than the bulletin, to find the information they require. We encourage Welsh Government to find out how its users engage with the various statistical outputs to ensure they remain relevant to users.

Scottish Government

  • The Labour Market Trends bulletin is easy to follow and we welcome the improvements that have been made to the presentation of chart headings and footnotes. The bulletin signposts to the new quarterly youth APS publication, which was previously included in the monthly LFS bulletin, as well as a number of ONS pages relating to the LFS. To improve clarity of the statistics, Scottish Government should look to expand on the methodology information within the bulletin itself.
  • The statistics team told us that the process for producing the monthly bulletin has largely been automated to ensure the statistics can be published at the same time as the ONS release. As a result, the narrative in the bulletin focuses on the latest figures and the change on the previous quarter or year. We would encourage the statistics team to consider how to bring out more insight from the statistics to improve their public value.
  • We welcome the work Scottish Government and ONS are doing to ensure uncertainty is properly reflected in the bulletin, as part of our recommendation in the assessment of employment and jobs statistics, to help users understand the precision of estimates. For example, the statistics team should avoid presenting figures as absolute in the headline infographic, such as “x% of people aged 16 to 64 were in employment”, and instead refer to the latest figures as estimates.
  • We were pleased to hear from the statistics team about its ongoing user engagement and its plans for developing alternative products for accessing the data to complement the ScotGov open data platform and to meet a range of users’ accessibility needs. The statistics team also told us that its economic statistics development plan is being updated to cover a wider range of economic statistics than in previous years. We would encourage Scottish Government to increase the visibility of its developments by publishing updates and outcomes of user engagement, to highlight the good work it is doing in this area and to keep users informed of its plans and progress.

Northern Ireland Statistics and Research Agency

  • NISRA’s Northern Ireland Labour Market report is engaging and comprehensive. The narrative is proportionate to the statistics and the ‘Further information’ section of the report is thorough in addressing the strengths, limitations and comparability of the data.
  • The presentation of uncertainty in the bulletin and supporting materials is effective, for example including statistical significance and confidence intervals of estimates, and we are pleased to hear the report is being used as a case study for presenting uncertainty by the Government Statistical Service’s Good Practice Team. To improve this further, NISRA should ensure comparisons between Northern Ireland and the UK also take into account the level of uncertainty for the estimates.
  • The team carried out a user consultation of labour market statistics in 2019 and has published its planned developments in response. We would encourage NISRA to seek feedback on its progress against the developments and continue to collate feedback on its various statistical outputs.

We appreciate each of the teams’ willingness to engage with us in this review as well as the wider assessment process with ONS. We wish to thank them for taking on board our recommendations. Our labour market and welfare domain team will continue to engage with your teams over the coming months to discuss progress.

I am copying this letter to Melanie Brown (Welsh Government), Gayle Mackie (Scottish Government) and Cathryn Blair (NISRA), the lead statisticians.

Yours sincerely
Mark Pont
Assessment Programme Lead

Related Links:

Assessment Report: UK employment and jobs statistics (March 2020)

Assessment of the UK employment and jobs statistics (March 2020)