Mark Pont to Roger Halliday: Scottish House Condition Survey Statistics

Dear Roger

Scottish House Condition Survey Statistics

As you are aware, we recently completed our review of the compliance of the Scottish House Condition Survey (SHCS) statistics against the Code of Practice for Statistics. I am pleased to confirm that these statistics should continue to be designated as National Statistics.

We initiated this review following the public commitment we made in our 2020/21 Regulatory Work Programme to focus on statistics about key issues within Housing. We appreciate the positive and constructive way that the team engaged with us during the review, especially as we continue through these challenging times.

We found a range of positive features that demonstrate the trustworthiness, quality, and value of the statistics:

  • The Key Findings summary drawing users’ attention to the main messages in a clear and insightful way, along with comprehensive methodology notes which are updated annually and contain accessible information on the methods used and quality assurance approach.
  • Establishing a new process to publish ad hoc data requests and analyses, making SHCS data accessible to all users in an open and transparent way.
  • The survey data being published in Open Data format, and the team’s plans to make the survey data accessible for wider re-use and further analysis via the UK Data Service next year.
  • Future plans to develop new ad-hoc analyses based on deeper dives of the SHCS data on topics of particular user interest.
  • Your team’s regular engagement with users of the SHCS statistics through a range of means, including the annual user days for the Scottish Household Survey (of which the SHCS is part) and communication with users through the ScotStat network.
  • Regular engagement with the survey contractors, Ipsos MORI and the Building Research Establishment (BRE) throughout the year, including attending the interviewer and surveyor training, resulting in good working relationships and the ability to deal with any queries promptly.

We identified some areas for improvement that would enhance the quality and value of the statistics:

  • The ability to compare housing conditions across the different UK nations continues to be an area of interest for users. To assist users with doing this appropriately, we suggest that you provide signposting to other countries’ housing conditions data and relevant analyses within the SHCS statistics, and contribute to cross-UK work to highlight the extent of the comparability of these sources with the SHCS and where they differ.
  • In order to support transparency regarding planned developments of the SHCS statistics, the team should publish details about its future development plans for the statistics and its overall approach to user engagement, so that users are clear about the available channels for them to feed in their views on such developments.
  • While the SHCS statistics are based primarily on survey sources, some administrative data are used for comparison purposes and as a small element of SHCS fuel poverty measurement. Where these administrative data sources are used, including those badged as National Statistics, we expect producers to be assured of the quality of these data and their suitability for the intended use, and to communicate this assurance to users. Our administrative data quality assurance guidance provides a framework to help producers do this.

The suspension of all face-to-face surveys due to COVID-19 has created uncertainty around the future of SHCS data collection for both the social and physical surveys. We discussed this in detail with your statisticians and heard about their potential plans in this area, including learning from, and sharing best practice with, other UK and Republic of Ireland statisticians through the Five Nations House Conditions Surveys meetings. We recommend that the team publish its plans soon so that users are informed of any future changes as soon as possible. Given the uncertainty and changing nature of events, we welcome that the team has agreed to keep in contact with us as these plans progress.

Please do not hesitate to get in touch if you would like to discuss any aspects of this letter further. I am copying this letter to Ailie Clarkson, Lead Statistician, and Claire Wood and Rucha Amin in the statistics team, and Adam Krawczyk, Head of Housing, Homelessness & Regeneration Analysis at the Scottish Government.

Yours sincerely

 

Mark Pont

Assessment Programme Lead

Mark Pont to Ken Roy (Defra), Ingrid Baber (SEPA), Stephanie Howarth (Welsh Government) and Conor McCormack (DAERA): Local Authority Collected Waste Management statistics

Dear all,

Local Authority Collected Waste Management statistics

We have reviewed compliance with the Code of Practice for Statistics of the four sets of waste statistics published by the Department for Environment, Food and Rural Affairs (Defra), the Scottish Environment Protection Agency (SEPA), the Welsh Government and the Northern Ireland Department of Agriculture, Environment and Rural Affairs (DAERA).

We reviewed the trustworthiness, quality and value of the statistics, including the coherence of the data source, methods and quality assurance (QA) arrangements, and the presentation of the statistics.

I am pleased to confirm that the statistics for England, Wales and Northern Ireland can continue to be designated as National Statistics. The statistics for Scotland are not currently designated as National Statistics; they are official statistics. We carried out a more comprehensive review of the Scottish statistics and gathered feedback from a small number of users to support their continued development towards National Statistics status.

The waste statistics are one of the key sets of environment statistics. They provide high-quality information about the volume of waste generated and recycled at the local authority level. As waste and recycling are primarily local issues, this level of granularity is essential. The statistics and data are also important at the national level, as they are used to monitor progress against waste and recycling targets. Issues relating to resources and waste, in particular, plastic pollution, continue to be a focus of public interest and debate on the environment and these waste statistics contribute to public understanding of these issues.

Our key findings and recommendations across all sets of statistics are presented below. Detailed findings by country are presented in Annex A. We encourage you, where relevant, to reflect on the findings for other countries, to learn from their approaches and practices.

We identified several shared strengths:

  • All local authorities submit waste data through the WasteDataFlow (WDF) system. Data are entered and processed in a standardised way, generating a robust data series on household waste for each country, although countries define and categorise waste slightly differently. Most countries have introduced a more flexible question format which allows local authorities to report more accurate and complete information on waste treatment and disposal.
  • All countries maintain strong, constructive relationships with data suppliers (local authorities). For instance, Defra and SEPA hold regular WDF user group meetings, which provide a forum for discussing data quality issues and developments and gathering feedback on the statistics from local authorities. It is good that you each took into consideration the resource pressures that local authorities have faced during the COVID-19 pandemic, supporting them where necessary (for example, by extending data submission deadlines), and that you are adapting your QA checks to ensure that data quality remains high. We welcome this ongoing, proactive engagement with data suppliers, who are also key users of the statistics.
  • In general, the four countries collaborate closely when collecting waste data and producing waste statistics. For instance, all countries are represented on the WDF Operational Group, which meets annually to discuss methodology and processes as well as ad hoc WDF and local authority statistics issues. We also heard how statisticians worked together to develop the harmonised UK household waste measure (‘waste from households’) and are currently collaborating on developing a new waste tracking system (see below).
  • The ‘waste from households’ measure, which is used by UK and Devolved Governments to monitor and report compliance with the EU’s Waste Framework Directive recycling targets, allows users to compare recycling rates between UK countries. Defra presents a summary of this measure in a separate bulletin, UK Statistics on Waste, but some countries also comment on trends in the ‘waste from households’ measure in their bulletins and explain how it differs from country-specific household waste measures.
  • Defra, with the support of governments and regulators in Scotland, Wales and Northern Ireland, is leading the development of a new, innovative electronic waste tracking system. It aims to create a single point of reference for a waste transaction that can be used by all four countries. The new system will likely replace existing waste data collections, including WDF, and is expected to fill known information gaps (such as on what happens to waste when it moves from production to recovery or disposal, and flows between recycling facilities) and improve the coherence of waste definitions across the UK.

We also have some general recommendations and suggestions for improvements across all sets of statistics:

  • The credibility of recycling is dependent on final overseas destinations. There is still confusion among the public about where the UK’s recycling goes, with a significant quantity of some types of material being shipped abroad for recycling. To support public understanding of recycling, we recommend that waste statisticians in all four countries work together to produce an accompanying “explainer” on how waste is defined as having been recycled and how it counts towards the headline recycling rate. We think it would be helpful if this explainer also summarised the main definitional differences in recycling rates between countries and explained why figures based on the harmonised measure do not always match up across different outputs, for instance, due to data revisions.
  • To enhance transparency around user engagement, and encourage further engagement from users, we recommend publishing a summary of existing and planned user engagement activities. It should explain how you are listening to users and acting on their views to develop the statistics. You may like to consult our review of user engagement in the Defra Group to inform your thinking in this area. We encourage you to share learning and insight from user engagement with waste statisticians in the other countries.
  • COVID-19 has affected many local authorities’ waste collections and waste data submissions. This is expected to have knock-on effects on the statistics, especially the volume of waste generated and recycled. It is important that you understand the effects of COVID-19 on data quality and trends in the statistics and explain these to users.
  • The new accessibility regulations for public sector bodies came into force on 23 September 2020. You should ensure that your outputs, including the statistics bulletins, and statistics landing pages meet these requirements and reflect on any additional steps you could take to make your statistics more accessible to users, in line with the Code of Practice.

Our Agriculture, Energy and Environment team will continue to engage with you in the coming months to follow up on the highlighted areas for improvement. We thank you and your teams for your positive engagement throughout the review process.

I am copying this letter to Alex Clothier, Katherine Merrett and Andrew Woodend (Defra), Peter Ferrett (SEPA), Stuart Neil (Welsh Government), and Siobhan Carey (Northern Ireland Statistics and Research Agency).

Yours sincerely

Mark Pont

Assessment Programme Lead

 

Annex A: Key findings and recommendations by country

Local Authority Collected Waste Management Statistics for England (Defra)

Strengths

  • The team has a good understanding of the main users and uses of the statistics and interacts with users and other stakeholders in a range of ways. It has a strong working relationship with Defra policy colleagues, as evidenced by close collaboration during the response to the pandemic. It engages with industry bodies, waste management consultancies and local authorities through a resources and waste data group, which meets regularly to discuss use of waste data for infrastructure planning. The team also engages with organisations such as the Waste and Resources Action Programme (WRAP) on specific data issues. We welcome that different types of users are involved in the development of the data and statistics.
  • The bulletin is well-structured and engaging, with impartial commentary that explains the main differences and trends in the statistics. Relevant new content continues to be added to the bulletin, for example, on final waste destinations. The visualisations are effective and aid interpretation of the statistics. The maps are particularly informative, illustrating the variation in recycling rate, and how it has changed since the previous year, across England.
  • The methodology summary published alongside the statistics contains a good overview of the WDF system and highlights its main limitations. It explains the nature of changes in methods and their impact on the statistics. Definitions of the three reported measures of household waste are clear and highlighted prominently throughout the bulletin, supporting understanding for non-expert users. We are pleased that the team has been using our Quality Assurance of Administrative Data (QAAD) framework to review QA arrangements and welcome the level of detail published about the QA process.

Areas for improvement

  • To gain further insight into user needs and uses of the statistics, the team could be more proactive in its user engagement. For instance, it could attempt to identify and engage with potential users of the statistics, including academics.
  • To ensure that the information on the statistics landing pages is current and helpful, we recommend refreshing all landing pages. The pages would benefit from more information on what the waste data are and why they are collected. For instance, to highlight their wider relevance and importance, we suggest adding an overview of or links to the key waste policies in England, including the Resources and Waste Strategy and the 25 Year Environment Plan. The Defra air quality and emissions statistics landing page provides a good example of how to do this.
  • To enable users to easily explore the rich dataset and facilitate re-use of the data, the team may like to consider developing an interactive dashboard or data tool like those DAERA or SEPA have developed.
  • The UK Statistics on Waste bulletin highlights the extent to which England is meeting its waste and recycling targets. We think it would be helpful if the local authority collected waste management bulletin also commented on this aspect of the statistics.
  • The waste data collection process in England is more complex than in other UK countries due to the large number of local authorities and the involvement of a contractor (Jacobs). To help users understand the flow of data through the system, the team could add a process map that illustrates the different stages and QA arrangements.
  • We encourage the team to explore the feasibility of developing a metric like SEPA’s carbon metric, which measures the whole-life carbon impact of waste. Such a measure would add insight on waste’s contribution to climate change in England.

Household waste statistics for Scotland (Scottish Environment Protection Agency)

Strengths

  • The team engages effectively with users inside and outside government and employs a range of approaches to understand use and listen to users. It works closely with policy teams in SEPA and analysts in the Scottish Government and Zero Waste Scotland. It also engages regularly with academics and industry bodies through Scotland Waste Data Strategy activities. In 2019, the team conducted a user survey to better understand the users of the waste statistics and their views on the presentation and content of the statistics bulletins, and to identify any gaps in meeting user needs. We encourage the team to continue this proactive engagement to ensure that the statistics are relevant and insightful for all types of users.
  • The Waste Data Strategy hub on Scotland’s Environment web, maintained by the statistics team, is an excellent repository of waste-related information in Scotland. It brings together news and updates, documents from user events, guidance, case studies and data. The two interactive data tools on the website aid interpretation of the statistics by allowing users to produce customisable charts and data sets.
  • The users we spoke to told us they valued the statistics bulletins and datasets as they provide fixed points of reference which reduce the risk of misinterpretation. The bulletins give a good overview of the short- and medium-term trends in the statistics. The commentary is impartial and reasons for changes are discussed. Information about waste policies, regulations and targets in the quality report and on the SEPA website is clear and helpful.
  • The carbon metric, which measures the whole-life carbon impact of waste, is innovative, insightful and well-established, giving an indication of waste’s contribution to climate change in Scotland. We look forward to seeing the continued refinement of this valuable metric.
  • The QA process for the data is rigorous and continually improving. The users we spoke to recognise the effort the team invests in maintaining data quality. The team recently conducted a survey to better understand the QA principles, standards and checks undertaken by local authorities, which provided useful insight. It is also good that the team developed an automated data validation tool for local authorities to improve and standardise the level of QA.

Areas for improvement

  • It is important to be transparent about the outcomes of individual user engagement activities. We think it would be helpful to reinforce user engagement by publishing a summary of responses to the 2019 user survey, explaining how the team intends to respond to feedback and improve the statistics.
  • The accessibility of the waste statistics landing pages and other waste pages on the SEPA website needs to be enhanced. These pages are not as user-friendly as they could be for non-expert users and information across pages is often inconsistent or out-of-date. For example, the publication calendar on the household waste data page confuses the ‘date when published’ with the reporting period and the links on the waste statistics regulations page are old or broken. All pages would benefit from a content review and refresh. Also, the users we spoke to told us the team may assume a higher level of user knowledge than is realistic. The team should consider the needs of different types of users when producing the bulletins, data tables and quality report to ensure that the information is accessible to non-expert users.
  • To make users aware of the coherence and comparability of the Scottish statistics with those produced by the other UK countries, we think it would be helpful to report and comment on trends in waste and recycling calculated using the harmonised UK ‘waste from households’ measure. The insight and relevance of the household waste bulletin could be enhanced by commenting on the extent to which the Scottish Government is meeting its waste and recycling targets.
  • The most recent edition of the quality report (covering 2017 data) was published in July 2019, a year and a half after the end of the reporting period. To ensure that quality and methods information is timely and supports interpretation of the statistics at the time they are published, it should be published alongside the statistics. The quality report should also explain how the carbon metric is calculated and describe the strengths and limitations of the approach used.
  • Published information about QA arrangements is limited. We recommend that the team apply our Quality Assurance of Administrative Data (QAAD) framework and publish a summary of its findings to assure users of the comprehensive QA process. It should map the flow of data through the system to help users understand the quality at all stages of the production process. We encourage the team to publish a summary of findings from the QA survey of local authorities as part of this documentation, to highlight the variation in approaches and data quality.
  • The list of recipients with pre-release access (PRA) should be published on the SEPA website and be reviewed regularly.

Local Authority Municipal Waste Management Statistics for Wales (Welsh Government)

Strengths

  • The statistics landing page is user-friendly and explains why waste data are collected and where they come from. All data are published on the StatsWales website, which allows users to produce and download customised data and tables and charts. Metadata, summarising the main limitations and caveats, are published alongside the data tables and charts to help users interpret the statistics.
  • The annual bulletin is well-structured and captures the main trends in the statistics. It highlights the extent to which the Welsh Government is meeting its waste and recycling targets and contains links to related Welsh statistics and waste statistics from the rest of the UK. The information boxes spread throughout the bulletin define the key terms and measures and explain potential inconsistencies in the data.
  • Changes in data quality, such as the improvement in data accuracy which led to a recent revision in the recycling rate, are explained in the quality report. The quality report also highlights potential sources of bias in the data due to, for example, the splitting out of household and non-household waste (which some local authorities collect together).

Areas for improvement

  • Apart from a 2017 user consultation on changes to outputs, we found little evidence of proactive engagement with users, particularly those outside government. To ensure that the statistics are relevant and insightful for all types of users, the team should aim to establish an ongoing dialogue with a range of users and involve them in the development of the statistics.
  • The insight of the bulletin commentary could be enhanced by discussing reasons for changes over time. The bulletin, quality report and statistics landing page would benefit from more-detailed information or links on waste policy in Wales and the rest of the UK, to help users contextualise the statistics.
  • We encourage the team to explore the feasibility of developing a metric like SEPA’s carbon metric, which measures the whole-life carbon impact of waste. Such a measure would add insight on waste’s contribution to climate change in Wales.
  • Three organisations are responsible for the management of waste data and engagement with data suppliers in Wales – the Welsh Government, Natural Resources Wales, and the Waste and Resources Action Programme (WRAP). The roles and responsibilities of each organisation, and how they work together to deliver high quality data, should be explained.
  • To support user understanding of the data source (WDF) and methods, we think it would be helpful to explain how the statistics are calculated from the data submitted by local authorities. This should cover the question format and how it has changed over time to enable collection of more granular information on end destinations of waste.
  • The quality report contains a basic description of QA arrangements, but the level of detail is not proportionate to the complexity of the data. For example, it does not cover the checks and validation carried out by local authorities. To reassure users about data quality, the team should produce more-thorough documentation that maps the flow of data through the system. It may like to consult DAERA’s Administrative Data Source Quality Report for an example of this.

Northern Ireland Local Authority Collected Municipal Waste Management Statistics (DAERA)

Strengths

  • We welcome that the team has applied the learning from our compliance check of the Northern Ireland June Agricultural Census statistics to other DAERA statistics, including the waste management statistics, to enhance their quality and value. Recent improvements have focused on making the statistics and data more accessible and reusable for a wide range of users through the development of new outputs (such as infographics, an interactive dashboard and a time series dataset), and improving and publishing more-detailed information about QA arrangements (see below). It demonstrates a commitment to continuous improvement.
  • We are pleased to see recent proactive engagement with external users of the statistics, for example, through a workshop in early 2020. This provided valuable information on how the statistics are used and feedback on the presentation of the statistics, which is being used to drive improvements. The team has also promoted the statistics via the DAERA statistics user group newsletter, created to inform users during the pandemic. We encourage the team to continue building its network of external users and involve them in the development of the statistics.
  • The annual bulletin is informative and engaging, providing a coherent overview of waste and recycling in Northern Ireland. It presents estimates of the recycling rate using both the Northern Ireland household waste definition and the harmonised UK measure (‘Waste from Households’) and helpfully compares the recycling rates between UK countries. It contains clear descriptions of relevant policies and detailed and impartial commentary on progress against waste and recycling targets.
  • The published quality information is comprehensive. In addition to the clear descriptions of data sources and methods in the annual bulletin, an Administrative Data Source Quality Report is published which discusses in depth the WDF system and the quality assurance arrangements at all stages of the production process, including the checks and validation carried out by local authorities, the Northern Ireland Environment Agency data control team and the statistics team.

Areas for improvement

  • To enhance the usefulness of the Administrative Data Source Quality Report, a process map illustrating the flow of data through the system could be added. To help users contextualise the data, key quality and methods information, including limitations and caveats, could be added to the interactive dashboard and datasets.
  • We encourage the team to explore the feasibility of developing a metric like SEPA’s carbon metric, which measures the whole-life carbon impact of waste. Such a measure would add insight on waste’s contribution to climate change in Northern Ireland.

Ed Humpherson to Roger Halliday: Use of COVID-19 prevalence rates by Scottish Government

Dear Roger

Use of COVID-19 prevalence rates by Scottish Government

On 3 July, First Minister Nicola Sturgeon claimed in her COVID-19 speech that the prevalence of the virus in Scotland was five times lower than in England. The sources used to underpin this claim have been difficult to identify, and the explanation provided to my team at the Office for Statistics Regulation was not clear. You have now explained to us that the Scotland prevalence figure was sourced from Scotland’s COVID-19: modelling the epidemic (issue no. 6), published on 25 June, and the England prevalence figure was sourced from modelling work done by the London School of Hygiene and Tropical Medicine, using a UK estimate as a proxy for England. The UK estimate for the dates in question was not published and was provided to you to allow for this comparison. A UK prevalence figure is available on the London School of Hygiene and Tropical Medicine website; however, this is a real-time report with an unclear update schedule.

As the UK/England prevalence rate was not available publicly, we understand that you then compared the upper prevalence rates published in Scotland’s COVID-19: modelling the epidemic (issue no.6) 25 June and the Office for National Statistics’ COVID-19 Infection Survey pilot: 25 June. This was done to corroborate the figures from the London School of Hygiene and Tropical Medicine and referred to externally as the other data were not publicly available.

When unpublished figures are quoted in the public domain, we expect that this information is shared with the media and the public in a way that promotes transparency and clarity. There are lessons to be learnt in this case, with different data sources being quoted to the media and to us. We expect that any figures used are appropriately sourced, explained and available in the public domain.

Furthermore, it is important to recognise that a comparison of COVID-19 prevalence rates is not straightforward. If it is to be undertaken, the results and the uncertainties should be communicated transparently. We do not think that the sources above allow for a quantified and uncaveated comparison of the kind that was made. In future if such comparisons are made, we would expect to see sources made publicly available and a clear explanation of the limitations and associated uncertainty.

The Office for Statistics Regulation will continue to monitor Scottish Government’s use of statistics and data.

Yours sincerely

 

Ed Humpherson

Director General for Regulation

 

Related Links

Miles Briggs MSP to Sir David Norgrove – Use of COVID-19 prevalence statistics by Scottish Government

Sir David Norgrove to Miles Briggs MSP – Use of COVID-19 prevalence statistics by Scottish Government

 

Mark Pont to Roger Halliday: Scottish Index of Multiple Deprivation 2020 statistics

Dear Roger  

SCOTTISH INDEX OF MULTIPLE DEPRIVATION 2020 STATISTICS  

I am writing to you following our review of the Scottish Government’s Scottish Index of Multiple Deprivation 2020 (SIMD) statistics against the Code of Practice for Statistics. The statistics were reviewed against the three pillars of the Code: trustworthiness, quality and value.

SIMD statistics are an important tool for identifying the most disadvantaged areas and for supporting decisions about addressing local needs. They are widely used by central and local government and community organisations to target their services. The statistics have been considered as part of a wider review of the indices of deprivation statistics in Great Britain, alongside our compliance checks of the statistics produced by the Welsh Government and the Ministry of Housing, Communities and Local Government (MHCLG).

I am pleased to confirm that these statistics should continue to be designated as National Statistics, subject to the required actions that we describe later in this letter.

We found several positive examples in the way Scottish Government engages with others to produce the statistics: 

  • The statistical bulletin contains a mix of infographics, maps and high level commentary which draw out the key findings from the statistics. The bulletin uses data visualisation effectively to compare individual domains with the overall SIMD for regions, which helps illustrate to users that areas do not have to be deprived across all measures to experience deprivation.  
  • The Scottish Government has a clear understanding of the uses and users of the statistics and engages regularly with domain experts and key stakeholders to inform the development of the statistics. The frequency of SIMD updates is tailored to user need: the team told us that updating SIMD too regularly can be a burden on local authorities and third sector organisations, who use SIMD in their own analyses.
  • The team has sought to bring out the public value of SIMD and its impact on local areas in Scotland, as part of the materials it publishes. The bulletin contains case studies from Scotland’s Regeneration Forum and the University of Glasgow’s Dumfries Campus, to demonstrate the relevance of the statistics to users.  
  • The team spoke highly of its relationship with the other nations. The ‘four nations group’ meets regularly and works collaboratively to make guidance and presentation across the deprivation statistics more consistent. The definition of deep-rooted deprivation, introduced previously to SIMD, has been adopted by the Welsh Government as a result of this collaboration.

We have identified several areas where we consider that some improvements would benefit users in understanding how SIMD is put together and would ensure that SIMD fully meets the requirements of the Code of Practice relating to trustworthiness and public value of the statistics:  

  • We commend the Scottish Government for seeking to contextualise SIMD by using case studies throughout its products and, in particular, to demonstrate its relevance beyond looking at rural areas. However, in some instances the case studies appear to dominate the products and the data are somewhat lost. For example, the SIMD Illustrated Story, in comic-book style, is the second link on the landing page, before any technical information. While a valuable example of use, some of the content is about the area being illustrated – its potential and recent developments, for example – and giving it such prominence risks undermining the perceived independence of the statistics and the statisticians. We consider that it would be better for this to have lower prominence on the website.
  • The methodology report has not been updated for SIMD 2020 and the landing page currently directs users to technical guidance for SIMD 2012, despite a comprehensive methodology report having been produced for SIMD 2016. The 2012 guidance does not illustrate to users how the SIMD is constructed. Scottish Government should produce an update for SIMD 2020, bringing out the relevant elements of the 2012 and 2016 guidance as appropriate, and improve the signposting of information relating to SIMD.  
  • The team told us that it faced delays in receiving the income and employment data for SIMD 2020 as a result of needing to establish contacts with data suppliers and the relevant data management teams. To build resilience in the team, Scottish Government should look to document how the legal gateway for accessing the data was determined, and who was involved in this process, so that this can be referred to in the next iteration of SIMD.  
  • The ability to combine and compare indices of deprivation across the devolved nations continues to be an area of interest for some users. Each of the producers we spoke to said they deal with queries relating to this on a regular basis, despite their joint effort to set out in the statistical releases how the statistics can and can’t be used. Scottish Government, as part of the ‘four nations group’, should look to ensure that appropriate resource is devoted to developing updated UK-wide guidance and insight. 

Our Labour Market and Welfare team will continue to engage with you and your team in the coming months to follow up on areas that have been highlighted for improvement. In particular, we think it would be helpful if the improvements to the technical guidance and signposting of information were made by September 2020. We would like to thank the team for its engagement and cooperation throughout the review process.

I am copying this letter to Elizabeth Fraser, the responsible analyst.    

Yours sincerely 

 

Mark Pont 

Assessment Programme Lead  

 

Related Links

Mark Pont to Siobhan Carey: Northern Ireland Multiple Deprivation Measure

Mark Pont to Sandra Tudor: English Indices of Deprivation 2019 statistics

Mark Pont to Glyn Jones: Welsh Index of Multiple Deprivation 2019 statistics

Scott Heald and Roger Halliday response to Ed Humpherson: The use of unpublished statistics in Scottish Government news release

Dear Ed,

The use of unpublished statistics in Scottish Government news release

Thank you for your letter of 8 July relating to the use of an unpublished statistic on routine serology (antibody) testing in a Scottish Government news release on 23 June. This was an oversight, for which we apologise. As you are aware, there is very high demand for statistics relating to COVID-19 and the statistical teams in Public Health Scotland and Scottish Government are producing a large volume of statistics every day. On this occasion, the fact that this statistic had not been published was missed by the teams in both organisations.

Scottish Government and Public Health Scotland have been working closely during the COVID-19 period to ensure we are presenting as complete and coherent a picture of COVID-19 in Scotland as early as possible. We do recognise the importance of the statistics on antibody testing and the need to get these into the public domain. These data are currently still in development but, given the public focus on these statistics, Public Health Scotland will release an initial ad hoc report on antibody testing on Wednesday 15 July. This will be published alongside the Public Health Scotland weekly COVID-19 report.

If you have any further questions, please don’t hesitate to contact us.

Scott Heald & Roger Halliday

 

Related links

Ed Humpherson to Scott Heald and Roger Halliday: The use of unpublished statistics in Scottish Government news release

Ed Humpherson to Scott Heald and Roger Halliday: The use of unpublished statistics in Scottish Government news release

Dear Scott and Roger

The use of unpublished statistics in Scottish Government news release

On 23 June, a news release ‘COVID-19 antibody testing’ was published on the Scottish Government website detailing a letter written to NHS boards by the Chief Medical Officer Dr Gregor Smith about the use of serology testing for COVID-19 in Scotland. The news release reported that ‘so far, 4,431 antibody tests, for surveillance purposes, have been completed’. There is no reference to the data source of the figure quoted in the news release.

To date, there has been no publication of routine serology (antibody) testing in Scotland. As such, this figure cannot be verified. This is unacceptable for a figure of such importance used in a government news release.

We understand that wider plans are underway around the data collection and publication of serology information in Scotland. Nevertheless, our expectation, as set out in our public statement of 18 May, is that any data used publicly by Government should be published in an accessible form, with appropriate explanations of context and sources. Antibody testing data are of particular interest to the public and to ensure public confidence and equality of access, we urge you to publish the data quoted in the Scottish Government news release.

Yours sincerely

Ed Humpherson

Director General for Regulation

 

Related Links

Scott Heald and Roger Halliday response to Ed Humpherson: The use of unpublished statistics in Scottish Government news release

Compliance Check of Rural Scotland Key Facts

Dear Roger

 RURAL SCOTLAND KEY FACTS STATISTICS

 We recently conducted a review of the compliance of Scottish Government’s Rural Scotland Key Facts statistics against the Code of Practice for Statistics. We are pleased to confirm that these statistics can continue to be designated as National Statistics, subject to your team addressing the recommendations highlighted below.

We recognise the impact that key policy areas have on rural communities, and the importance of having high quality statistics that provide insight into rural Scotland and feed into mainstream policy development. We found various examples of good practice linked to the quality and value of the Rural Scotland Key Facts statistics. The Scottish Government’s Urban Rural Classification, which is revised every two years to maintain relevance, provides a standard definition of rural areas in Scotland and is applied to the most up-to-date data sources to produce the statistics. These provide a strong evidence base that allows policy areas specific to Scotland to be addressed. We commend the detailed, insightful commentary and accompanying charts and graphs which ease the interpretation of these statistics.

We have identified four areas where we consider your team could further enhance the trustworthiness, quality and value of these statistics:

  • Pre-release arrangements are not published on the Scottish Government website. Your team should record and publish the details of those granted pre-release access alongside clear justifications for access. We encourage the team to regularly review the list of recipients to keep it to a minimum.
  • A wide range of well-established data sources are used to compile the Rural Scotland Key Facts statistics. However, further information on the nature and limitations of these data sources, including changes to data sources, would help users better understand the quality of the statistics. We also encourage your team to clearly label data sources designated as National Statistics throughout the bulletin to ensure users are clear on the quality and value of each statistic.
  • We recognise that coherence of the Urban Rural Classifications across the UK is not possible due to the unique nature of each country’s population structure and policy needs. Within Scotland there are differing variations of the Classification that are being developed and used by other organisations to help answer different questions about rural Scotland. We encourage your team to continue to engage with these organisations and, where appropriate, to signpost different Classification variations within the publication to raise awareness of the different ways of defining rural areas in Scotland. This will ensure that the statistical picture is as clear as possible to all users.
  • Generally, the bulletin includes no indication of the level of uncertainty around the estimates, such as confidence intervals. We understand that the data suppliers quality assure the data and produce the estimates (by applying the Classification to their data) on the Scottish Government’s behalf. Where possible, we encourage your team to work with the data suppliers to enable appropriate descriptions of the level of uncertainty. Integrating this information with the bulletin would clarify to users that the statistics are estimates and help them interpret trends in the statistics.

We would like to thank your team for their positive engagement throughout this review process.

Our Agriculture, Energy and Environment domain lead, Job de Roij, looks forward to continuing to engage with your team on these and related statistics.

I am copying this letter to Neil Henderson and Stephen Smith, the responsible statisticians, and Alastair McAlpine, the Senior Statistician for Agricultural Statistics, at the Scottish Government.

Yours sincerely

Mark Pont

Assessment Programme Lead

 

Temporary exemption from Code of Practice for Scottish Government market sensitive publications

Dear Roger,

Thank you for your letter of 26 March 2020. I am happy to confirm the exemption from the Code of Practice for Statistics to permit an earlier release time for the Scottish Government’s market sensitive publications.

Your decision is sensible and proportionate in the circumstances.

Yours sincerely

Ed Humpherson
Director General for Regulation

 

Related Links:

Roger Halliday to Ed Humpherson (March 2020)

Ed Humpherson to Welsh Government (March 2020)

Ed Humpherson to NISRA (March 2020)

Scottish Government request to temporary change to timing of key Scottish economic statistics

Dear Ed

Proposed temporary change to timing of key Scottish economic statistics

You will be aware of developments at the ONS in respect of their plans to publish market sensitive statistics at 7.00 am.

Historically, the Scottish Government has co-ordinated the publication of our labour market statistics with the corresponding release from the ONS and, if you are content, it would be our intention for that to continue.

Accordingly, I would be grateful if you could indicate your support for the Scottish Government moving the release time for our labour market publication Key Economic Statistics to 7.00 am, in line with contingency arrangements being implemented by the ONS and in line with the other Devolved Administrations. I appreciate that this represents a departure from the requirements of the Code, but consider it prudent as the releases complement one another and it would help ensure equal access for all.

We will of course keep this interim arrangement under review, in conjunction with colleagues at the ONS and in the other devolved authorities, and involve you as appropriate in any further developments.

Best wishes

Roger Halliday
Chief Statistician and Data Officer

Related Links:

Ed Humpherson to Roger Halliday (Scottish Government) (March 2020)

Welsh Government to Ed Humpherson (March 2020)

NISRA to Ed Humpherson (March 2020)

Devolved Labour Market Compliance Check

Dear all

STATISTICS ON THE LABOUR MARKET

I am writing to you following our recent review of the key Labour Market statistical reports for the devolved nations against the Code of Practice for Statistics. We reviewed the statistics published by the Welsh Government, the Scottish Government and the Northern Ireland Statistics and Research Agency (NISRA).

The statistics have been considered as part of a wider review of labour market statistics, along with our assessment of UK employment and jobs statistics produced by the Office for National Statistics (ONS). I am pleased to confirm that all three labour market reports should continue to be designated as National Statistics.

Labour market statistics are key economic indicators which are used by a wide range of users and are subject to high user interest. This review focussed mainly on the quality and public value of the data, statistics and supporting information. We recognise that the outputs we have reviewed differ between the three countries in terms of purpose, each team’s access to the underlying data and the time window available to produce them. These factors have been considered as part of our recommendations.

In reviewing the labour market reports, we found examples of clear supporting methodology information, effective sign-posting and presentation of uncertainty, which we detail separately for each country later in this letter. We have identified some common areas for improvement across the three producer teams, which also correspond to areas for improvement we have highlighted in our assessment report of ONS’s employment and jobs statistics. The recommendations in this letter build on those we have made to ONS and we encourage all four countries to continue to work together to ensure that labour market statistics across the UK provide the necessary insights. In order to improve the quality and public value of these statistics, the teams should:

  • Consider how the statistics can be better presented to help improve users’ understanding of how the labour market is changing over time. We found some examples where the key labour market measures are defined but the relationship between these groups of people (for example, the unemployed and the economically inactive) could be more clearly explained. We encourage the three producer teams to also work with ONS to develop a way to understand the flows of people into, out of and within the labour market.
  • Build on existing collaboration between all the producer teams, including ONS, to enhance the coherence of labour market statistics. We found strong evidence of effective cross-producer collaboration through regular meetings and steering groups. However, discrepancies between the Labour Force Survey (LFS) and the Annual Population Survey (APS) data currently present issues with coherence of data sources. Greater collaboration could support a consistent approach in presenting data from the LFS or APS respectively and, in turn, lead to a better read-across between the different countries’ statistics. This will require leadership and coordination from ONS and is highlighted in our assessment report of employment and jobs statistics (para 2.5). It could also prove an effective part of finding a solution to address the concerns raised by the Scottish Government and Welsh Government about the future funding of the APS, as explained in the ONS assessment report (para 1.9).

Welsh Government

  • Key Economic Statistics is well presented and the narrative provides relevant context to the statistics. The section on ‘Key quality information’ is appropriately detailed and provides useful information on data sources and methods. The bulletin includes links to supporting documentation and StatsWales data tables throughout. This could be further improved by signposting relevant sections from the ‘Key quality information’ within the main body of the bulletin to help aid understanding.
  • The statistics team has presented confidence intervals for the LFS estimates which provides some context for the level of uncertainty associated with the data. However, these are relatively inaccessible, and the language used in the narrative presents the latest figures as absolute, for example “The employment rate in Wales was x%”. This is particularly important when comparing data across the four countries, where estimated differences are not always statistically significant. Welsh Government should improve the way uncertainty is reflected in the narrative, following the lead of ONS as recommended in our assessment report, for example referring to the latest figures as estimates.
  • We were pleased to hear from the statistics team about its plans to potentially introduce a new bulletin covering protected characteristics in the labour market, which is an area of interest identified in its 2012 user consultation. We encourage Welsh Government to keep published statistical development plans up to date and to ensure users are aware of progress being made against these developments.
  • The statistics team told us that some of the main users of Key Economic Statistics go straight to the data tables to find the information they require and not the bulletin by default. We encourage Welsh Government to find out how its users engage with the various statistical outputs to ensure they remain relevant to users.

Scottish Government

  • The Labour Market Trends bulletin is easy to follow and we welcome the improvements that have been made to the presentation of chart headings and footnotes. The bulletin signposts to the new quarterly youth APS publication, which was previously included in the monthly LFS bulletin, as well as a number of ONS pages relating to the LFS. To improve clarity of the statistics, Scottish Government should look to expand on the methodology information within the bulletin itself.
  • The statistics team told us that the process for producing the monthly bulletin has largely been automated to ensure the statistics can be published at the same time as the ONS release. As a result, the narrative in the bulletin focuses on the latest figures and the change on the previous quarter or year. We would encourage the statistics team to consider how to bring out more insight from the statistics to improve their public value.
  • We welcome the work Scottish Government and ONS are doing to ensure uncertainty is properly reflected in the bulletin, as part of our recommendation in the assessment of employment and jobs statistics, to help users understand the precision of estimates. For example, the statistics team should avoid presenting figures as absolute in the headline infographic, such as “x% of people aged 16 to 64 were in employment”, and instead refer to the latest figures as estimates.
  • We were pleased to hear from the statistics team about its ongoing user engagement and its plans for developing alternative products for accessing the data to complement the ScotGov open data platform and to meet a range of users’ accessibility needs. The statistics team also told us that its economic statistics development plan is being updated to cover a wider range of economic statistics than in previous years. We would encourage Scottish Government to increase the visibility of its developments by publishing updates and outcomes of user engagement, to highlight the good work it is doing in this area and to keep users informed of its plans and progress.

Northern Ireland Statistics and Research Agency

  • NISRA’s Northern Ireland Labour Market report is engaging and comprehensive. The narrative is proportionate to the statistics and the ‘Further information’ section of the report is thorough in addressing the strengths, limitations and comparability of the data.
  • The presentation of uncertainty in the bulletin and supporting materials is effective, for example including statistical significance and confidence intervals of estimates, and we are pleased to hear the report is being used as a case study for presenting uncertainty by the Government Statistical Service’s Good Practice Team. To improve this further, NISRA should ensure comparisons between Northern Ireland and the UK also take into account the level of uncertainty for the estimates.
  • The team carried out a user consultation of labour market statistics in 2019 and has published its planned developments in response. We would encourage NISRA to seek feedback on its progress against the developments and continue to collate feedback on its various statistical outputs.

We appreciate each of the teams’ willingness to engage with us in this review as well as the wider assessment process with ONS. We wish to thank them for taking on board our recommendations. Our labour market and welfare domain team will continue to engage with your teams over the coming months to discuss progress.

I am copying this letter to Melanie Brown (Welsh Government), Gayle Mackie (Scottish Government) and Cathryn Blair (NISRA), the lead statisticians.

Yours sincerely
Mark Pont
Assessment Programme Lead

Related Links:

Assessment Report: UK employment and jobs statistics (March 2020)

Assessment of the UK employment and jobs statistics (March 2020)