Scott Heald response to Ed Humpherson: Presentation of findings from the Discharges from NHSScotland hospitals to care homes report

Dear Ed

Presentation of findings from the Discharges from NHSScotland hospitals to care homes report

Thank you for your letter of 14 January 2021 relating to the above report. As you know, Public Health Scotland is building on the good practice in the publication of statistics from ISD Scotland, one of its constituent parts, and I’m pleased that you have recognised the team’s efforts to present the information in the report in line with the Code of Practice for Statistics. This report was produced by experts from within Public Health Scotland and the Universities of Edinburgh and Glasgow.

Your specific pointers on the complex statistics presented in tables 10 and 11 are helpful. We recognise that the analysis undertaken in section 2 of the report is complex and may therefore be confusing for some readers. Given this complexity and the high-profile nature of the report, the report team took part in a number of briefings for key stakeholders, including the media, to walk them through the results and give them the opportunity to ask questions. We have also received direct feedback from users of the report. Our aim is to be as open and transparent about the analysis and conclusions as possible, and your comments, together with the feedback from users (through the briefing sessions and direct correspondence), will enable us to provide even greater clarity for readers in interpreting the statistics in subsequent iterations of the report.

The team are currently working on further analyses which we intend to publish as an update to the existing report – we will pre-announce publication dates in due course.  At the same time, we will take the opportunity to update the existing sections in the report to take on board your comments and ensure all aspects of the analyses undertaken are explained in an understandable way for all users.

The team would also be happy to engage directly with the individuals who have contacted you so that we can talk through their points.

If you have any further questions about this report, please don’t hesitate to contact me.

Many thanks,


Scott Heald

Head of Profession for Statistics, Public Health Scotland.


Cc   Dr Jennifer Burton, University of Glasgow

Professor Bruce Guthrie, University of Edinburgh

Fiona Mackenzie, Public Health Scotland.



Ed Humpherson to Scott Heald: Presentation of findings from the Discharges from NHSScotland hospitals to care homes report

Dear Scott

Presentation of findings from the Discharges from NHSScotland hospitals to care homes report

On 28 October, Public Health Scotland (PHS) published the report ‘Discharges from NHSScotland hospitals to care homes between 1 March and 31 May 2020’. This was commissioned by the Cabinet Secretary for Health and Sport on 18 August and you collaborated with experts from the University of Glasgow and University of Edinburgh in the production of this report.

As you are already aware, we have received correspondence which includes a query around the presentation of the findings in the report. I would like to thank you and your team for your prompt and positive engagement with us on this.

This is a management information publication and we are encouraged to see that many of the principles of the Code of Practice for Statistics have been adopted by you in the publication of this report. For example, the publication date of the report was pre-announced via the PHS statistical release calendar, and a pre-release access list was published. A comprehensive methodology note was published which includes the data sources used, the level of quality assurance and data completeness. This is of particular importance, given the data challenges you faced for the research.

We understand that the methods used were based upon a similar study by Public Health Wales. We have reviewed the communication of findings in consultation with method experts from the Office for National Statistics (ONS). We consider the report is clear, thorough and transparent about the limitations and quality of the dataset and we recognise that the data and analyses undertaken were complex.

When reporting statistical analyses, the uncertainty of estimates and confidence intervals should be communicated in a measured and clear way. Whilst we understand that communicating complex statistical messaging can be challenging, we consider there are lessons to be learnt from this case and expect the improvement points (Annex) to be addressed in the presentation of any future statistical analysis of the care home data.

Yours sincerely


Ed Humpherson

Director General for Regulation

Communicating uncertainty

Often in analyses and reports of this nature, the main challenge is how best to express the strength of evidence and how to communicate statistical uncertainty. Whilst we recognise the difficulties in expressing this clearly, we consider there are some sections of the report that could be confusing to readers.

The hazard ratios in tables 10 and 11 show that while there is a statistical relationship between hospital discharge and care home outbreak in the univariate analyses, this relationship ceases to be significant in the adjusted models. This is clearly explained in the report, but some of the discussion of the uncertainty around the estimates left the report feeling a little inconsistent in its messaging. For example, on page 39 the report outlines that the estimated risk from hospital discharge is not statistically significant, and then proceeds to detail a best estimate of that risk. Greater clarity and consistency in the explanations would help the reader understand the findings of the statistical modelling.

Specific feedback on analysis of associations between any hospital discharge and outbreak (table 10)

The adjusted hazard ratio when looking at discharge compared to no discharge is 1.21, with a confidence interval of 0.94-1.54. Although this is not statistically significant, the fact that the lower confidence limit is close to 1 means that the result is marginal at this level of confidence. The section on interpreting table 10 states clearly and in bold that “hospital discharge was not statistically significantly associated with care home outbreaks (adjusted HR 1.21)”.

The conclusion section for this table then goes on to acknowledge that “the best estimate of the hazard ratio for hospital discharge is >1 and the confidence interval in the adjusted analysis is relatively wide. We therefore cannot statistically exclude the presence of a small risk from hospital discharge”. 

While it is good to see this discussion of uncertainty, this sentence feels quite technical and perhaps harder for a less experienced user to understand. It might also have been helpful to include this point in the section on interpreting table 10, alongside the statement of the non-significant finding, rather than in the conclusion. The way the information is presented in the report gives too much emphasis to the non-significant finding and not enough to the uncertainty. Presenting all of the information together (rather than under separate “interpreting table 10” and “conclusion” headers) would allow for a more balanced overall discussion of the statistical finding and the uncertainty around it.
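The relationship between a hazard ratio, its confidence interval and statistical significance discussed above can be sketched in a few lines. This is illustrative only: the standard error used below is a hypothetical value chosen to roughly reproduce the report’s published adjusted interval of 1.21 (0.94-1.54), not a figure taken from the analysis.

```python
import math

def hr_confidence_interval(hr, se_log, z=1.96):
    """95% confidence interval for a hazard ratio.

    Intervals for hazard ratios are computed on the log scale:
    hr is the point estimate, se_log the standard error of log(hr).
    """
    log_hr = math.log(hr)
    return math.exp(log_hr - z * se_log), math.exp(log_hr + z * se_log)

def crosses_one(lower, upper):
    """If the interval contains 1, the association is not statistically
    significant at the chosen confidence level."""
    return lower <= 1.0 <= upper

# se_log = 0.126 is a hypothetical value chosen so the interval roughly
# matches the report's adjusted estimate of 1.21 (0.94-1.54).
lower, upper = hr_confidence_interval(1.21, 0.126)
```

Because the lower limit sits just below 1, the interval only just includes “no association”, which is why the letter describes the non-significant finding as marginal.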

Specific feedback on analysis of associations between different types of hospital discharge and outbreaks (table 11)

When looking at the different types of discharge, we see adjusted hazard ratios of 1.00 for tested negative, 1.27 for untested and 1.45 for tested positive. Although the confidence intervals again suggest these findings are not significant, the observed ‘dose-response’ pattern in the adjusted hazard ratios is consistent with a causal relationship between positivity and outbreak. Given the sensitivity of the care home setting during this pandemic, and the likely uses of the evidence from this analysis, some users may have benefited from additional discussion of this in the report.

Similar to table 10, the statement made in the section on interpreting the findings for table 11 is that none of the differences are statistically significant. The conclusion for table 11 then goes on to state: “The analysis does not find statistical evidence that hospital discharges of any kind were associated with care home outbreaks. However, our certainty about the three types of hospital discharge defined by testing status varies.”

The first sentence could be read as finding no evidence for any of the types of discharge, but that reading is inconsistent with the second sentence.

The conclusion section for table 11 then goes on to provide some explanation of the uncertainty by stating that there may be a small risk where a person was untested and a moderate to large risk where they tested positive. This explanation was clearer and easier to understand than that for table 10. Again though, it would have been clearer if both the non-significant finding and the uncertainty around it were discussed in the section on interpreting the findings, rather than under separate headings.

Confidence intervals

Some of the confidence intervals around the estimates were very wide. In the report, there is some explanation around the uncertainty of wide confidence intervals and how caution is needed in interpreting them. In the section on table 11, however, you draw attention to the upper bound of a confidence interval of 374% and point to this as evidence of an association between a positive test and an outbreak. This seems more of a reflection of the sample size and the lack of statistical power – having less data would tend to make the upper bound of the confidence interval higher, but would not mean that the evidence was stronger. Given the importance of communicating uncertainty clearly, we encourage you to be careful in your discussion of confidence intervals and be clear where these indicate that we should be less confident overall in the robustness of the findings.
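The point that a high upper confidence limit reflects limited data rather than stronger evidence can be illustrated with a small sketch. The standard errors below are hypothetical values, not figures from the report; the standard error of log(HR) shrinks roughly with the square root of the number of events observed.

```python
import math

def ci_upper(hr, se_log, z=1.96):
    """Upper 95% confidence limit for a hazard ratio (log scale)."""
    return hr * math.exp(z * se_log)

# Hypothetical standard errors: quartering the number of observed events
# roughly doubles the standard error of log(HR), since it scales with
# 1/sqrt(events).
se_large_sample = 0.30
se_small_sample = 0.60

# Same point estimate in both cases; the smaller sample produces a much
# higher upper bound, reflecting weaker, not stronger, evidence.
upper_large_sample = ci_upper(1.45, se_large_sample)  # ~2.6
upper_small_sample = ci_upper(1.45, se_small_sample)  # ~4.7
```

A very high upper limit is therefore a sign of low statistical power, not of a strong association, which is the caution the letter raises about the 374% figure.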

Impact of methodological choice on findings

A clear rationale is provided in the report as to why the estimates need to be adjusted for care home size: “Larger homes will receive more discharges, and are also more likely to be services for older people, provide nursing care and be privately owned. These relationships between different care home characteristics means that simple comparison by single characteristics may be misleading.”

However, it is possible that there may be collinearity between hospital discharges and care home size, and the degree to which the model estimates may be affected by this was not covered in the report. It would be helpful for users to have more detailed explanation of how the possible relationship between discharges and care home size might have impacted on the findings.

Use of charts

The report makes good use of tables to display both univariate and adjusted hazard ratios along with their associated confidence intervals. This is very helpful and clear, but of most use to experienced readers. Using charts in addition to the tables would present the information in a more visual way that might be easier for less expert users to understand.

Links to further helpful information

The following resources available on the GSS website may be of interest for any future presentation of the care home data.

The Good Practice Team will also carry out reviews of statistical publications. Their page on communicating statistics on the GSS website provides further information on this.




Mark Pont to Rob Kent-Smith: Experimental Estimates of GVA using Double Deflation

Dear Rob

Experimental Estimates of GVA (Gross Value Added) using Double Deflation

I am writing to you following our review of ONS’s recent publication: Producing an alternate approach to GDP using experimental estimates of GVA. These new experimental estimates of GVA represent a significant step forward towards improving the UK’s National Accounts framework. Implementing double deflation should minimise industry-level bias in GVA statistics and, hence, make them more accurate and valuable. We decided to conduct this review in order to evaluate progress in the development of these estimates and to identify where users can expect further enhancement of the statistics going forward.

Producing volume estimates of GVA using a double deflation method is widely regarded as international best practice for calculating estimates of real-terms Gross Domestic Product (GDP(P)) and has been a key recommendation of several reviews of economic statistics in recent years.
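The difference between single and double deflation can be shown with a stylised example; the figures below are illustrative only, not ONS data. Double deflation deflates output and intermediate consumption by their own price indices and takes the difference, whereas single deflation applies one deflator to nominal GVA.

```python
# Illustrative figures only, not taken from the ONS publication.
output_value = 1000.0        # output at current prices
intermediate_value = 600.0   # intermediate consumption at current prices
output_deflator = 1.05       # output prices rose 5%
intermediate_deflator = 1.10 # input prices rose 10%

# Double deflation: deflate output and intermediate consumption separately
# by their own price indices, then take the difference to get GVA in
# volume terms.
gva_double = (output_value / output_deflator
              - intermediate_value / intermediate_deflator)

# Single deflation: deflate nominal GVA by the output deflator alone,
# which ignores the faster growth in input prices.
gva_single = (output_value - intermediate_value) / output_deflator
```

When input and output prices move differently, as here, the two methods give different volume estimates of GVA, which is why double deflation should reduce industry-level bias.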

We welcome the positive progress that has been made towards developing a double deflation approach to National Accounting and would particularly highlight the following:

  • There is a high degree of transparency regarding both the methodology and the development plans for these statistics. ONS has also published several iterations of a deflator strategy over the last few years, which further enhances user understanding of the way the estimates are produced
  • The background and context to the statistics are explained at a level that is appropriate for the kinds of people who will be engaging with these statistics, who will be largely experienced, technical users
  • The publication is clear about the experimental nature of these statistics and their current limitations
  • There has been good engagement with user groups throughout the development of these statistics. Efforts to produce an internal ‘deflator dashboard’ and share this with some users, in order to facilitate understanding of deflator strategy, are particularly commendable. Engaging with users in this way is likely to increase their understanding of the methods behind these estimates, and result in more meaningful conversations between producers and users going forward

As you continue to develop these statistics, the following points will be amongst the most important to consider:

  1. Continuing work alongside the ONS team producing Services Producer Price Indices (SPPIs), to identify what changes to SPPIs are required in order to make them suitable for use in the National Accounts framework, whilst also allowing them to remain valuable price indices in their own right. SPPIs can be a more accurate deflator of intermediate consumption than industry deflators, as they are likely to be more reflective of the price paid by industry. Currently 49 of the approximately 200 SPPIs are being used in the National Accounts and consideration should continue to be given as to which price indices will provide the best possible deflator for the respective consumption.
  2. Consider how best to communicate to users the greater volatility associated with double deflated estimates of GVA in comparison to figures produced using single deflation. We are pleased to hear that you are already endeavouring to understand the volatility of these estimates by engaging with international comparators, and would encourage you to share these findings with users once these have been gathered
  3. Consider how best to alert users of instances where a price index and a deflator differ. We understand that this will become a more common occurrence in the future, and it is important to ensure that users understand why this can happen and the resulting consequences for the statistics they use
  4. Consider how best to communicate the potential impacts of double deflation on labour productivity before the new approach is fully adopted into the UK National Accounts. The new framework for volume estimates of GVA should produce better estimates of productivity at the industry level. Given the interest in the UK’s productivity puzzle amongst users of productivity statistics, there is likely to be considerable interest in the potential impacts on UK productivity of implementing the new framework, and the implications should be made clear to users.

I’d like to thank you and your team for your positive engagement on this review. We have been very pleased to see the progress you have made with these double-deflated estimates, and look forward to seeing the further developments made going forward. I am copying this letter to Darren Morgan, Director of Economics Statistics Development at ONS.

Yours sincerely


Mark Pont

Assessment Programme Lead

Ed Humpherson to Deborah Lyness: NISRA’s weekly deaths statistics

This letter was updated on 22 December at 3pm to include the date of publication of the statistics (23 December)

Dear Deborah

Thank you for contacting my team about an exemption from the Code of Practice for Statistics to permit a later release time for NISRA’s weekly deaths statistics for the week ending 18 December 2020. You propose to publish on 23 December to avoid publication of the statistics on Christmas Day.

Given the circumstances for publishing this release earlier than usual, I am happy to confirm exemption from the standard publication time of 9.30am to permit a later release time of 4pm for these statistics.

Yours sincerely


Ed Humpherson

Director General for Regulation

Ed Humpherson to Iain Bell: Communicating models and methods to users of the Coronavirus (COVID-19) Infection Survey

Dear Iain

Communicating models and methods to users of the Coronavirus (COVID-19) Infection Survey

Thank you to you and your team for engaging with us as we investigated concerns about communication of the methods used to produce the estimates of the incidence rate of COVID-19 from the Coronavirus (COVID-19) Infection Survey. These concerns have centred on the availability of clear information about methods and quality that meets the needs of all types of users, and on developing an ongoing positive dialogue with users.

Estimates of COVID-19 positivity and incidence rates are widely used and play a vital role in informing the government response to the pandemic. These statistics are unusual in both their complexity and their wide user base, so it is important that the modelling and data are presented in a clear and accessible way that meets the needs of this broad range of users. The challenge is that the needs of general users must be met by making the explanations accessible while, at the same time, the needs of technical users must be met by providing adequate detail about the methods used.

The additional information published in the bulletin on 11 December regarding the calculation of incidence rates demonstrates ONS’s commitment to improving its communication of the underlying methods. However, it is clear that the presentation of the statistics and the descriptions of the methods need to be much clearer still, for both general and technical users. For example, the distinction between the ‘official reported estimate’ and ‘modelled estimate’ of the incidence rate remains a source of confusion for users, particularly as the estimates are presented alongside each other.

Due to the complexity of the statistical models, it is also important that detailed information is accessible to technical users to allow them to evaluate the methods used and to fully understand the data and the way that estimates change due to the modelling. The pre-print academic paper (which has yet to be peer-reviewed), submitted by your academic partners, goes into considerable detail on the methods used. However, it would be helpful to reflect upon recent concerns raised by users and work with users to produce clear and accessible summaries of the rationale for the methodology choices made. This should include information about the assumptions made and how the performance of the models is evaluated and improved.

The COVID-19 Infection Survey continues to deliver valuable insight about the pandemic, at speed and to a wide range of users, and we support its ongoing development. We appreciate the resource constraints and pressure that you are under, but it would benefit ONS to demonstrate more clearly that it is open to engagement with users. This would be achieved by sharing information about the user engagement you are already undertaking and engaging more openly with a wide range of users, including those with new requests or questions.

We will follow up the points raised in this letter through our in-depth review of the survey.

Yours sincerely


Ed Humpherson

Director General for Regulation

Ed Humpherson to Professor Roger Halliday: Proportion of care homes in Scotland allowing and enabling indoor visiting

Dear Roger  

Proportion of care homes in Scotland allowing and enabling indoor visiting

I am writing about a statement made by the First Minister in Scottish Parliament on 17 September 2020. In response to an oral question from Willie Rennie MSP about the tragic situation of a constituent whose mother was in a care home, the First Minister said: 

 “Around 40 per cent of the care homes around the country now allow and enable indoor visiting” …  

We have received correspondence highlighting a Freedom of Information (FOI) response published by the Scottish Government on 5 November which sets out details of the source of this statement. While it is laudable that the Scottish Government collected these estimates, it is clear from that FOI response and from information you shared with us that the ‘around 40%’ figure is a very loose approximation based on incomplete data. It is unsurprising that there is some uncertainty in the data, but this should have been reflected more clearly in the response and associated published materials. For example, it should have been clearer that:

  • some health boards either did not know how many care homes had initiated Stage 3 visiting or had not supplied data; this matters especially because two of those health boards, Greater Glasgow and Clyde and Lanarkshire, represent a significant proportion of the care homes in Scotland
  • the figure referred to Stage 3 visiting which included both indoor and outdoor visiting 

The First Minister’s statement that ‘around 40%’ of care homes enable indoor visiting of residents suggested too much confidence in the figure. Some of this risk would have been mitigated if the management information had been published quickly in an accessible form. It should not have been necessary to wait for the information to be published as part of an FOI response.

With increased scrutiny of all decisions, and a greater thirst for timely information from the media and the public, it is unsurprising that ministers and other public figures find themselves quoting management information in public forums, such as in parliament and in media interviews. You will be aware of the Office for Statistics Regulation’s published guidance on the production and use of Management Information by government and other bodies during COVID-19, an important aspect of which is equality of access. You have confirmed to us your commitment to this principle, which I welcome.

I am very encouraged that Scottish Government has added detail to the visiting questions which ask boards for more information about care home safety. This seems to be a commendable initiative. I also welcome your commitment to make this data available as soon as possible with appropriate context and explanation to enable users to understand what the figures mean. 

Sir David Norgrove, Chairman of the UK Statistics Authority, is writing to Willie Rennie MSP to make him aware of this correspondence.

Yours sincerely 


 Ed Humpherson 

Director General for Regulation 


Mark Pont to Sandra Tudor: Land Use Change statistics and Land Use in England

Dear Sandra

Land Use Change statistics and Land Use in England

As you are aware, we recently completed our review of the compliance of the Land Use Change statistics against the Code of Practice for Statistics. I am pleased to confirm that these should continue to be designated as National Statistics. As part of this work, we also reviewed your official statistics on Land Use in England. We found several positive examples in the way that both sets of statistics are produced and presented by your team:

  • The statistics are clearly presented with the main points easily digestible. The team makes good use of tables, charts and maps to improve the clarity and ease of data interpretation.
  • The underlying data sets and sources are explained, and background information on the methodology and supporting quality assurance statement are published alongside the statistics to assist users in fully understanding the data. We welcome the team’s plans to publish an updated version of the methodology and quality assurance guidance to reflect the latest developments, including proposed changes to the team’s approach to measuring hectarage.
  • Demonstrating transparency through publishing a flow chart outlining the production and quality assurance processes, and publishing details of your assurances around the quality of the data.
  • The team has taken an innovative approach to increasing usability and adding to the user experience by using Power BI to produce an interactive report on Land Use and introducing topic factsheets. We welcome plans to explore the use of social media to broaden your reach to users and promote the statistics.
  • The positive relationship the team has with its data supplier – Ordnance Survey – which means any data queries identified during quality assurance can be dealt with effectively and your team is able to draw on Ordnance Survey’s technical expertise.
  • Plans to add further value to the Land Use statistics by publishing more detailed breakdowns of land use categories in response to user need.

Our review also identified several ways in which we consider that you could further enhance the trustworthiness, quality and value of the statistics:

  • To support transparency and ensure that all users are kept informed when Land Use Change statistics are delayed by late access to the latest Ordnance Survey data, publicly inform users through the appropriate channels, including the Land Use Change statistics webpage, of the reasons for the delay and the plans for forthcoming releases as soon as possible. Use these channels to communicate planned developments in advance and to allow users to feed in their views on proposals.
  • As definitions of land use have been, and can still be, developed further, it is important to communicate clearly to users where updated or improved definitions used by Ordnance Survey and your team have led to changes in the statistics or their interpretation over time, and to draw users’ attention to any potential discontinuities within the statistics and data tables.
  • Some of the commentary within the releases could be improved to:
    • add further insight to give a broader picture across topical aspects of land use where possible, by enhancing the statistical commentary to include material of relevant policies or related statistics on relevant topics (for example, Green Belt statistics or the National Planning Policy Framework).
    • aid users’ interpretation and understanding of the flow of land use between the different land use categories, for example by explaining the Sankey diagram used within the Land Use Change release more clearly.
    • communicate the definitional differences between the term ‘developed land’ as used in Land Use Change 2017-18, which is based on 2011 Census geographies, and as used in Land Use stock 2018-19, to avoid confusion between the two definitions.
  • Your team explained that, due to the highly visual nature of the releases, several constraints prevent publication of the statistics in the more accessible HTML format rather than PDF. We suggest speaking with the Government Statistical Service Good Practice Team, as it may be able to offer some advice.
  • Draw users’ attention to details of the Central Local Information Partnership (CLIP) meetings held on the Knowledge Hub website and consider publishing a summary of the relevant minutes and actions from the CLIP Planning Statistics Subgroup that your team attends, to demonstrate transparency about your approach to engaging with users and to help foster wider user engagement.

We appreciate the clear commitment shown to the continued development of these statistics through the public user consultation on proposed changes to the Land Use Change statistics and proposed additional statistics on Land Use stock. We suggest the team consider with users how greater insights could be drawn from using the two sets of statistics together, including exploring whether bringing them together into a single publication would add value. The team shared with us some of its future plans for getting the new releases peer reviewed within the wider directorate and with subject experts such as the Office for National Statistics geography team, as well as engaging with users about the new products. We welcome the news that work is progressing well with regard to procuring the new data from Ordnance Survey and particularly commend the team’s willingness to share knowledge and expertise with the other UK countries should they wish to develop their own land use statistics.

Your team told us that it would like Land Use statistics to be considered for a future National Statistics Assessment, which could include a potential reassessment of the Land Use Change statistics. Undertaking the steps outlined in this letter will go some way in preparing the statistics for this. We welcome the plans from your team to continue engaging with our Housing, Planning and Local Services domain as the statistics are developed over the coming months.

Please do not hesitate to get in touch if you would like to discuss any aspects of this letter further. I am copying this letter to Andrew Presland and Alexander Reynolds, the responsible statisticians, and Richard Field, head of housing and planning statistics at MHCLG.

Yours sincerely


Mark Pont

Assessment Programme Lead

Mark Pont to Clare Griffiths: Statistics on Cancer Survival in England

Dear Clare

Statistics on Cancer Survival in England

As you are aware, we recently completed our compliance check of Public Health England’s (PHE) statistics on Cancer Survival in England against the Code of Practice for Statistics. The COVID-19 pandemic led to the cessation of much non-urgent NHS treatment and care, which could have consequences for the survival of people with cancer. This means that these statistics will likely form a key role in examining the wider impact of the pandemic on the population. Following constructive conversations with John Broggio, the lead statistician for these statistics, I am pleased to confirm that these statistics should continue to be designated as National Statistics.

We found positives in the way that PHE produces and presents these statistics, which enhance their value and quality. For example,

  • PHE has streamlined separate geographical, adult and child survival statistics into one output which makes the information more coherent and presents some of the data in interactive graphs.
  • PHE is developing data on the survival of rarer cancers and planning to provide more granularity about cancers covering multiple areas of the body.
  • PHE plans to review the feasibility of linking data on income and deprivation to the cancer survival data, and to analyse and publish its findings.
  • The datasets present confidence intervals at the 95% level, to allow users to take into account uncertainty of the estimates provided in their use.

Our review also identified several ways in which we consider that the trustworthiness and value of these statistics could be further enhanced:

  • The methodology, references and related publications are all linked in the data tables and report, but these were first published by the previous producer (the Office for National Statistics) in 2019. To improve transparency and ensure that the documentation reflects the latest position, PHE should review and publish the methods and quality assurance documentation under its own name.
  • The statistical report discusses the main limitations and cautions against making comparisons across years, but the content is generally aimed at the expert data user, with little commentary and insight provided for those with less knowledge of the complexities of cancer survival. It is good that PHE plans to publish a summary in simpler language alongside the report; PHE should work with less-expert users to develop the narrative and enhance the insights offered to them.
  • PHE usually obtains a wide range of user views on its cancer statistics during its annual Cancer Data Conference. Unfortunately, due to the pandemic, this was cancelled for 2020. To supplement the stakeholder information currently available, PHE should consider engaging in some structured virtual stakeholder engagement exercises in lieu of the conference and use the feedback to develop the outputs further.

Thank you to you and your team for your positive engagement during this review. Our Health and Social Care team will continue to engage with PHE on progress in the coming months.

Please do not hesitate to get in touch if you would like to discuss any aspects of this letter further. I am copying this letter to John Broggio, Lead Statistician.

Yours sincerely

Mark Pont

Assessment Programme Lead


Related Links

Statistics on Cancer Survival in England

Ed Humpherson to Siobhan Carey: Assessment of Northern Ireland Planning Statistics

Dear Siobhan

Assessment of Northern Ireland Planning Statistics

We have reviewed the actions that your team has taken to address the requirements in Assessment Report number 350: Statistics on Planning in Northern Ireland.

On behalf of the Board of the UK Statistics Authority, I am pleased to confirm the awarding of National Statistics designation for Northern Ireland planning statistics.

These statistics are valuable as they provide relevant and trusted information on planning activity in Northern Ireland, and enable users to better understand the planning performance of Northern Ireland councils against statutory targets.

The team responsible for producing the statistics at the Department for Infrastructure (DfI) has responded very positively to the Requirements of our report. The team has made changes that enhance the accessibility of the statistics and the insight that they provide, increasing their appeal to a wider range of users. Users can now more easily access a range of data on which the statistics are based and will be more aware of planned future developments of the statistics, including those related to the transition to the new planning portal. In general, we welcome the team’s proactive approach to the continuous improvement of the planning statistics.

We are particularly pleased with the progress in exploring the feasibility of linking planning and house building data in NI, and the coordinating role that the team has played in building broad support for this work with statisticians across the UK. Being able to better understand the flow of planning approvals through to house builds was an issue raised by some users during the assessment, and also by numerous users across the UK that we spoke to during our 2017 systemic review of housing and planning statistics. The work undertaken by DfI statisticians to date is a significant step forward in addressing users’ key questions in this area, and we welcome the team’s continued engagement in cross-government discussions about this work.

We have included more detail about our judgement in an annex to this letter, which has been adapted from the DfI statisticians’ published action plan for meeting the assessment requirements. I, or my team, would be happy to talk you or your colleagues through any aspect of this letter or Code compliance more generally.

National Statistics status means that official statistics meet the highest standards of trustworthiness, quality and value and is something to be celebrated. For future releases, we invite you to include a statement alongside the statistics which reflects the National Statistics status.

I am copying this letter to Michael Thompson, Lead Statistician DfI; Tracy Power, Director of Analysis NISRA; and Ruth Fulton, Chief Statistician’s Office.

Yours sincerely


Ed Humpherson

Director General for Regulation


Annex A: Review of actions taken in response to Assessment Report 350: Assessment of Northern Ireland Planning Statistics, produced by the Department for Infrastructure, Northern Ireland (DfI)


Mark Pont to Sandra Tudor: Compliance Check of the English Housing Survey

Dear Sandra

English Housing Survey statistics including Estimates of leasehold dwellings

As you are aware, we recently completed our review of the compliance of the English Housing Survey (EHS) statistics against the Code of Practice for Statistics. I am pleased to confirm that EHS statistics should continue to be designated as National Statistics. As part of this work, we also reviewed your experimental official statistics on the number of leasehold dwellings in England, as the EHS is the main data source for these estimates, and have offered some suggested developments in the annex to this letter.

We initiated this review following the public commitment we made in our 2020/21 Regulatory Work Programme to focus on statistics about key issues within Housing. We appreciate the positive and constructive way that the team engaged with us during the review, especially as we continue through these challenging times. We found a range of positive features that demonstrate the trustworthiness, quality, and value of the statistics:

  • the EHS statistics are comprehensive, accessible and professionally compiled. The EHS headline report presents an authoritative account of the main trends in household numbers, tenure and housing conditions, supported by visuals to aid interpretation (for example, the movers by tenure diagram). The separate topic reports and factsheets add additional insight on a range of themes
  • extensive technical and quality information is clearly and accessibly set out. This includes an assessment of the EHS against the European Statistical System quality dimensions, standard errors and confidence intervals published routinely alongside the results, and a technical report setting the overall methods approach in detail
  • quickly moving the EHS data collection to telephone interviewing this year and adapting the physical survey to take place outside of the property, with plans to draw on Energy Performance Certificate (EPC) and Valuation Office Agency (VOA) data to inform the adapted physical survey approach
  • introducing an innovative follow-up panel survey this year to explore how the circumstances of 5000 households have changed since the start of the pandemic
  • plans to improve the timeliness of EHS headline results by one month from this year onwards by publishing in December rather than January, including the panel survey wave one results this year
  • regular engagement with the survey contractors – NatCen, the Building Research Establishment (BRE) and CADS Housing Surveys – through the year, and with policy users to inform decisions around EHS topic modules and thematic reports
  • developing experimental leasehold dwellings statistics to fill a longstanding data gap of policy and user interest. Further details are included in the annex to this letter
  • EHS data being a long-established source for wider reuse via the UK Data Service
  • plans to consider improvements to the accessibility of some EHS outputs where feasible, such as publishing in html and using tools, such as Power BI.

We identified some areas where the public value of the statistics could be improved, in order to continue to meet the high standards required of National Statistics:

  • To support transparency regarding planned developments of EHS statistics, including innovations to overcome the data collection challenges posed by COVID-19, publish details about future development plans for the EHS statistics and the team’s overall approach to user engagement, so that users are aware of planned changes in advance and are clear about the available channels to feed in their views
  • Adding further insight and value to the statistical commentary where possible, for example by including wider context such as referring to relevant policies and drawing on related third-party data sources (for example, additional insights from MHCLG Council Taxbase statistics on second homes)
  • The ability to compare housing conditions across the UK nations continues to be of interest for users. It is great that the BRE has produced, on behalf of the four UK nations, “A snapshot of housing conditions throughout the UK”. It would be helpful to provide more prominent links to this work to assist such users
  • Reduce the number of individuals granted pre-release access, wherever possible
  • Although the EHS outputs are primarily based on survey results, where additional administrative data are used (for example, EPC data), we expect producers to have their own assurance of the quality of each source and its suitability for their use individually, building on the assurances provided by the data suppliers. Producers should do this by engaging with data suppliers in a way that is proportionate to the materiality of each source in the production of the final statistics. Our administrative data quality assurance guidance may be useful to refer to when doing this.

The suspension of all face-to-face surveys due to COVID-19 has created challenges for the EHS data collection. We discussed this in detail with your statisticians and heard of their innovative work and plans in this context, such as moving to panel, telephone and online data collection, and trialling the use of an app for respondents to video record evidence inside the property for the physical survey. We welcome that EHS statisticians share knowledge of these planned developments with other UK and Republic of Ireland statisticians through the Five Nations House Conditions Surveys group. Given the uncertainty and changing nature of events, we welcome that the team has agreed to keep in contact with us as their plans progress.

Please do not hesitate to get in touch if you would like to discuss any aspects of this letter further. I am copying this letter to Reannan Rottier, EHS Lead Statistician.

Yours sincerely

Mark Pont

Assessment Programme Lead


Annex 1: Suggested developments for leasehold dwellings statistics

The published experimental leasehold dwellings statistics are a welcome innovation, developed in 2014-15 to fill a longstanding data gap of policy and user interest. The leasehold dwellings statistics can be used to inform the development and monitoring of housing policies at regional and national levels, including monitoring the impact of regulating the leasehold market. We welcome the improved timeliness of the 2018-19 leasehold dwelling statistics and support the team’s ambitions to publish them as part of the main EHS outputs in 2021. Our review has identified some areas that we suggest MHCLG address as part of its own developments, to enhance the trustworthiness, quality and value of the leasehold dwellings statistics:

  • Develop the statistical commentary to highlight the key questions that the statistics are able to answer, for example in relation to pertinent debates, relevant government policies, initiatives or targets
  • Better insight could be drawn from the statistics by describing the various approaches that have been taken to estimating the number of leasehold dwellings, and their strengths and weaknesses. For example:
    • explain more prominently in the bulletin the difficulties in accurately counting the number of leasehold dwellings, building on the material already included explaining why the EHS leasehold measure alone warranted a new approach
    • given the slightly confusing landscape of leasehold situations, explain more clearly the differences between sets of estimates such as:
      • why the number of social sector leasehold dwellings is low, at 234 thousand, compared with 4.2 million in the private sector
      • why social and private sector estimates are not comparable (i.e. that for leasehold properties in the social sector, the local authority or housing association is the leaseholder, not the social tenant).
    • Update the linked 2014 Residential leasehold dwellings in England Technical paper, to provide clarity around some aspects of methods and quality, including:
      • the statisticians’ assurances around the suitability of the individual administrative sources used. A low level of data quality concern is reported for the statistics overall, but the rationale for this rating is not clear and only a limited assessment of assurance is provided for the individual administrative sources. Assurance should be informed by an understanding of the data collection operational context and quality assurance for each source, and the nature of the arrangements and relationships held with third-party data suppliers. Our administrative data quality assurance guidance may be useful to refer to when doing this
      • explanations of how the statistics have been evaluated against additional sources available to triangulate the results or enhance insights (for example, the Regulator for Social Housing is due to publish new statistics on the number of units held by private registered providers on a leasehold and freehold basis in 2021)
      • details on the new calibration methods and the rationale for their use over the previous imputation approach
      • the strengths and limitations of the EHS sample design for measuring the extent of the leasehold population, given the skewed distribution of leasehold properties (with around two thirds of leasehold dwellings accounted for by flats in London and houses in the North West)
      • details of the matching methods used and the characteristics of unmatched cases
      • confidence intervals for lower-level estimate breakdowns to provide greater clarity around potential uncertainty in the estimates.