Mark Pont to Victoria Obudulu: Winter Coronavirus (COVID-19) Infection Study

Dear Victoria

Compliance review of statistics in development from the Winter Coronavirus (COVID-19) Infection Study, England and Scotland

Thank you very much for inviting us to independently review the trustworthiness, quality and value of the statistics in development from the Winter Coronavirus (COVID-19) Infection Study (Winter CIS). The Winter CIS was launched by the Office for National Statistics (ONS) in November 2023, covering households in England and Scotland. On 21 December 2023, using raw survey data from the Winter CIS, the UK Health Security Agency (UKHSA) first published a series of reports titled Estimates of Epidemiological Characteristics.

Information about COVID-19 (the disease caused by the coronavirus SARS-CoV-2) continues to be of high interest for expert users, such as public health and NHS officials, health researchers and the media, who have used your Winter CIS estimates over the winter for a number of reasons:

  • to act as an early warning of potential COVID-19 outbreaks for those planning and delivering NHS and other services preparing for winter stressors.
  • to observe the prevalence of coronavirus variants in the community.
  • to supplement and triangulate existing coronavirus community surveillance data, such as acute respiratory infection incidents.
  • to raise public awareness of COVID-19 infection levels in England and Scotland.

Involving users to assist with continuing development

I welcome the innovation and agility shown by your team to develop the analyses throughout the study period. This has been demonstrated on two fronts: through transparent joint and collaborative working, with continuous learning, allowing your team to iterate your working processes; and by trialling a different methodology while encouraging expert user feedback.

We note from a recent blog post by ONS that statistics to capture information about respiratory viruses will continue to evolve, and we consider that retaining the label ‘official statistics in development’ seems appropriate. To inform future iterations of infection studies more generally, it is good practice to continue to engage with users of the Winter CIS to help develop the methods, data and metadata transparently.

Coherence across the UK

Both ONS and your team have been careful to link to other statistics about the coronavirus and COVID-19. We welcome this approach, which provides coherence with existing coronavirus data from the rest of the UK.

Users of the Winter CIS have identified the need for comparable statistics across the UK, particularly to understand the effects of coronavirus on the lives of individuals. During any future planning for a study of this nature, we consider that time should be taken by statistics producers to fully consider the possibilities for the development of coherent UK-wide statistics.

Ensuring transparency of methods

It is good that you have published a Quality and Methodology Information report which explains the uncertainty, strengths and limitations of the estimates that you have published. To produce these modelled estimates, you used Multilevel Regression with Poststratification (MRP) methodology. The statistics team told us that it has submitted the methodology and data for academic peer review; the peer-reviewed work is expected to be published and so made available for public scrutiny, which is good practice for official statistics in development.

Insight and forward plans

Over the course of the study, as well as descriptive analyses of the latest estimates, you have published an Explainer article and follow-up news releases so that summary information is more accessible to less expert users. Once you have finalised publication dates for future research or analyses using Winter CIS data, such as the modelled Infection Hospitalisation Risk for Scotland, it would be good practice to pre-announce these as ad hoc official statistics, as well as using informal ways to let people know.

I would like to thank your team for its positive engagement with us during this review. Please do not hesitate to get in touch if you would like to discuss any aspects of this letter.

Yours sincerely

Mark Pont

Mark Pont to James Tucker: Winter Coronavirus (COVID-19) Infection Study

Dear James

Compliance review of statistics in development from the Winter Coronavirus (COVID-19) Infection Study, England and Scotland

Thank you very much for inviting us to independently review the trustworthiness, quality and value of the statistics in development from the Winter Coronavirus (COVID-19) Infection Study (Winter CIS). Information about coronavirus continues to have high interest for expert users, such as public health and NHS officials, health researchers and the media.

The Winter CIS was launched by ONS in November 2023, covering households in England and Scotland. Your team published data from the study for transparency every two weeks from 7 December 2023 to 14 March 2024. Raw results from the study were also sent securely to the UK Health Security Agency for onward analysis to allow that organisation to estimate the incidence and prevalence of coronavirus SARS-CoV-2, which can cause COVID-19.

I welcome the innovation and agility shown by your team. This has been demonstrated through transparent joint and collaborative working with the UK Health Security Agency, with continuous learning, allowing your teams to iterate your working processes. We note from your recent blog post that statistics to capture information about respiratory viruses will continue to evolve, and we consider that retaining the label ‘official statistics in development’ seems appropriate.

Explaining study representativeness

As you know, in random sample surveys it is important that the achieved sample adequately represents the proportions of various characteristics (age, ethnicity, locality) in the population being studied. This reduces bias in the survey results and provides estimates closer to the true value. Because the Winter CIS is designed as a longitudinal panel survey, based on a cohort of people who responded to the Coronavirus (COVID-19) Infection Survey (CIS), it is not a random sample survey.

Even with an imbalanced sample, appropriate weighting helps to ensure representative estimates. You have gone some way to providing transparency about the accuracy of the Winter CIS estimates, by publishing the survey results data, the response rates for various survey waves and the population basis that was used to calculate the estimates. You have also published a Quality and Methodology Information report (QMI) which makes clear that the survey might over-represent or under-represent certain groups and sets out information about what the survey data can and cannot be used for. The QMI provides some detail on the data collection procedures, quality assurance processes, response rates and representativeness. Please ensure it is updated to include more detail to fully explain how you have reduced biases in the design and weighting of the study, as outlined in the paragraph below.

To assist further development of these and other statistics based on the Winter CIS sample frame (such as the ad hoc release titled Experiences of GP practice access), it would be helpful for researchers to fully understand the various recruitment processes, the study design, the methodological procedures and the weighting choices that have been made. For both the Winter CIS and other releases that use the Winter CIS as a sample frame, you should publish information about:

  • The distinction between the processes of participant recruitment and engagement, giving consideration to digital inclusion and the impact of incentives, or their removal.
  • The maturation of the dataset over time, to understand the personal characteristics of the respondents over subsequent waves of the study.
  • The statistical inferences made during the course of calculating the estimates, including the weighting decisions.

We are pleased to hear that you have agreed to publish updated metadata by the end of June and to place the final survey results and weighting data into a repository, such as ONS’s Integrated Data Service, to support further academic research.
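
To illustrate the kind of adjustment described in this section, the sketch below shows a minimal post-stratification weighting calculation in Python. The strata, respondent counts and population totals are entirely hypothetical, and this is not a description of the ONS's actual Winter CIS weighting methodology; it simply shows how weighting an imbalanced sample to external population totals changes a prevalence estimate.

```python
# Minimal post-stratification weighting sketch (hypothetical figures only;
# not the ONS's actual Winter CIS weighting method).
import pandas as pd

# Hypothetical respondent-level data: each row is one survey respondent.
respondents = pd.DataFrame({
    "age_group": ["16-49", "16-49", "50-69", "50-69", "70+", "70+", "70+"],
    "tested_positive": [0, 1, 0, 0, 1, 0, 0],
})

# Hypothetical population totals for the same strata (e.g. from mid-year estimates).
population = pd.Series({"16-49": 25_000_000, "50-69": 15_000_000, "70+": 8_000_000})

# Post-stratification weight: population total in the stratum divided by the
# number of respondents achieved in that stratum.
sample_counts = respondents["age_group"].value_counts()
respondents["weight"] = respondents["age_group"].map(population / sample_counts)

# Weighted prevalence: weighted positives divided by the sum of weights.
weighted_prevalence = (
    (respondents["tested_positive"] * respondents["weight"]).sum()
    / respondents["weight"].sum()
)
print(f"Unweighted prevalence: {respondents['tested_positive'].mean():.3f}")
print(f"Weighted prevalence:   {weighted_prevalence:.3f}")
```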

Coherence across the UK

Both UKHSA and your team have been careful to link to other statistics about the coronavirus and COVID-19. We welcome this approach, which provides coherence with existing coronavirus data from the rest of the UK.

Users of the Winter CIS have identified the need for comparable statistics across the UK, particularly to understand the effects of coronavirus on the lives of individuals. During any future planning for a study of this nature, we consider that time should be taken by statistics producers to fully consider the possibilities for the development of coherent UK-wide statistics.

I would like to thank your team for its positive engagement with us during this review. Please do not hesitate to get in touch if you would like to discuss any aspects of this letter.

Yours sincerely

Mark Pont

Mark Pont to Alastair McAlpine: Compliance Check of the Scottish Health Survey Statistics

Dear Alastair

Compliance Check of the Scottish Health Survey Statistics

We recently carried out a compliance check of the Scottish Health Survey (SHeS) statistics against the Code of Practice for Statistics. These statistics were designated as National Statistics in 2010, and we are pleased to confirm that they should continue to be accredited under the status now known as accredited official statistics (AOS).

The SHeS is a longstanding annual survey in Scotland providing data about the health of the population on a diverse range of topics including mental and physical health; diet and food security; and alcohol and smoking. It therefore provides a vital picture of the health of the Scottish population living in private households. It is used by the Scottish Government, Public Health Scotland and many health organisations, including charities, for planning and decision-making. The Scottish Parliament also regularly refers to the survey results when debating health issues. Endorsed by the Chief Medical Officer in Scotland, the SHeS has a Project Board consisting of stakeholders, the survey contractor and users, which helps to increase confidence in the statistics. The Project Board also reviews the survey topic questions used each year to ensure that they remain relevant and are both appropriate and proportionate for the survey interviews, which span multiple age groups.

In carrying out our review, we consulted some users and took their views into account. We have set out our findings along with some recommendations in the annex that we consider would help to further improve the statistics. We will follow these up informally with the team as it works towards the next release.

I would like to thank your team for its positive engagement with us during this review. Please do not hesitate to get in touch if you would like to discuss any aspects of this letter.

Yours sincerely

Mark Pont

Assessment Programme Lead


Annex: OSR review findings and recommendations – Scottish Health Survey (SHeS)

Trustworthiness (T) findings

T1. The Chief Medical Officer for Scotland endorses the Scottish Health Survey (SHeS) in the Foreword. This acknowledgement credits the insights gained through the collaborative efforts of the stakeholders involved in the production of the survey and the time given by the participants to inform future health policies. The team has a Project Board with representation from many key user organisations as well as other stakeholders closely aligned to the survey. The team sets out suggestions for changes and the Board reviews and implements them accordingly. This oversight group helps to increase user confidence in the statistics. Both of these examples demonstrate trustworthiness through transparent and independent decision-making and leadership.

T2. The Project Board decides on the questionnaires’ content, ensuring that the questions are relevant, the questionnaire lengths are appropriate for the varying age groups, and the respondent burden is minimised. The team undertook a questionnaire content review to learn about users’ views on the content of the SHeS. However, although a summary of the review responses has been published, the report summarising the updated questionnaire content is yet to be published, so users remain unaware of any changes planned. We would welcome the timely publication of this report, particularly as users reported that they were uncertain whether their suggestions had been taken forward.

Recommendation for improvement: T2

To make the decisions behind changes to the survey questions more transparent to users, the Scottish Government (SG) should ensure that the content report includes the criteria used to determine whether questions are continued, altered, added or removed each year. Where there are significant changes, these should be clearly highlighted and evidenced, establishing an audit trail to inform current and future users.

Quality (Q) findings

Q1. We found that the Technical Report puts much effort into describing in detail the processes used to recruit survey participants and to overcome the challenges during COVID-19, all of which is helpful information for the reader. The fieldwork methodology report describes in detail how required sample sizes were achieved and how households were selected using the Postcode Address File.
However, information on the processes for sample design and on actual respondent engagement appears to be interspersed across parts of the methodology. As a result, some users might find it challenging to distinguish and fully appreciate the technical aspects of the survey, such as:

  • The recruitment procedures
  • The sample estimation
  • The collection and treatment of data

Recommendation for improvement: Q1

SG should seek feedback from a range of users about its methodology documentation and use this feedback to update the documentation accordingly to ensure it is as effective as possible.

Q2. We found that there are multiple references to the term ‘core questions’, which generally refers to all the questions asked in the relevant year to both adult samples. These references are further complicated by the fact that a small number of questions included in the Scottish Health Survey, the Scottish Household Survey and the Scottish Crime and Justice Survey as part of the Scottish Surveys Core Questions are also referred to as the ‘core questions’. To aid user understanding and avoid confusion between the different sets of questions, SG could clarify which set of ‘core questions’ is being referred to each time the term is used.

Recommendation for improvement: Q2

As part of its wider user engagement, SG should determine whether users understand the references to ‘core questions’ throughout the documentation. If the similarity in terminology is found to confuse readers, SG could consider using alternative language that more clearly differentiates between the sets of questions.

Q3. The new dashboard provides 95% confidence intervals, which we heard was useful to many users. We found varying levels of representation of uncertainty across the survey outputs, but the summary report does not contain any information about uncertainty. Users told us that they would find it helpful to see confidence intervals in the supplementary tables as well.

Recommendation for improvement: Q3

SG should include confidence intervals in the supplementary tables to ensure users can better understand the quality of the estimates. SG could also review the presentation of uncertainty in other outputs and consider whether including further information about uncertainty might be helpful.
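
As a minimal illustration of what such a table entry might involve (with made-up figures; SHeS variance estimation accounts for the survey design and is more involved than this simple approximation), a 95% confidence interval for a tabulated proportion can be derived from the estimate and an effective sample size:

```python
# Illustrative 95% confidence interval for a survey proportion, using a simple
# normal approximation and hypothetical figures (not SG's actual method).
import math

def proportion_ci(p_hat: float, n_eff: float, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation 95% CI for a proportion, given an effective sample size."""
    se = math.sqrt(p_hat * (1 - p_hat) / n_eff)
    return max(0.0, p_hat - z * se), min(1.0, p_hat + z * se)

# Hypothetical example: 21% of an effective sample of 1,800 adults report a condition.
low, high = proportion_ci(0.21, 1_800)
print(f"Estimate: 21% (95% CI: {low:.1%} to {high:.1%})")
```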

Q4. In the main report, we found that in some visual representations, the changes in attained values or scores can be ambiguous. For example, for the average mental wellbeing scores, the mean Warwick-Edinburgh Mental Wellbeing Scale (WEMWBS) scores given on the associated Mental Health and Wellbeing chart are not clearly labelled or titled, and the decline in average scores between 2019 and 2022 could be open to misinterpretation. It is important that users clearly understand how to interpret the score changes from year to year in the time series analysis represented in charts.

Recommendation for improvement: Q4

To provide users with sufficient information to correctly interpret the scores or values given in a chart, SG should ensure that all charts are clearly labelled, with explanations of whether or not yearly fluctuations are statistically significant. This should increase the likelihood that data and charts are interpreted accurately.

Q5. We noted that the team included a gender identity question from 2018 to 2021 as part of the self-completion questions, as this was an SG Scottish Surveys Core Questions (SSCQ) harmonised question. However, this question was removed from 2022 as it was no longer an SSCQ core question and was not being used for SHeS analysis purposes. The survey continues to include a question on sex (with possible responses ‘male’, ‘female’, ‘prefer not to say’) as this is an important characteristic for analysis of the survey results. The team told us that it would consult our recent guidance on Collecting and reporting data about sex and gender identity in official statistics when considering any future data collection changes in this area.

Value (V) findings

V1. In recent years, the SHeS has undergone some methodological changes. During the COVID-19 pandemic in 2020, fieldwork was suspended and a telephone survey was conducted instead, based on a shortened version of the questionnaire and some COVID-specific questions. The 2020 data were presented as experimental statistics, due to differences in the profile of, and bias in, the achieved sample compared with pre-COVID years, and have not been included in the time series analysis since then. The absence of these data is clearly explained so that users are aware of why there is a gap in data tables, charts and on the dashboard.

V2. We were pleased to hear that the value of the SHeS data has been enriched by further data linking. The team told us that work done to link the Postcode Address File (PAF) data to the Community Health Index (CHI) database has increased the likelihood of identifying households with children under 16, making it a more efficient way to source child samples. There is also information explaining how the SHeS and health record data are linked and detailing the variables included. This will improve researchers’ access to the datasets, helping to obtain further valuable insights into the health of Scotland’s population.

V3. The objectives of the survey are clearly outlined in the main report. Users told us that they sometimes use other survey findings to complement the findings of the SHeS; for example, they might use the Alcohol Toolkit Study by University College London when determining trends in alcohol use. Others mentioned that further information clarifying whether or not such other sources and trends can appropriately be used would be helpful.

Recommendation for improvement: V3

To support SHeS users in considering how they might use SHeS statistics alongside other non-official sources, SG should provide further guidance, where practicable, about the extent to which SHeS data could be used alongside other reputable sources.

V4. SG has developed an innovative SHeS dashboard which provides comparative data going back to 2008 and is presented at national, local and health board levels. It is encouraging that Reproducible Analytical Pipeline (RAP) processes are being implemented and that further development work is planned. We heard that users have requested further breakdowns and were not always clear on whether these were being considered. It is encouraging to hear that SG intends to consult users on its plans to provide further breakdowns, such as employment type, urban/rural and health board/local authority data by the Scottish Index of Multiple Deprivation (SIMD).

Recommendation for improvement: V4

As SG develops its dashboard plans, it should communicate these so that users are more aware of the plans to progress their requests.

V5. Having comparable health statistics enables users to more easily observe trends across the UK countries. We found differences in the comparability and coherence of the survey across topic areas that could hinder users in drawing such inferences.

Recommendation for improvement: V5

To provide further explanation on when statistics on a specific topic can be compared across UK countries, SG should include a more in-depth section on comparability. This will help to give users greater insight on the extent to which they can compare SHeS topic areas with other similar survey outputs in countries across the UK.

Ed Humpherson to Jonathan Waller: Assessment of the Higher Education Graduate Outcomes Data and Statistics

Dear Jonathan

Assessment of the Higher Education Graduate Outcomes Data and Statistics

We have completed our assessment of the Higher Education Graduate Outcomes Data and Statistics produced by Jisc, under the Higher Education Statistics Agency (HESA) brand. I am grateful for the positive engagement from your team throughout the assessment process.

Since their introduction in June 2020 for the 2017/18 academic year, these statistics have continued to be an invaluable source of information on the outcomes and destinations of graduates from Higher Education (HE) in the UK. These statistics are a vital source of information on graduate outcomes in the UK and are used by a wide range of stakeholders including higher education providers and regulatory bodies across the UK.

Jisc has shown a commitment to trustworthiness through its understanding of its responsibilities in important areas such as data security, professional capability, and intelligent transparency. By making the most of its positive relationships with key stakeholders, Jisc has ensured that the needs of users are embedded in the production and continued development of these statistics.

The quality of these statistics is very high. The survey is well designed to meet users’ needs and achieves good response rates at a national level. We commend the transparency around the quality of these statistics and that a large amount of information is provided in the accompanying user guide.

Jisc’s engagement with its statutory customers and other key users through the provider forum, steering group and other means, demonstrates a commitment to strong user engagement practices. We welcome Jisc’s plans for process modernisation including the implementation of Reproducible Analytical Pipelines (RAP) as well as the Graduate Outcomes Data being made available for future research, including data linkage, on the Office for National Statistics’ Secure Research Service.

Our assessment found that these statistics adhere closely to the Code of Practice. Through the range of actions that Jisc has taken to develop these statistics, and in its response to our assessment, Jisc has shown a high level of dedication to the principles of trustworthiness, quality and value.

We judge that the Graduate Outcomes Data and Statistics can be confirmed as accredited official statistics (called National Statistics in the Statistics and Registration Service Act 2007). We would like to acknowledge that it is rare for OSR to accredit official statistics without first providing producers with requirements; this reflects our high regard for these statistics, as well as Jisc’s approach to continually maintaining the high standards of the Code of Practice for Statistics.

Yours sincerely

Ed Humpherson

 

Related Links:

Assessment of compliance with the Code of Practice for Statistics: Higher Education Graduate Outcomes Data and Statistics

Jonathan Waller to Ed Humpherson: Higher Education Graduate Outcomes Statistics Assessment Request – October 2023

Ed Humpherson to Jonathan Waller: Higher Education Graduate Outcomes Statistics Assessment Request – October 2023

Ed Humpherson to Chris Roebuck: Request for assessment of NHS England Cancer Waiting Times Statistics

Dear Chris

Request for assessment of NHS England Cancer Waiting Times statistics

Thank you for your letter of 1 March requesting an assessment of NHS England’s Cancer Waiting Times statistics.

We note the changes that you have made to these statistics in response to programme developments and user feedback. Given the high-profile nature of the statistics, I welcome your aspiration to publicly demonstrate your compliance with the Code of Practice for Statistics.

I agree with your suggestion that the assessment should commence no earlier than Autumn 2024 to allow time for the statistics to bed in. With this in mind, I am pleased to confirm that my team will be in touch with you shortly to agree the timing of this assessment.

Yours sincerely

Ed Humpherson

Mark Svenson and Chris Roebuck to Ed Humpherson: Request for assessment of NHS England Cancer Waiting Times Statistics

Dear Ed

Request for assessment of Cancer Waiting Times statistics

Following the introduction of the new NHS cancer waiting times standards from 1 October 2023, we have made a number of changes to our cancer waiting times statistics. The changes to the cancer waiting times publication are set out on the cancer waiting times statistics publication page and were prominently mentioned in the first statistical release following the announcement. We have continued to give details of the changes in the statistical releases that followed the December publication.

Our development of the statistics was informed by programme developments and feedback received from the wider user community. The recent consultation showed widespread support for the new completed standard measures. Feedback was received requesting further disaggregation by tumour type, and this is part of our changes to the statistics. No feedback was received requesting publication of information on the stock of patients waiting more than 104 days for treatment; therefore, balancing resource with user needs, we do not plan to publish this information. We publish the distribution of waiting times for the 62-day combined completed standard, and this includes those who have waited more than 104 days for treatment.

In accordance with Section 12(1) of the Statistics and Registration Service Act 2007, we would welcome a full assessment of these statistics given their continued high profile. We feel that the timing of this would be best once the statistics have bedded in. We anticipate this would be Autumn 2024 at the earliest.

We are copying this letter to our statistical governance team who are ready to pick up the next steps with the OSR team.

Yours sincerely,

Mark Svenson and Chris Roebuck

Mark Pont to Jonathan Russell: Compliance Check of Valuation Office Agency (VOA) Council Tax statistics

Dear Jonathan

Compliance Check of Valuation Office Agency (VOA) Council Tax statistics

We have recently completed a compliance check of your council tax statistics publications – Stock of Properties, 2023 and Challenges and Changes in England and Wales, March 2023 – against the Code of Practice for Statistics. I hope that our reflections are helpful as you continue to develop these important official statistics. Some of the findings may also be more widely applicable to your non-domestic rating statistics, and we encourage your team to consider that broader applicability. If you would find it helpful for us to specifically review those, please do get in touch.

Your council tax publications are valuable to users who are trying to understand how council tax affects them. Breakdowns by house characteristics and number of bedrooms in the Stock of Properties release help users compare their banding with others. The Challenges and Changes publication indicates to users how likely a banding review is to increase, decrease or not change a property’s banding.

Our review found a range of positive features that demonstrate the trustworthiness, quality, and value of the statistics:

  • It is commendable that VOA has made its data available to other government departments. The use of council tax property attribute data by the Office for National Statistics (ONS) to supplement census data, and the use of the valuation list by the Department for Levelling Up, Housing and Communities (DLUHC) for its council taxbase statistics, highlight the impact of VOA’s positive approach to data sharing and the additional statistical insights and increased efficiencies that this can lead to. For example, sharing VOA property attribute data with ONS has meant that a previous question on the number of rooms in a dwelling could be removed from the 2021 Census. VOA property attribute data is also a key source used to calculate the UK House Price Index.
  • The statistics provide granular and clear insights. They include helpful contextual information which allows users to correctly interpret the statistics and their structure is easy to follow. There is a good level of commentary and background notes to help users understand how council tax works, which will aid users in using the statistics appropriately. For example, the helpful flowchart showing the stages of the banding appeals process in the Challenges and Changes background notes outlines each stage from proposal to final decision.
  • The explanations of the methodology for the statistics should be commended, for example it is explained that the statistics are produced using ONS’s statistical geographies and National Statistics Postcode Look-up, which will help users understand the comparability of the statistics.
  • You have demonstrated a commitment to trustworthiness and transparency by making users aware of data limitations. This includes noting, in the background notes, a key limitation: that the attributes of properties are updated only when they are sold. The inclusion of a revisions policy and quality policy, and details around VOA’s statutory duty, in the background information also shows a commitment to transparency. Your team told us about your plans to publish more information on the extent of revisions made in the statistics. This should reduce the likelihood of misinterpretation and misuse, especially ahead of legislative changes such as the planned revaluation of council tax banding in Wales, which you told us may lead to improved data quality and potential revisions to the statistics.
  • The implementation of Reproducible Analytical Pipeline (RAP) principles into the production of the statistics to improve the efficiency and reliability of the process is commendable and demonstrates a commitment to continuous improvement.
  • Your plans to further develop the statistical outputs, for example by introducing a flows measure between domestic and non-domestic tax, will give users a better understanding of the stock of taxable buildings. The potential introduction of dashboards and maps is also exciting and will be valuable to users, offering new ways to interact with your statistics.

To further enhance the trustworthiness, quality and value of these statistics, we have identified ways the statistics and their presentation could be improved:

  • Within the background documentation for each release you welcome feedback from users via email. However, there is little published information about your overall approach to user engagement. Having a clear user engagement strategy, especially in the context of the planned future development of these statistics, will allow users to benefit from knowing what changes are coming and facilitate an ongoing dialogue with them.
  • The background documents for each publication are clear and well-structured but some of the information is quite general. Including specific details about approaches to data collection and assuring data quality for each set of statistics would reassure users. Publishing information, for example, around potential uncertainty in the local authority and house attribute data captured as part of a property valuation, will facilitate the appropriate interpretation and use of the data by users. This is important given the wider use of this data by other government departments. More published information, in the form of a data journey for example, could give an indication as to where uncertainty in the data may arise.
  • It was good to hear that you have previously used Quality Assurance of Administrative Data (QAAD) principles when exploring new data sources; however, we would encourage you to communicate this work to users as part of your documentation about quality. This will help users understand the quality of the source administrative data and how it can be used appropriately. A thorough appraisal of data quality will be especially important in the context of changes to data sources such as the planned changes to VOA administrative systems and the revaluation of Welsh council tax bandings. Given the extensive use of the VOA council tax and property attribute data across government, we encourage you to work with these stakeholders while you undertake future reviews of administrative data quality. This will allow you to understand data quality in relation to different uses, share resource and tailor your efforts accordingly.
  • Due to the difference in council tax policy between England, Wales, and Scotland it is important that the coherence and comparability of council tax statistics is made clear to avoid misleading comparisons. Stating which statistics can be compared and including links to relevant statistics is a good step to reducing the likelihood of misuse. The appropriate comparison of statistics is especially important with an approaching general election.
  • We welcome your commitment to reduce the number of individuals on your pre-release access (PRA) lists for both publications. This shows your commitment to trustworthiness and will reduce the likelihood that the statistics are shared more widely than is essential ahead of publication. However, your PRA lists for each publication are long, and to help ensure proportionate access we recommend that the PRA lists are reviewed each time the statistics are published.
  • The cover sheets and notes provided in the Microsoft Excel data spreadsheets provide a good amount of information that will help users use the statistics appropriately. We recommend that similar notes are included with the CSV files as just providing raw data could lead to misinterpretation and/or misrepresentation.

I would like to thank your team for their positive engagement during this review. Please get in touch if you would like to discuss any aspects of this letter or if we can be of further assistance.

I am copying this letter to Tetyana Mykhaylyk, Director of Information, Data and Analysis, VOA; Colin Yeend, Head of Research & Analysis, Information, Data and Analysis Directorate, VOA; Sarah Windass and Anna McReady, the responsible analysts; and Sean Whellams, Head of Profession for Statistics, HM Revenue and Customs.

Yours sincerely

Mark Pont

Ed Humpherson to Ian Lonsdale: Assessment of Statistics from the People and Nature Survey for England

Dear Ian

Assessment of statistics from the People and Nature Survey for England

Following our assessment of statistics from the People and Nature Survey for England produced by Natural England, we set out four requirements that we judged you needed to meet in order for these statistics to become accredited official statistics. Accredited official statistics are called National Statistics in the Statistics and Registration Service Act 2007.

We have independently reviewed the actions that your team has taken to address these requirements. On behalf of the Board of the UK Statistics Authority, I am pleased to confirm that they comply with the standards of trustworthiness, quality and value in the Code of Practice for Statistics and should be labelled accredited official statistics.

We have been impressed with your team’s engagement and willingness to address these requirements.

I am copying this letter to Lindsey Clothier, Deputy Head of Profession for Statistics at DEFRA, and to the team at Natural England – Tom Marshall, Caitlin Clark, and Sergio Milheiras.

Yours sincerely

Ed Humpherson
Director General for Regulation

 

Related Links:

Annex: Review of actions taken in response to Assessment Report 374: Statistics from the People and Nature Survey for England (PaNS), produced by Natural England

Ed Humpherson to Ian Lonsdale: Assessment of statistics from the People and Nature Survey for England

Assessment of compliance with the Code of Practice for Statistics: Statistics from the People and Nature Survey for England

Mark Pont to Sian Rasdale: Official Development Assistance Statistics

Dear Sian

Official Development Assistance Statistics

We recently completed our compliance check of FCDO’s Official Development Assistance (ODA) Statistics against the Code of Practice for Statistics. These are important statistics that measure aid flows from UK official agencies or their executive agencies (including but not limited to FCDO) to developing countries and multilateral organisations, in line with the international definition of ODA. They include total UK spend on ODA, and ODA as a proportion of Gross National Income (GNI) in the form of the ODA:GNI ratio. These statistics have been of significant public interest, not least since 2020, when the UK government announced a temporary reduction in its target spend for ODA from 0.7 per cent of GNI to 0.5 per cent of GNI, effective from 2021. More recently, the statistics were referred to in the media in reporting of FCDO’s ODA programme data for its international programme over the next two years.

Our review found that the statistics demonstrate widespread compliance with the Code of Practice and should continue to be designated as National Statistics. It is clear both from the published outputs and from talking to your team how much hard work has gone into the development of these statistics and how much ambition there is to further improve them.

We found a range of positive features, as well as some areas for improvement that we consider would enhance the trustworthiness, quality and value of the statistics:

  • We commend your team for its responsiveness to user feedback by developing a dashboard, which aims to provide users with much greater insight. The dashboard, when finalised, will enable users to interact with the data and will bring greater transparency to UK aid data.
  • It is also good that your team considered the public value that the Gross Public Expenditure statistics provided and, in response to user feedback which found that these statistics no longer remained relevant or useful, discontinued their production.
  • The merger of the former Foreign and Commonwealth Office and the Department for International Development resulted in the need to bring together two separate reporting and accounting systems and different methods used to estimate ODA-eligible administrative costs. After reviewing these methods against the OECD Development Assistance Committee (DAC) directives, FCDO has introduced coherence in the reporting and accounting systems and an interim method to calculate the ODA-eligible proportion of FCDO administration costs. We commend FCDO on the transparency of its interim methodology.
  • In 2018 the OECD DAC introduced changes to the methodology used to calculate ODA, which affected the treatment of loans. To help public understanding and address any misconceptions about what ODA includes, FCDO published clear technical notes to explain how ODA is calculated. However, to further improve transparency, FCDO might publish additional information explaining to users where the OECD guidance can be applied directly and where it needs to be interpreted in the UK’s local context.
  • Using a new centralised financial information database, FCDO has recently updated information on data sources, showing the wide range of other government department administrative sources used to compile the ODA statistics. The new database will also bring further opportunities to improve data collection processes. To reassure users about the quality of these sources and the improved data collection processes, FCDO should consider updating its Quality Assurance of Administrative Data (QAAD) assessment, using OSR’s QAAD toolkit to support this process.

Looking beyond Statistics on International Development:

  • In October 2022 the Independent Commission for Aid Impact (ICAI) raised four recommendations in its Transparency in UK Aid review, which included:
    1. Setting and applying standards for transparency to all aid portfolios (including arm’s-length bodies)
    2. Committing to achieving a standard of ‘very good’ in the Aid Transparency Index by 2024
    3. Resuming the publication of forward aid spending plans, cross-departmental development results and country aid priorities
    4. Working with other donors to support greater use of International Aid Transparency Initiative (IATI) data.
  • In December 2022 FCDO published a response to the review, where it has:
    1. Rolled out a single finance and HR IT system throughout the department, which has allowed all ODA programmes in the FCDO to be brought onto a single platform. This has allowed for a more unified transparency process, with systems to assess, approve and collate ODA programme data.
    2. Committed to achieving a standard of ‘very good’ in the Aid Transparency Index by 2024 and to publishing forward-looking ODA allocations, and is carrying out improvements to the publication of country development strategies.
    3. Committed to publishing annual reports and accounts information that support accountability and describe programme allocations clearly.
    4. Committed to supporting greater use of IATI data across all recipient countries of ODA.
  • To further improve transparency, it would be helpful if FCDO published its plans on how it will address the publication of arm’s-length bodies information, along with a record of any significant decisions to continue, discontinue, adapt, or provide statistics through other means. These decisions should be supported and driven by user feedback.

We would like to thank you and your team for your positive engagement during this review. In order to continue complying with the Code, we ask that you report back to us with your plans for addressing our recommendations above by December 2023. Our Economy, Business and Trade domain will continue to engage with you and your team on progress in the coming months. Please do not hesitate to get in touch if you would like to discuss any aspects of this letter further.

I am copying this letter to Jane Casey, Head of ODA delivery, eligibility and reporting.

Yours sincerely

 

Mark Pont

Ed Humpherson to Scott Heald: Assessment of Accident and Emergency (A&E) Activity Statistics in Scotland

Dear Scott

Assessment of Accident and Emergency (A&E) Activity Statistics in Scotland

We have completed our assessment of the weekly A&E activity statistics produced by Public Health Scotland (PHS). We judge that these statistics can be designated as National Statistics once we have confirmed that the requirements set out in our report have been met. We have also carried out a compliance check of the monthly waiting times statistics and I am pleased to confirm that these statistics should continue to be designated as National Statistics. The weekly and monthly statistics are now available together on a new platform reporting statistics on A&E departments across Scotland. This platform is a positive and innovative development and is welcomed by many users for its improved accessibility and functionality.

Waiting times continue to be of high public interest across the UK and I am grateful for the positive engagement from you and colleagues in the A&E statistics team throughout the assessment process. It was particularly encouraging how the team adopted a proactive approach and implemented many of our emerging recommendations during the development of the new platform. Of key importance is your ongoing work to ensure that users are aware of the differences in hospital site inclusions between the weekly and monthly statistics to support appropriate use of the statistics – for example, only the monthly statistics include all units and departments relevant to the Scottish Government target that 95% of people attending A&E should be seen, admitted, discharged or transferred within four hours of arrival. You publish some of the disaggregated datasets only monthly. This should be kept under review in case there is further user demand for that level of granularity on a more frequent basis.

We identified three requirements that would strengthen both sets of statistics: extending user engagement to a wider range of users, assisting users to make appropriate cross-UK comparisons, and considering ways to communicate uncertainty to further aid user interpretation of the statistics. Implementing these requirements will ensure that the statistics continue to meet the highest standards of the Code of Practice for Statistics. As discussed, we expect you to report back to us by September 2023 on how you have met these requirements.

I am copying this letter to Emma McNair, Information Consultant at Public Health Scotland.

Yours sincerely

Ed Humpherson
Director General for Regulation

 

Related Links:

Assessment of Accident and Emergency (A&E) Activity Statistics in Scotland

Scott Heald to Ed Humpherson: Assessment of Accident and Emergency (A&E) Activity Statistics in Scotland