Alastair McAlpine to Ed Humpherson: Developing a modern statistical system – A review of Scotland’s Census 2022 – progress in meeting recommendations

Dear Ed,

In February, I wrote to you to let you know that our internal review of the Census had been published. This included four recommendations focussed on learning for the future delivery of strategically important statistical exercises across the Scottish Statistical System.

I have appreciated your support for this work and the recognition of my commitment to ensuring that leadership and statistical decisions in Scotland are made with appropriate oversight and seniority.

I am therefore writing to set out more detail on how I intend to meet the recommendations and the progress that has been made to date. Focus over the first three months has been on recommendation one, with actions also initiated on recommendations two and four, as set out below. Work on recommendation three will commence in the latter half of 2025, building on the actions already delivered.

Recommendation 1: The Chief Statistician should consider activities including a task and finish group to define and identify the ‘Cross-cutting Statistical Components’ of the Scottish Statistical system and other projects.
Progress to date: A small task and finish group was set up in March 2025 with representation from across Scottish Government analytical areas. The group produced proposed criteria and an initial list of cross-cutting statistical components, which it presented to me in April.
Future plans: The criteria and list of components will be tested with senior statisticians and the Analytical Leadership Group, and the Scotstat Board will be kept informed. Once finalised, the list of components will be published on our website and reviewed annually. We will share the list with OSR in a future update.

Recommendation 2: The Chief Statistician should consider opportunities to enable planning and training for the wider statistical profession to be linked to the broader skills needed for the delivery of cross-cutting statistical components by different organisations across the Scottish statistical system.
Progress to date: A successful programme of leadership training – ‘Fit for the Future’ – is already being rolled out to B3 and C1 statisticians. I have tasked my team to explore options for similar development opportunities at C2 level and to determine the skills needed at this grade. We have opened discussions with senior statisticians.
Future plans: Discussions with senior statisticians will inform our leadership training offer. Further information will be provided in my next update.

Recommendation 3: The Chief Statistician should consider what mechanisms are required to provide oversight and assurance of cross-cutting statistical components in a holistic way and escalation of key decisions that could impact an integrated statistical system.
Progress to date: Prior to this review, the Office of the Chief Statistician (OCS) had already been restructured to centralise professional functions and support. This work will improve the assurance function that OCS provides to the Chief Statistician. Draft guidance on assurance of topical and cross-cutting statistical matters has been developed.
Future plans: Now that a list of critical statistical components has been developed and statistical assurance guidance drafted, we will review these with senior statisticians and the Analytical Leadership Group, keeping the Scotstat Board informed. This will be a focus of our response to the review over the remainder of the current financial year.

I expect these planned actions to support the modernisation of our statistics, build trust and improve transparency in our statistical decisions and methods.

I am copying this letter to the Minister for Parliamentary Business, who has responsibility for statistics within his portfolio; Alison Byrne, Chief Executive of National Records of Scotland; Scott Heald of Public Health Scotland; and the co-chairs of the Scotstat Board.

Yours sincerely,

Alastair McAlpine
Chief Statistician

Scott Heald to Ed Humpherson: Temporary suspension of accredited official statistics status of cervical cancer screening statistics

Dear Ed,

My team contacted the Office for Statistics Regulation in March this year when we became aware of an error affecting the data used in the Scottish Cervical Screening Programme statistics for the years 2016/17 to 2021/22. These statistics report on the number of eligible women (and anyone with a cervix) who have a screening test and provide intelligence about the delivery of this vital public health intervention. The impact of the error on the previously published statistics is relatively small for national-level estimates and does not alter the overall conclusions drawn. The impact is somewhat greater for some sub-groups. It is important to note that none of these issues affected the running of the programme and all those eligible were invited for a cervical screen appropriately.

We have been sharing updates with you as our investigations of the error have progressed. Notices were also added to the previous publications and open data portal to alert users.

We discovered the error as part of our work to develop new statistics about the screening programme to reflect changes to the way it is now implemented and the availability of new data to measure its delivery. Following the discussion with your team on 8th May about publication dates, I am writing to confirm our plans and to request a temporary suspension of this series’ accredited status. The next publication will instead be labelled as official statistics in development due to the methodological changes we have made to these statistics (e.g. using a new data source), the introduction of new KPI measures, and the errors with the previous data. I would welcome a review by your team following the statistics’ publication to determine whether the accredited status can be reinstated, given the work we have done to address the previous issues and better meet user needs.

We are planning to release the new statistics, and the revised estimates for the 2016-17 to 2021-22 period, on 29th July 2025, with a preannouncement on 28th May. An explanation of the revisions made and the issues we identified through our work to develop the new statistics will be outlined in an FAQ document to be published alongside the main statistics release. For additional transparency, the code used to identify the eligible population and report on the programme KPIs will also be published. An extract of the FAQ document is provided as an annex to this letter for your information.

I would like to thank your team for their advice and support during our investigation of this issue. I am copying this letter to Alastair McAlpine, Scottish Government Chief Statistician.

Yours sincerely,

Scott Heald

Director, Data and Digital Innovation

Head of Profession for Statistics

 


Annex: Cervical Screening Data FAQs (extract)

What happened?

· The previous method for producing these statistics relied on aggregate data extracted from systems by an external IT provider, who applied all selection criteria and calculations. The new statistics use individual-level data, which enabled PHS to identify discrepancies in the previously supplied aggregate data. For example, the time periods for each year did not match the dates in the specification provided, and there were errors in the criteria used to select the eligible population to be included in the statistics. A simplified illustration of this kind of reconciliation check is sketched after this list.

· In previous publications, the terms “coverage” and “uptake” were used interchangeably, although it has always been coverage which has been measured. Both will be presented in the new publication, as the new KPIs for the programme require reporting on both coverage and uptake.

· None of these issues affected the running of the programme and all those eligible were invited for a cervical screen appropriately.
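As a purely illustrative sketch of the reconciliation that individual-level data makes possible (this is not the PHS methodology, and the file names, column names and eligibility flag below are hypothetical; the code PHS will publish is written in R), recomputed counts can be compared with previously supplied aggregates to surface discrepancies:

```python
# Illustrative sketch only: not the PHS methodology or the R code due to be published.
# File names, column names and the eligibility flag are hypothetical.
import pandas as pd

# One row per person per screening year, derived from individual-level extracts
records = pd.read_csv("individual_level_records.csv")
# Aggregate counts previously supplied by the external IT provider
supplied = pd.read_csv("supplied_aggregate_counts.csv")  # columns: screening_year, supplied_count

# Recompute the eligible population per year directly from the individual-level data
recomputed = (
    records.loc[records["eligible_flag"] == 1]
    .groupby("screening_year")
    .size()
    .rename("recomputed_count")
    .reset_index()
)

# Compare with the previously supplied aggregates and surface any discrepancies
comparison = supplied.merge(recomputed, on="screening_year", how="outer")
comparison["difference"] = comparison["recomputed_count"] - comparison["supplied_count"]
print(comparison.loc[comparison["difference"] != 0])
```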

What is different now?

· PHS now receives patient-level data extracts, and analysts apply agreed methodology to create the cervical screening statistics, including identification of the eligible population for coverage statistics.

· PHS has been working with a group of experts, who work across the whole cervical screening and treatment pathway, to develop and sign off the methodology for the new statistics. The R code used to identify the eligible population and report on the programme KPIs will be published alongside the statistics.

Siobhan Tuohy-Smith to Elaine Drennan: Compliance review of disability payment statistics produced by Social Security Scotland

Dear Elaine  

We recently completed a compliance check of Social Security Scotland’s Adult Disability Payment and Child Disability Payment statistics, which are both published as official statistics in development. To support their further development, we have carried out a high-level investigation into whether the statistics are being improved in line with the Code of Practice for Statistics. In this letter, we outline our findings and improvements to consider as your team continues its work on these statistics. 

These statistics show information on applications and payments for child and adult disability payments. The main messages from the statistics are clearly explained, with appropriate commentary. Suitable charts and tables are used to illustrate the statistics. We judge that the statistics are presented impartially and have been released in an orderly manner, in line with the Scottish Government’s statistics publication timetable. To inform users of any changes to past data, the outputs include a description of the revisions made. 

Suitable data sources are used to compile the statistics, and the team is planning to further improve how it explains the limitations of the data. The statistics are sourced from the person-level dataset held in Social Security Scotland’s own internal case management system. The data are extracted and securely transferred to the statistics team. There is some information published on the limitations arising from the methods used to compile the statistics, including bias and uncertainty. The statistics team told OSR that it is planning to publish more information about uncertainty in the data, potential biases and the resulting impact on the statistics, which is good practice. 

Currently, part of the data processing is manual, and the statistics team told OSR that it is in the middle of transferring to an automated statistics production pipeline. The plans to automate data processing are in line with good practice and should improve the quality of the statistics. Once the transition to automation is complete, Social Security Scotland should publish detailed information about the new quality assurance steps and stages. It should also consider publishing the software code used to produce the statistics.

Engaging with users to understand their needs will enable Social Security Scotland to consider and prioritise developments in line with user need. The statistics team indicated that it is engaging with a range of users in central government, local government and the NHS. We also welcome the fact that users approach the statistics team in an ad hoc way and that the team asks for feedback to gather insight on how to develop the statistics. 

To further enhance the Trustworthiness, Quality and Value of these statistics, we have identified the following improvements for your team to consider: 

  • To provide users with a wider view of the disability benefits landscape, Social Security Scotland should consider providing information about other UK disability payments statistics in the release and commenting on how the statistics should or should not be compared. 
  • Publishing more detail about the quality assurance processes would provide greater assurance to users. For example, Social Security Scotland should include a process map, identify potential sources of bias and error, and describe the actions taken to minimise risks to data quality. 
  • It would help potential new users if Social Security Scotland published more information about its statistics user engagement plans, so that users know that it is willing to hear their views.
  • To support the development of these statistics, and in line with our guidance about official statistics in development, Social Security Scotland should consider publishing more detail about its plans to develop these statistics, including timelines and how users will be involved in the process.

I would like to thank your team for its positive engagement with us during this review. We will continue to engage with you as you develop these statistics. Once you consider that these statistics, or other statistics in your portfolio, are ready for a full assessment against the Code, please do not hesitate to get in touch. 

I am copying this letter to Alastair McAlpine, Chief Statistician of the Scottish Government. 

 

Yours sincerely 

Siobhan Tuohy-Smith 

Assessment Programme Lead

Related Links:

Letter: Ed Humpherson to David Wallace: Transparency and user engagement in the production of official statistics

 

Siobhan Tuohy-Smith to Stephanie Howarth: Statistics on Council Tax in Wales

Dear Stephanie 

We recently completed a compliance check of council tax statistics produced by the Welsh Government against the Code of Practice for Statistics. We completed this review as part of a series of compliance checks on council tax statistics produced in Great Britain (GB). The scope of this review included the annual dwellings, levels and collection rates outputs. These statistics are valuable to users who are trying to understand how council tax affects them. Furthermore, these statistics help the public hold the Welsh Government to account by keeping a public record of the current condition of the Welsh council tax system. The statistics also have considerable potential value in helping to inform public debate around planned changes to Welsh council tax policy, property revaluation and bandings. 

Our review found a mixture of positive practices and areas where the statistics could be improved. From our engagement with your statisticians, we are assured of the quality of the statistics. They are largely based on financial information from audited local government accounts, and there is a comprehensive approach to quality-assuring the incoming data. Your team has also committed to improving the presentation and value of the statistics as part of its plans to improve their accessibility by publishing each bulletin in HTML format in the first half of 2025. We have therefore concluded that the statistics should retain their accredited official statistics status.

We found helpful information about the 100% response rate from local authorities in each publication, and the comprehensive list of quality assurance checks performed on the incoming data helps reassure users about your approach to quality management and helps them understand the quality of the statistics. It is also good that any quality issues identified during data validation are communicated to the data provider so that they can be rectified. Furthermore, using automatic data validation and feedback mechanisms helps to ensure data quality and improves the efficiency of the production process. We also commend the use of RAP principles and encourage the team to be transparent about this to demonstrate the quality of the statistics.

We welcome your team’s commitment to improving the presentation and value of the statistics. The statistics are published as PDF summaries and contain a variety of breakdowns and supporting commentary. The statistics are also available via StatsWales, which allows users to view a variety of customisable tables. The PDF summaries provide useful background information about how Welsh council tax works, and charts are used effectively to demonstrate trends. We discussed with the team where including additional charts and commentary on key trends and the policy context in Wales could further highlight the relevance and value of these statistics for a broader range of users – for example, on the Welsh Government’s previous and planned changes to council tax policy. Including links from the summaries to other relevant content, such as the summary quality report, published pre-release access lists and data on StatsWales, would also make it easier for users to access this information from the statistics and further support their appropriate interpretation. We look forward to seeing these improvements for each publication, as part of your team’s plans to publish the statistics in HTML format, starting with the dwellings statistics in January 2025.

The following paragraphs highlight other improvements that we consider would further enhance the trustworthiness, quality and value of the statistics. 

Helpful comparisons of council tax statistics between local authorities are included, but there should be more information on the extent of comparability of council tax statistics between GB nations. A comparison with council tax in England is made in the levels publication, though this should be reviewed before the next publication because of divergences between Welsh and English council tax policy, including the property revaluation in Wales in 2003, which was not replicated in England. This comparison will also become less viable should the Welsh council tax system change again in the next few years. We highlighted to the statistics team some recently published guidance by Scottish Government statisticians on the comparability of council tax statistics between GB nations. We encourage your team to engage with its counterparts in England and Scotland to achieve consistent guidance that supports users of council tax statistics across GB in making appropriate comparisons.

While we understand there is no sampling error due to the 100% return from councils, uncertainty can arise in other ways. We recommend that the team investigate where potential sources of error and uncertainty in the administrative source data may exist and use this information to improve the communication of uncertainty in the statistics. The team may find our Quality Assurance of Administrative Data (QAAD) toolkit and Approaches to presenting uncertainty in the statistical system insight report useful to identify where potential uncertainty or bias may need to be communicated. In addition, the summary quality report covers multiple publications and appears outdated. In its current format, the information is quite general, which makes it hard for users to find more-specific quality information relating to individual publications. It should be reviewed to ensure users have current information on quality, to support an appropriate interpretation of the statistics. 

Finally, while the statistics include an email address for users to provide feedback, the team could more proactively seek to increase its understanding of users’ needs, so that the statistics remain relevant and useful. This will be especially important given planned changes to Welsh council tax policy and related developments to the statistics. We recommend that the team explicitly set out its approach to user engagement and proactively seek feedback on the changes made to the statistics, following their publication in HTML. 

I would like to thank your team for their positive engagement during this review. Our Housing, Planning and Local Services team has provided some detailed points of feedback to your statisticians on the presentation of statistics and will keep in touch with them as they take these recommendations forward. Please get in touch if you would like to discuss any aspects of this letter or if we can be of further assistance. I am copying this letter to Anthony Newby, the responsible statistician. 

 

Yours sincerely 

Siobhan Tuohy-Smith 

Assessment Programme Lead 

Mark Pont to Sandra Tudor: Compliance Check of Social Housing Lettings in England

Dear Sandra

Statistics on social housing lettings in England

We have recently completed a compliance check of your statistics on social housing lettings in England against the Code of Practice for Statistics. We found that these statistics are a good demonstration of how to produce and publish official statistics, and they should keep their accredited official statistics status. Our review found a range of positive features that demonstrate the trustworthiness, quality and value of the statistics.

These statistics transparently show the availability of social housing lettings in England and the characteristics of those renting these dwellings. The statistics are presented in two separate bulletins on tenants and tenancies that include a variety of breakdowns and are presented in an accessible and informative way. We found a good balance of statistical content and commentary that will help users to interpret the statistics appropriately and understand what they show. For example, comparisons are made between social rent rates and private rent rates in different geographical locations.

The technical notes provided with the publication contain much information that helps users understand the strengths and limitations, which in turn supports the appropriate use of the statistics. For example, the information on the use of weighting and imputation for certain measures is accessible to both expert and non-expert users. This methodological information indicates the level of uncertainty in the statistics, enabling their appropriate use.

As well as the qualitative indications of uncertainty provided, the quantitative information on uncertainty given in the quality data tables is a fantastic resource for expert users who wish to use the statistics for analysis. Confidence interval tables are presented for specific figures which are imputed, allowing users to understand the extent of missing data in CORE and how much has been imputed.

Between the 2022 and 2023 publications, you undertook a large amount of work to develop both the presentation of the statistics and the CORE system. The improvement of presentation and accessibility is exemplary and has enhanced the usability of the statistics. Separating the statistics into bulletins on tenants and tenancies allows more space for interesting analysis on each topic area in an accessible way. Furthermore, the statistics are easier to navigate, and a clear distinction is made between the statistics on tenancies and tenants, each of which has distinct uses.

We were also told by the statistical team that there have been improvements to efficiency and quality assurance because of these recent developments. Question routing and inference are now possible, which has reduced respondent burden for existing tenants, as questions can be skipped in appropriate circumstances and information from previously completed fields can be carried forward. The improvements to the CORE data collection have made it cheaper to run and less dependent on experts, showing your commitment to ensuring value for money and efficiency. The team also told us that it plans to carry out quality assurance of LA data submissions mid-year, allowing quality issues to be caught early.

Your plans to set up an ad hoc requests database that will bring together data from different types of information requests, including direct ad hoc analysis requests and those made through FOIs, will provide a better overall user experience. Constructing this as a database will reduce the number of repeat requests for information; improve efficiency by limiting the need to recompile information; and ensure that previous information is not lost.

To further enhance the trustworthiness, quality and value of these statistics, we have identified ways the statistics and their presentation could be improved:

  • Though the communication of uncertainty in the technical notes is excellent, the communication of uncertainty within the tenants and tenancies bulletins could be improved. It could be clearer that the quoted figures are estimates with varying degrees of uncertainty. Communicating the level of uncertainty in the statistical bulletins will help to ensure an accurate interpretation of the statistics.
  • It would help potential new users if you more explicitly communicated your approach to user engagement, so people know how to get in touch and that you are open to hearing their views. Proactively engaging with external users will allow the statistical team to gather feedback from a broader range of sources and facilitate an ongoing dialogue around the further development of the statistics.
  • Providing more links between the different social lettings outputs, for example, to the data tables and technical notes in the bulletins, would enhance the accessibility of the full range of outputs for users.

I would like to thank your team for its positive engagement during this review. We recommend the team looks for opportunities to champion the value of the recent developments, for example, through the publication page itself, and through cross-government channels such as the GSS housing and planning steering and working groups. Please get in touch if you would like to discuss any aspects of this letter. I am copying this letter to Richard Field, Head of Housing and Planning Statistics, MHCLG; and Rachel Worledge, Lead Statistician, CORE Social Housing, MHCLG.

Yours sincerely

Mark Pont

Mark Pont to Jonathan Waller: Compliance Check of Higher Education Student Statistics

Dear Jonathan,

Compliance Check of Higher Education Student Statistics

We recently completed our compliance check of the Higher Education Student Statistics and are pleased to confirm that these accredited official statistics continue to meet the standards of trustworthiness, quality and value as set out in the Code of Practice for Statistics.

This compliance check was originally initiated in September 2023, following the merger of the Higher Education Statistics Agency (HESA) and Jisc in 2022, with the aim of ensuring that the statistics continued to meet the standards expected of accredited official statistics. At that time, the first data collection under the new Data Futures model was underway and experienced a number of challenges. As a result, we shifted the focus of this compliance check to primarily consider the Quality pillar of the Code of Practice.

We previously conducted a compliance check on these statistics in 2018. Since then, you have improved the language and format of the summary statistics by including additional information and definitions throughout the bulletin, which support users’ interpretation and understanding of the statistics. It is good that you have also reduced the number of people who have pre-release access to the statistics. In addition, it is encouraging to see that it is now easier to access and navigate the quality report.

The Data Futures programme aims to transform the collection, processing and analysis of data in the Higher Education sector and has been in development since 2015. Following a period of considerable communication with the Higher Education sector, the new data collection was introduced in 2023 to collect data for the 2022/23 academic year. The implementation of the new data collection faced several challenges that had the potential to affect the number of Higher Education providers who submitted data and the quality of the data received. We commend the dedication of Jisc in working with statutory bodies and HE providers throughout the data submission period, which ensured that all required UK Higher Education providers were able to submit data for inclusion within the 2022/23 student data set.

Following data submission, your staff conducted an unprecedented level of quality assurance to ensure that the data were fit for purpose. Your January 2024 blog, and the quality report published with the statistics, set out the steps that you have taken to maximise the quality of the data, including:

  • Field-by-field comparison with year-on-year variances, which involved comparing the distribution of field categories with equivalent data from previous years (illustrated in the sketch after this list)
  • Continuity testing by linking records for each student to their records in previous years to ensure that relevant characteristics have been recorded consistently over time
  • Comparisons of aggregated statistics with previous years.
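As a purely illustrative sketch of the first of these checks (this is not Jisc’s quality assurance code; the function, field names and example data are hypothetical), a field-by-field comparison of category distributions against the previous year might look like this:

```python
# Illustrative sketch only: not Jisc's quality assurance code.
# Function, field names and example data are hypothetical.
import pandas as pd

def compare_field_distributions(current: pd.DataFrame,
                                previous: pd.DataFrame,
                                field: str,
                                threshold: float = 0.02) -> pd.DataFrame:
    """Flag categories of `field` whose share of records has moved by more than
    `threshold` (as a proportion) compared with the previous collection year."""
    cur = current[field].value_counts(normalize=True).rename("current_share")
    prev = previous[field].value_counts(normalize=True).rename("previous_share")
    shares = pd.concat([cur, prev], axis=1).fillna(0)
    shares["change"] = shares["current_share"] - shares["previous_share"]
    return shares.loc[shares["change"].abs() > threshold].sort_values("change")

# Hypothetical example: mode-of-study shares in two collection years
students_2023 = pd.DataFrame({"mode_of_study": ["Full-time"] * 80 + ["Part-time"] * 20})
students_2022 = pd.DataFrame({"mode_of_study": ["Full-time"] * 70 + ["Part-time"] * 30})
print(compare_field_distributions(students_2023, students_2022, "mode_of_study"))
```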

By proactively and openly sharing information with users on the quality assurance process, you have supported continued trust in the statistics and provided users with valuable information that will inform how they use the statistics.

We consider that Jisc has responded well to the challenges faced by the introduction of the Data Futures programme, and that Jisc has taken appropriate steps to ensure that the statistics meet the standards of the Code.

To improve future data collections, we encourage Jisc to reflect on the lessons learned from this process and draw valuable insights, in conjunction with the findings from the Office for Students’ UK-wide independent review once it is completed.

I would like to thank you and your colleagues for your positive engagement throughout this review process. Please do not hesitate to get in touch if you would like to discuss any aspects of this letter further.

Yours sincerely

Mark Pont

Mark Pont to Victoria Obudulu: Winter Coronavirus (COVID-19) Infection Study

Dear Victoria

Compliance review of statistics in development from the Winter Coronavirus (COVID-19) Infection Study, England and Scotland

Thank you very much for inviting us to independently review the trustworthiness, quality and value of the statistics in development from the Winter Coronavirus (COVID-19) Infection Study (Winter CIS). The Winter CIS was jointly developed by the Office for National Statistics (ONS) and the UK Health Security Agency (UKHSA), covering households in England and Scotland. On 21 December 2023, using raw survey data from the Winter CIS, UKHSA first published a series of reports titled Estimates of Epidemiological Characteristics.

Information about COVID-19 (the disease caused by the coronavirus SARS-CoV-2) continues to be of high interest to expert users, such as public health and NHS officials, health researchers and the media, who have used your Winter CIS estimates over the winter for a number of reasons:

  • to act as an early warning of any potential outbreaks of COVID-19 for those planning and delivering NHS and other services preparing for winter stressors.
  • to observe the prevalence of coronavirus variants in the community.
  • to supplement and triangulate existing coronavirus community surveillance data, such as acute respiratory infection incidents.
  • to raise public awareness of COVID-19 infection levels in England and Scotland.

Involving users to assist with continuing development

I welcome the innovation and agility shown by your team to develop the analyses throughout the study period. This has been demonstrated on two fronts: through transparent joint and collaborative working, with continuous learning, allowing your team to iterate your working processes; and by trialling a different methodology while encouraging expert user feedback.

We note from a recent ONS blog that statistics to capture information about respiratory viruses will continue to evolve, and we consider that retaining the label ‘official statistics in development’ seems appropriate. To inform future iterations of infection studies more generally, it is good practice to continue to engage with users of the Winter CIS to help develop the methods, data and metadata transparently.

Coherence across the UK

Both ONS and yourselves have been very clear to link to other statistics about the coronavirus and COVID-19. We welcome this approach to provide coherence with existing coronavirus data from the rest of the UK.

Users of the Winter CIS have identified the need for comparable statistics across the UK, particularly to understand the effects of coronavirus on the lives of individuals. During any future planning for a study of this nature, we consider that time should be taken by statistics producers to fully consider the possibilities for the development of coherent UK-wide statistics.

Ensuring transparency of methods

It is good that you have published a Quality and Methodology Information report which explains the uncertainty, strengths and limitations of the published estimates. To produce these modelled estimates, you used Multi-level Regression with Poststratification (MRP) methodology. The statistics team told us it has submitted the methodology and data for academic peer review; the peer-reviewed work is expected to be published and so made available for public scrutiny, which is good practice for official statistics in development.
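For readers less familiar with MRP, the poststratification step can be summarised in general terms as weighting modelled cell-level estimates by known population counts. The expression below is a generic textbook description, not the specific model or data used for the Winter CIS:

```latex
% Generic poststratification step of MRP (illustrative, not the Winter CIS specification).
% \hat{\theta}_j is the modelled estimate for poststratification cell j (for example,
% an age-by-sex-by-region combination) from the multilevel regression, and N_j is the
% known population count for that cell.
\[
  \hat{\theta}_{\text{population}} \;=\; \frac{\sum_{j} N_j \, \hat{\theta}_j}{\sum_{j} N_j}
\]
```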

Insight and forward plans

Over the course of the study, as well as descriptive analyses of the latest estimates, you have published an Explainer article and followed up with news releases so that summary information is made more accessible for the less expert user. Once you have finalised publication dates for future research or analyses using Winter CIS data, such as the modelled Infection Hospitalisation Risk for Scotland, it would be good practice to pre-announce these as ad hoc official statistics, as well as using informal ways to let people know.

I would like to thank your team for its positive engagement with us during this review. Please do not hesitate to get in touch if you would like to discuss any aspects of this letter.

Yours sincerely

Mark Pont

Mark Pont to James Tucker: Winter Coronavirus (COVID-19) Infection Study

Dear James

Compliance review of statistics in development from the Winter Coronavirus (COVID-19) Infection Study, England and Scotland

Thank you very much for inviting us to independently review the trustworthiness, quality and value of the statistics in development from the Winter Coronavirus (COVID-19) Infection Study (Winter CIS). Information about coronavirus continues to have high interest for expert users, such as public health and NHS officials, health researchers and the media.

The Winter CIS was jointly developed by the Office for National Statistics (ONS) and the UK Health Security Agency (UKHSA), covering households in England and Scotland. Your team published data from the study for transparency every two weeks from 7 December 2023 to 14 March 2024. Raw results from the study were also sent securely to the UK Health Security Agency for onward analysis to allow that organisation to estimate the incidence and prevalence of coronavirus SARS-CoV-2, which can cause COVID-19.

I welcome the innovation and agility shown by your team. This has been demonstrated through transparent joint and collaborative working with the UK Health Security Agency, with continuous learning, allowing your teams to iterate your working processes. We note from your recent blog that statistics to capture information about respiratory viruses will continue to evolve, and we consider that retaining the label ‘official statistics in development’ seems appropriate.

Explaining study representativeness

As you know, in random sample surveys it is important that the achieved sample adequately represents the proportions of various characteristics (age, ethnicity, locality) shown by the community under scrutiny. This reduces biases in the survey results and provides estimates closer to the true value. Because the Winter CIS is a longitudinal panel survey, based on a cohort of people who responded to the Coronavirus (COVID-19) Infection Survey (CIS), it is not a random sample survey.

Even with an imbalanced sample, appropriate weighting helps to ensure representative estimates. You have gone some way to providing transparency about the accuracy of the Winter CIS estimates, by publishing the survey results data, the response rates for various survey waves and the population basis that was used to calculate the estimates. You have also published a Quality and Methodology Information report (QMI) which makes clear that the survey might over-represent or under-represent certain groups and sets out information about what the survey data can and cannot be used for. The QMI provides some detail on the data collection procedures, quality assurance processes, response rates and representativeness. Please ensure it is updated to include more detail to fully explain how you have reduced biases in the design and weighting of the study, as outlined in the paragraph below.

To assist further development of these and other statistics based on the Winter CIS sample frame (such as the ad hoc release titled Experiences of GP practice access), it would be helpful for researchers to fully understand the various recruitment processes, the study design, the methodological procedures and the weighting choices that have been made. For both the Winter CIS and other releases that use the Winter CIS as a sample frame, you should publish information about:

  • The distinction between the processes of participant recruitment and engagement, giving consideration to digital inclusion and the impact of incentives, or their removal.
  • The maturation of the dataset over time, to understand the personal characteristics of the respondents over subsequent waves of the study.
  • The statistical inferences made when calculating the estimates, including the weighting decisions (a simplified illustration of poststratification weighting is sketched after this list).
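As a simplified, purely illustrative sketch of poststratification weighting (this is not the ONS weighting methodology for the Winter CIS; the strata, shares and survey variable below are invented), weights can be derived by comparing known population shares with achieved sample shares within strata:

```python
# Illustrative sketch only: not the ONS weighting methodology for the Winter CIS.
# Strata, population shares and the survey variable are invented for demonstration.
import pandas as pd

# Known population shares by age group (hypothetical)
population_share = pd.Series({"16-34": 0.30, "35-64": 0.48, "65+": 0.22})

# Achieved sample that over-represents older respondents (hypothetical)
sample = pd.DataFrame({
    "age_group": ["16-34"] * 20 + ["35-64"] * 45 + ["65+"] * 35,
    "tested_positive": [0, 1] * 50,
})
sample_share = sample["age_group"].value_counts(normalize=True)

# Poststratification weight: population share divided by achieved sample share, per stratum
weights = population_share / sample_share
sample["weight"] = sample["age_group"].map(weights)

# Weighted estimate of positivity, compared with the unweighted estimate
weighted = (sample["tested_positive"] * sample["weight"]).sum() / sample["weight"].sum()
print(f"Unweighted: {sample['tested_positive'].mean():.3f}, weighted: {weighted:.3f}")
```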

We are pleased to hear that you have agreed to publish updated metadata by the end of June and to place the final survey results and weighting data into a repository, such as ONS’s Integrated Data Service, to support further academic research.

Coherence across the UK

Both UKHSA and yourselves have been very clear to link to other statistics about the coronavirus and COVID-19. We welcome this approach to provide coherence with existing coronavirus data from the rest of the UK.

Users of the Winter CIS have identified the need for comparable statistics across the UK, particularly to understand the effects of coronavirus on the lives of individuals. During any future planning for a study of this nature, we consider that time should be taken by statistics producers to fully consider the possibilities for the development of coherent UK-wide statistics.

I would like to thank your team for its positive engagement with us during this review. Please do not hesitate to get in touch if you would like to discuss any aspects of this letter.

Yours sincerely

Mark Pont

Mark Pont to Alastair McAlpine: Compliance Check of the Scottish Health Survey Statistics

Dear Alastair

Compliance Check of the Scottish Health Survey Statistics

We recently carried out a compliance check of the Scottish Health Survey (SHeS) statistics against the Code of Practice for Statistics. These statistics were designated as National Statistics in 2010, and we are pleased to confirm that they should continue to be accredited under the status now known as accredited official statistics (AOS).

The SHeS is a longstanding annual survey in Scotland providing data about the health of the population on a diverse range of topics, including mental and physical health; diet and food security; and alcohol and smoking. It therefore provides a vital picture of the health of the Scottish population living in private households. It is used by the Scottish Government, Public Health Scotland and many health organisations, including charities, for planning and decision-making. The Scottish Parliament also regularly refers to the survey results when debating health issues. Endorsed by the Chief Medical Officer in Scotland, the SHeS has a Project Board consisting of stakeholders, the survey contractor and users, which helps to increase confidence in the statistics. The Project Board also reviews the survey topic questions used each year to ensure that they remain relevant and are both appropriate and proportionate for the survey interviews, which span multiple age groups.

In carrying out our review, we consulted some users and took their views into account. We have set out our findings along with some recommendations in the annex that we consider would help to further improve the statistics. We will follow these up informally with the team as it works towards the next release.

I would like to thank your team for its positive engagement with us during this review. Please do not hesitate to get in touch if you would like to discuss any aspects of this letter.

Yours sincerely

Mark Pont

Assessment Programme Lead


Annex: OSR review findings and recommendations – Scottish Health Survey (SHeS)

Trustworthiness (T) findings

T1. The Chief Medical Officer for Scotland endorses the Scottish Health Survey (SHeS) in the Foreword. This acknowledgement credits both the insights generated by the collaborative efforts of the stakeholders involved in producing the survey and the time given by participants to inform future health policies. The team has a Project Board with representation from many key user organisations as well as other stakeholders closely aligned to the survey. The team sets out suggestions for changes, which the Board reviews and implements as appropriate. This oversight group helps to increase user confidence in the statistics. Both of these examples demonstrate trustworthiness through transparent and independent decision-making and leadership.

T2. The Project Board decides on the questionnaires’ content, ensuring that the questions are relevant, the questionnaire lengths are appropriate for the varying age groups, and the respondent burden is minimised. The team undertook a questionnaire content review to learn about users’ views on the content of the SHeS. However, although the summary of review responses has been published, the report summarising the updated questionnaire content is yet to be published, so users remain unaware of any planned changes. We would welcome the timely publication of this report, particularly as users reported that they were uncertain whether their suggestions had been taken forward.

Recommendation for improvement: T2

To make the decisions behind changes to the survey questions more transparent to users, the Scottish Government (SG) should ensure that the content report includes the criteria used to determine whether questions are continued, altered, added or removed each year. Where there are significant changes, these should be clearly highlighted and evidenced, establishing an audit trail to inform current and future users.

Quality (Q) findings

Q1. We found that the Technical Report puts much effort into describing in detail the processes used to recruit survey participants and how the challenges during Covid-19 were overcome, all of which is helpful information for the reader. The fieldwork methodology report describes in detail how required sample sizes were achieved and how households were selected using the Postcode Address File.
However, the information on the processes for sample design and actual respondent engagement appears to be interspersed across some parts of the methodology documentation. As a result, some users might find it challenging to distinguish and fully appreciate the technical aspects of the survey, such as:

  • The recruitment procedures
  • The sample estimation
  • The collection and treatment of data

Recommendation for improvement: Q1

SG should seek feedback from a range of users about its methodology documentation and use this feedback to update the documentation accordingly to ensure it is as effective as possible.

Q2. We found that there are multiple references to the term ‘core questions’, which generally refers to all the questions asked in the relevant year to both adult samples. These references are further complicated by the fact that a small number of questions included in the Scottish Health Survey, the Scottish Household Survey and the Scottish Crime and Justice Survey as part of the Scottish Surveys Core Questions (SSCQ) is also referred to as the ‘core questions’. To aid user understanding, SG could clarify which set of ‘core questions’ is being referred to each time the term is mentioned, to avoid any confusion between the different sets of questions.

Recommendation for improvement: Q2

As part of its wider user engagement, SG should determine whether users understand the references to ‘core questions’ throughout the documentation. If this similar terminology is found to confuse readers, SG could consider using alternative language that more clearly differentiates between the sets of questions.

Q3. The new dashboard provides 95% confidence intervals, which we heard was useful to many users. We found varying levels of representation of uncertainty throughout the survey outputs, but the summary report does not contain any information about uncertainty. Users told us that they would find it helpful to see confidence intervals in the supplementary tables as well.

Recommendation for improvement: Q3

SG should include confidence intervals in the supplementary tables to ensure users can better understand the quality of the estimates. SG could also review the presentation of uncertainty in other outputs and consider whether including further information about uncertainty might be helpful.
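To illustrate what such confidence intervals convey, the generic textbook form of a 95% confidence interval for a proportion under simple random sampling is shown below. This is illustrative only: SHeS estimates come from a complex survey design, so design-adjusted standard errors are needed in practice.

```latex
% Generic 95% confidence interval for a proportion \hat{p} estimated from n respondents
% under simple random sampling (illustrative only; SHeS uses a complex survey design,
% so design-adjusted standard errors apply in practice).
\[
  \hat{p} \;\pm\; 1.96 \sqrt{\frac{\hat{p}\,(1 - \hat{p})}{n}}
\]
```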

Q4. In the main report, we found that in some visual representations the changes in values or scores can be ambiguous. For example, for the average mental wellbeing scores, the mean Warwick-Edinburgh Mental Wellbeing Scale (WEMWBS) scores given on the associated Mental Health and Wellbeing chart are not clearly labelled or titled, and the decline in average scores between 2019 and 2022 could be open to misinterpretation. It is important that users clearly understand how to interpret the score changes from year to year in the time series analysis represented in charts.

Recommendation for improvement: Q4

To provide users with sufficient information to correctly interpret scores or values given in a chart, SG should ensure that all charts are clearly labelled with explanations of whether yearly fluctuations are statistically significant or not. This should increase the likelihood that data and charts are interpreted accurately.

Q5. We noted that the team included a gender identity question from 2018 to 2021 as part of the self-completion questions, as this was a Scottish Surveys Core Questions (SSCQ) harmonised question. However, this question was removed from 2022 as it was no longer an SSCQ core question and was not being used for SHeS analysis purposes. The survey continues to include a question on sex (with possible responses ‘male’, ‘female’, ‘prefer not to say’) as this is an important characteristic for analysis of the survey results. The team told us that it would consult our recent guidance on Collecting and reporting data about sex and gender identity in official statistics when considering any future data collection changes in this area.

Value (V) findings

V1. In recent years, the SHeS has undergone some methodological changes. During the Covid-19 pandemic in 2020, fieldwork was suspended and a telephone survey was conducted instead, based on a shortened version of the questionnaire with some Covid-specific questions. The 2020 data were presented as experimental statistics due to differences in the profile of, and bias in, the achieved sample compared with pre-Covid years, and they have not been included in the time series analysis since then. The absence of these data is clearly explained so that users are aware of why there is a gap in data tables, charts and on the dashboard.

V2. We were pleased to hear that the value of the SHeS data has been enriched by further data linking. The team told us that work done to link the Postcode Address File (PAF) data to the Community Health Index (CHI) database has increased the likelihood of identifying households with children under 16, making it a more efficient way to source child samples. There is also information explaining how the SHeS and health record data are linked and detailing the variables included. This will improve researchers’ access to the datasets, helping to obtain further valuable insights into the health of Scotland’s population.

V3. The objectives of the survey are clearly outlined in the main report. Users told us that they sometimes use other survey findings to complement the findings of the SHeS; for example, they might use the Alcohol Toolkit Study by University College London when determining trends in alcohol use. Others mentioned that further information clarifying whether or not other sources and their trends could be used in this way would be helpful.

Recommendation for improvement: V3

To support SHeS users in considering how they might use SHeS statistics alongside other non-official sources, SG should provide further guidance, where practicable, about the extent to which SHeS data could be used alongside other reputable sources.

V4. SG has developed an innovative SHeS dashboard which provides comparative data going back to 2008 and is presented at national, local and health board levels. It is encouraging that RAP processes are being implemented and that further development work is planned. We heard that users have requested further breakdowns and were not always clear on whether they were being considered. It is encouraging to hear that SG intends to consult users on its plans to provide further breakdowns, such as employment type, urban/rural and health board/local authority data by the Scottish Index of Multiple Deprivation (SIMD).

Recommendation for improvement: V4

As SG develops its dashboard plans, it should communicate these so that users are more aware of the plans to progress their requests.

V5. Having comparable health statistics enables users to more easily observe trends across the UK countries. We found differences in the comparability and coherence of the survey across the survey topic areas that would hinder users in drawing such inferences.

Recommendation for improvement: V5

To provide further explanation of when statistics on a specific topic can be compared across UK countries, SG should include a more in-depth section on comparability. This will help to give users greater insight into the extent to which they can compare SHeS topic areas with similar survey outputs in other UK countries.

Ed Humpherson to Jonathan Waller: Assessment of the Higher Education Graduate Outcomes Data and Statistics

Dear Jonathan

Assessment of the Higher Education Graduate Outcomes Data and Statistics

We have completed our assessment of the Higher Education Graduate Outcomes Data and Statistics produced by Jisc, under the Higher Education Statistics Agency (HESA) brand. I am grateful for the positive engagement from your team throughout the assessment process.

Since their introduction in June 2020 for the 2017/18 academic year, these statistics have been an invaluable source of information on the outcomes and destinations of graduates from Higher Education (HE) in the UK. They are used by a wide range of stakeholders, including higher education providers and regulatory bodies across the UK.

Jisc has shown a commitment to trustworthiness through its understanding of its responsibilities in important areas such as data security, professional capability and intelligent transparency. By making the most of its positive relationships with key stakeholders, Jisc has ensured that the needs of users are embedded in the production and continued development of these statistics.

The quality of these statistics is very high. The survey is well designed to meet users’ needs and achieves good response rates at a national level. We commend the transparency around the quality of these statistics and that a large amount of information is provided in the accompanying user guide.

Jisc’s engagement with its statutory customers and other key users through the provider forum, steering group and other means, demonstrates a commitment to strong user engagement practices. We welcome Jisc’s plans for process modernisation including the implementation of Reproducible Analytical Pipelines (RAP) as well as the Graduate Outcomes Data being made available for future research, including data linkage, on the Office for National Statistics’ Secure Research Service.

Our assessment found that these statistics adhere closely to the Code of Practice. Through the range of actions that Jisc has taken to develop these statistics, and in its response to our assessment, Jisc has shown a high level of dedication to the principles of trustworthiness, quality and value.

We judge that the Graduate Outcomes Data and Statistics can be confirmed as accredited official statistics (called National Statistics in the Statistics and Registration Service Act 2007). We would like to acknowledge that it is rare for OSR to accredit official statistics without first providing producers with requirements, and that this reflects our high regard for these statistics as well as Jisc’s approach to continually maintaining the high standards of the Code of Practice for Statistics.

Yours sincerely

Ed Humpherson

 

Related Links:

Assessment of compliance with the Code of Practice for Statistics: Higher Education Graduate Outcomes Data and Statistics

Jonathan Waller to Ed Humpherson: Higher Education Graduate Outcomes Statistics Assessment Request – October 2023

Ed Humpherson to Jonathan Waller: Higher Education Graduate Outcomes Statistics Assessment Request – October 2023