Ed Humpherson to Neil McIvor: Temporary exemption from Code for DfE attendance statistics

Dear Neil,

Following conversations with your team, I am happy to confirm the exemption from the Code of Practice for Statistics’ standard publication time of 9.30am, permitting a later release time of noon each Tuesday for the statistics on attendance in education and early years settings during the Coronavirus (COVID-19) pandemic.

We welcome your decision to start publishing these statistics as official statistics. This demonstrates a clear commitment to ensuring that the statistics adhere to the principles and practices of the Code.

I am copying this to David Simpson, acting Deputy Head of Profession for Statistics.

Yours sincerely

Ed Humpherson

Director General for Regulation


Related links:

Rapid Review of attendance in education and early years settings during coronavirus

Ed Humpherson to Neil McIvor: Coronavirus (COVID-19) attendance in education and early years settings

Dear Neil

CORONAVIRUS (COVID-19) ATTENDANCE IN EDUCATION AND EARLY YEARS SETTINGS

I am writing to endorse the approach taken by DfE in the recent publication of data on attendance in education and early years settings during the Coronavirus (COVID-19) pandemic. I would like to congratulate everyone involved on their work to produce these timely and valuable statistics in challenging circumstances.

My team has conducted a rapid regulatory review of the published information. We have reviewed the extent to which the statistics have been produced in accordance with the Code of Practice’s Trustworthiness, Quality and Value pillars, while taking account of the pressures you and your teams have faced to deliver timely statistics about a rapidly evolving situation. A summary of our findings is set out below and more detailed feedback has been provided to your team.

Value

  • We welcome the rapid development and publication of these data to support understanding of the impact of COVID-19 on school attendance in England. The online data collection portal, and the guidance to educational settings on completing the return, were created quickly to support this development.
  • We encourage you to continue to review the information collected and the data that you publish, considering user needs as the current situation evolves.

Quality

  • The methodology used is clearly set out within the publication and aids interpretation of the statistics. It is impressive how promptly the methodology was developed, given the speed with which these data were produced, and how comprehensively it is explained within the publication.
  • Currently, quality assurance checks on the data are carried out manually. The automated checks that you are planning to build into the production process will help provide further reassurance around quality; a simple illustration of such a check follows this list. Establishments will also welcome you minimising the burden of additional data checking during this challenging time.
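To illustrate the kind of automated check described above, here is a minimal, purely hypothetical sketch in Python: the field names and rules are invented for illustration and do not describe DfE’s actual collection or pipeline.

    # Hypothetical sketch of an automated quality assurance check on one
    # setting's attendance return; field names and rules are invented.

    def validate_return(record):
        """Return a list of quality issues found in a single return."""
        issues = []
        if record["attending"] < 0:
            issues.append("negative attendance count")
        if record["attending"] > record["on_roll"]:
            issues.append("attendance exceeds pupils on roll")
        if record["on_roll"] == 0 and record["attending"] > 0:
            issues.append("pupils attending but none on roll")
        return issues

    sample = {"setting_id": "ABC123", "on_roll": 450, "attending": 460}
    print(validate_return(sample))  # ['attendance exceeds pupils on roll']

Checks of this kind can run automatically on every return as it arrives, flagging records for follow-up without adding to the reporting burden on establishments.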

Trustworthiness

  • The publication is transparent about the possible undercounting of vulnerable children. It is particularly good that you have received positive feedback on the inclusion of vulnerable children within the data and that you are considering what more could be done in this area.

We look forward to seeing these statistics develop as circumstances change. As set out in the guidance on changes to statistical outputs, you can include a statement in your release such as: “These statistics have been produced quickly in response to developing world events. The Office for Statistics Regulation, on behalf of the UK Statistics Authority, has reviewed them against several key aspects of the Code of Practice for Statistics and regards them as consistent with the Code’s pillars of Trustworthiness, Quality and Value.”

I am copying this to David Simpson, acting Deputy Head of Profession for Statistics, and Jonathan Slater, Permanent Secretary for the Department for Education.

Yours sincerely

Ed Humpherson

Director General for Regulation

Review of DfE Experimental School Funding Statistics

Dear Neil

STATISTICS ON SCHOOL FUNDING IN ENGLAND

I am writing to you to summarise the Office for Statistics Regulation’s review of the Department for Education’s new experimental statistics on school funding. Overall, these new statistics are very welcome, addressing a gap in the provision of official statistics on schools and helping to support informed debate. This letter outlines the positive aspects of the statistics that we identified, including accessibility and user engagement, as well as aspects that could be improved, especially a fuller presentation of the context surrounding school funding.

The background is that the Authority has previously written to the Department for Education regarding concerns about the way school funding statistics have been used in public debate. Until recently, there was no single consistent and comprehensive set of official statistics on school funding to which all participants in public debate could refer; the sources previously available each presented a different analysis of school funding figures. It has therefore been difficult to form an authoritative view of school funding. To inform public debate on public services expenditure such as school funding, there is a vital need for good quality and clearly explained statistics. In May 2019, we wrote to the Department recommending the publication of a set of official statistics that comply with the Code of Practice for Statistics.

The Department’s new experimental statistics on School funding: between financial years 2010 to 2011 and 2020 to 2021 address this gap. We have reviewed the statistics against the Code of Practice for Statistics’ three pillars: Trustworthiness, Quality and Value. Our main findings are:

  • The statistics acknowledge the difficulty that users may face in understanding and navigating the existing statistics on school funding, and clearly explain the different sources and elements of school funding used. The release provides information on policy and operational changes to school funding arrangements that affect the comparability of data over time.
  • Data are presented in both cash and real terms, and cumulative allocations of school funding have been broken down to allow users to understand year-on-year changes. The assumptions behind imputed or estimated data are explained; however, it would be helpful to expand these explanations to provide supporting evidence for the methodological assumptions. For example, top-up funding for post-16 students contained within the high needs block has been imputed on the assumption that funding grew at the same rate as in previous years (a simplified sketch of this kind of extrapolation follows this list).
  • Accessibility for users has been considered. The ‘Technical Information’ section includes hyperlinks to related DfE publications and the accompanying data tables. To further improve accessibility, we would encourage better signposting throughout: to related DfE publications and tables within the main body of the text; to the relevant data tables where the chart source is currently given simply as “DfE”; and, within the data tables, to the definitions section of the statistical release, to aid users who come across the data tables before reading the release.
  • The statistics team has engaged with some known users for feedback, including the Institute for Fiscal Studies, and is aiming to publish earlier in the academic year to enhance timeliness for users. We would encourage the team to continue to seek feedback from a range of users to understand the relevance and public value of these statistics.
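As context for the imputation point above, the following sketch shows the general shape of a growth-rate extrapolation, in which a missing year is imputed on the assumption that funding grew at the mean rate observed in earlier years. The function and figures are invented for illustration and are not the Department’s method or data.

    # Hypothetical growth-rate imputation: project the next value in a
    # funding series from the mean year-on-year growth observed so far.

    def impute_next_value(series):
        """Extrapolate the next value using the mean year-on-year growth rate."""
        growth = [later / earlier for earlier, later in zip(series, series[1:])]
        mean_growth = sum(growth) / len(growth)
        return series[-1] * mean_growth

    observed = [580.0, 601.0, 623.0]  # funding in GBP million, invented
    print(round(impute_next_value(observed), 1))  # 645.7

Publishing the evidence behind an assumption of this kind, such as the historical growth rates actually used, would help users judge how robust the imputed figures are.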

We welcome the Department’s publication of these statistics and are grateful for how effectively the team engaged with us during their development and for inviting us to provide feedback. We will keep in touch with the statistics team as part of our ongoing monitoring of statistics on school funding.

I am copying this letter to Mike Jones, Deputy Head of Profession for Statistics.

Yours sincerely

Ed Humpherson

Director General for Regulation


Related Links:

Department for Education funding statistics (May 2019)


Ed Humpherson to Neil McIvor, DfE regarding the Higher Education Initial Participation Rate (HEIPR)

Dear Neil,

We have received a query about your Higher Education Initial Participation Rate (HEIPR) release. Thank you for the help your team has provided as we have been looking into this query.

Having reviewed the latest release, I share some of the concerns raised with us and would like you to consider the methods and presentation of the HEIPR.

We understand that the HEIPR has been published as a National Statistic by the Department for Education (and its predecessor departments) since 2004, following a National Statistics Quality Review which considered it an appropriate measure of progress towards the previous ambition to see 50 per cent of under-30s going to higher education. Maintaining a consistent approach to the measurement since then has ensured that progress towards that goal could be tracked.

We are also aware that your approach to producing an estimate broadly aligns with the approach taken by the OECD (Organisation for Economic Co-operation and Development) to make international comparisons of ‘first-time entry rates to tertiary education’ (‘Education at a Glance’). We encourage producers of statistics to follow international best practice and align methods to allow comparison wherever possible.

However, we do not consider the publication to be clear about the purpose of the HEIPR metric and what it intends to measure. It would be helpful to have more information to support users in their use of these statistics. We would also encourage you to review the use of the term ‘rate’ and consider an alternative description.

Your team highlighted the OECD working group paper that proposed prioritising the development of graduate entry rate indicators, particularly regarding their interpretability. The aim of the paper was to provide a clearer understanding of the current indicator, including its underlying assumptions. It clearly recognises the difficulty in interpreting this metric, mirroring our own findings. It also highlights the limitations of the calculation, including that the rate can exceed 100%. While you do provide an explanation of the methods used to calculate the HEIPR, given the complexity and the risk of misinterpretation by users, we would encourage you to improve your methodology guidance notes.
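To make the interpretability point concrete, the sketch below shows in simplified form why a measure built by summing age-specific first-time entry rates observed in a single year can exceed 100%: each term divides by a different birth cohort’s population, so the total is a synthetic-cohort projection rather than a true proportion. The figures are invented and the function is a simplification of the published methodology, not DfE’s code.

    # Simplified HEIPR-style calculation: the sum of age-specific
    # first-time entry rates in one year. All figures are invented.

    def initial_participation_rate(entrants_by_age, population_by_age):
        """Sum each age's first-time entrants divided by that age's population."""
        return sum(entrants_by_age[age] / population_by_age[age]
                   for age in entrants_by_age)

    # A demographic dip (small current cohorts) combined with high entry
    # can push the synthetic sum above 1, i.e. above 100%.
    entrants = {18: 450_000, 19: 200_000}
    population = {18: 580_000, 19: 590_000}
    print(f"{initial_participation_rate(entrants, population):.1%}")  # 111.5%

Nothing in the calculation constrains the sum to 100%, which is why clearer guidance on interpreting the measure, and on whether ‘rate’ is the right description, would help users.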

We understand that you are reflecting on the future relevance of the measure and will consult users on any developments. We would be grateful if you could keep us apprised of your plans. We are considering a compliance check of the HEIPR statistics as part of our 2020-21 regulatory work programme and will work with your team to consider how to get the most value from this.

Yours sincerely

Ed Humpherson

Director General for Regulation


Related Links:

Concerns about Participation Rates in Higher Education (November 2019)

Concerns about Participation Rates in Higher Education

Dear Neil,

Thank you for the help your team has provided as we have been looking into a query we received about the Department’s Participation Rates in Higher Education release. The enquiry raised concerns about the presentation of the Higher Education Initial Participation Rate (HEIPR) and what this means for how the media and individuals – including this blog published by DfE – interpret the figure.

Having reviewed the release, I shared the concerns raised with us about the lack of clarity on what the HEIPR represents and the ease with which it could therefore be misinterpreted.

We have spoken to your team about these concerns and we welcome the improvements they have made to this release to give more clarity about what the statistics do, and do not, represent. We also note that the referenced media blog has now been amended and is worded more appropriately.

Ahead of the next release, we would encourage the team to further consider how they can ensure these statistics are not misinterpreted by users. They may wish to contact the GSS Good Practice Team, which may be able to offer expertise in this area.

Thank you for your cooperation.

Yours sincerely

Ed Humpherson
Director General for Regulation

Response on Department for Education funding statistics

Dear Ed,

DFE USE OF STATISTICS

Thank you for your letter of 30 May. I continue to take the matter of good and proper use of statistics very seriously, and I appreciate the work you and your teams have been undertaking with DfE officials over the last few months.

As you have recognised, we have taken a number of steps over the last few months to ensure statistics are always accurate and used in the appropriate context. And, more broadly, I am committed to ensuring we build the Department’s reputation as a trustworthy communicator of statistics. My Head of Profession for Statistics and his team continue to work closely with policy and communications teams across the Department to raise awareness of these important issues.

The facts around school funding are complex. And the debate around school funding can be impassioned. The Department publishes a wide range of information on school funding and on funding of the wider education system. I agree with you that there is more we can do to bring that wide range of information into one place and to help users navigate this complex landscape, as well as to consider the potential for additional information where this would be helpful. We have recently published a “contents page” for school funding streams on our website and are considering what more we can do in this area. My officials will continue to work with your team as we develop our plans.

I am happy to discuss ways in which the Department can continue to improve its presentation and use of statistics.

I have copied this letter to Neil McIvor, Head of Profession for Statistics, Mike Jones, Deputy Head of Profession, and to John Pullinger, the National Statistician.

Yours sincerely,

Jonathan Slater
Permanent Secretary


Related Links:

Ed Humpherson to Jonathan Slater (May 2019)

Ed Humpherson to Neil McIvor (May 2019)

Sir David Norgrove to Damian Hinds MP (October 2018)

Department for Education funding statistics

Dear Jonathan,

Department for Education funding statistics

I am writing following our correspondence in the autumn. Since then, I have been encouraged by the steps you and your colleagues have taken to improve your communication of statistics – for example by improving the process which ensures statements released by the Department for Education are cleared by relevant analysts.

We consider, as I know you do too, that there is scope for the Department to improve further.  For a meaningful debate about public spending in any area, it is necessary to have a trustworthy data source. In that context, we note that the Department does not produce a comprehensive set of official statistics on the funding of schools. A wide range of data sources on school funding are currently used to inform debate, by both the Department and others. This in turn can mean that statements using data are hard to verify and replicate, and this creates a risk of undermining the perceived trustworthiness of those making the statements.

We are therefore encouraged that the Department is considering the potential for publishing regular official statistics on school funding, and formally recommend that you do so.

I have today written to the Department’s head of profession for statistics setting this out in more detail.

Yours sincerely

Ed Humpherson
Director General for Regulation


Related Links:

Ed Humpherson to Neil McIvor (May 2019)

Sir David Norgrove to Damian Hinds MP (October 2018)


Department for Education funding statistics

Dear Neil

Department for Education funding statistics

I am writing following our correspondence last year (1)(2). As you know, I welcome the improvements made to enhance the quality and trustworthiness of statements released by the Department for Education, and the ongoing work to improve the value of your statistics, through the development of the Department’s website.

Despite these improvements, we continue to hear concerns from the public about some statements made by the Department and its Ministers.

Your department has strived to improve the technical accuracy of statements made on school funding. However, we have concerns about the presentation of school funding figures. For example, we recently received complaints about the Minister of State for School Standards’ use of school spending figures on Channel 4 News. It was not clear from the Minister’s statements that he was referring to the schools budget for 5 to 16 year olds only. While data published by the Institute for Fiscal Studies do support his claims, we noted that these data are difficult to find and require additional analysis.

I believe it would help support public understanding if the Department were to publish a consistent and comprehensive set of official statistics on school funding, to which all participants in public debate could refer. A separate annex sets out further information on this recommendation.

We have also heard concerns about other statistics used by the Department’s Ministers. For example, we received a complaint about the replicability of music GCSE figures quoted by the Minister of State for School Standards in an oral evidence session in the House of Commons, and another about the use of statistics on improved outcomes for sponsored academies since 2010, which was based on analysis only up to the end of October 2017.

In light of these issues, we have discussed the importance of ensuring that statements are based on accurate, up-to-date analysis that can be verified through publicly available data and analysis. I welcome your team’s ongoing work to improve the transparency of statements in an efficient way. In addition, I would encourage you to focus not just on whether statements correctly quote the statistics, but also on whether, in context, the use being made of them is liable to mislead.

I have separately written to the Department’s Permanent Secretary, highlighting the importance of the development of a school funding official statistics publication.

I would be happy to discuss these recommendations further.

Yours sincerely

Ed Humpherson
Director General for Regulation


List of Annexes:

Annex – Official Statistics on School Funding (May 2019)


Related Links:

Ed Humpherson to Jonathan Slater (May 2019)

Sir David Norgrove to Damian Hinds MP (October 2018)

National Achievement Rate Tables (NARTs)

Dear Neil

NATIONAL ACHIEVEMENT RATE TABLES (NARTs)

I am writing to you following the FE Week article[1] published on 5th April 2019 – ‘Investigation by statistics watchdog after ESFA lists “unreliable” providers’. The article outlined the reporting of the 2017/18 National Achievement Rate Tables (NARTs)[2] and questioned the decision to include 34 providers with ‘unreliable data’ in the aggregate headline figures.

We previously wrote to you regarding these tables in 2017[3], when we undertook an investigation into the revised methodology. As a result, we recommended that you publish not only historical national figures but also individual provider figures for previous years to allow for comparison. I am pleased to see that these data have now been published[4] and that similar data will be available to users on an annual basis.

We have investigated the most recent concerns raised by FE Week and noted that the 34 providers were redacted from the formal performance tables due to quality issues arising from operational, rather than statistical, problems. We are aware that you work with the providers on an individual basis to try to resolve these issues.

Overall, we are content with the publication and with the approach taken to include all data in the aggregate headline position, particularly given that excluding these data risks over-estimating the national position. The publication of transparency tables helps users to understand the performance outcomes of providers with incomplete data. However, it would be helpful to give this more prominence in the headline publication and to explain the context around the reasons for incomplete individual learner records. In particular, the fact that these provider outcomes can be very unpredictable from year to year could be emphasised as one of the key limitations of the data.

We also recognise that the work being done on the Data Dissemination project[5] may ultimately improve these statistics by allowing users such as FE Week to customise the data.

Yours sincerely

Ed Humpherson
Director General for Regulation


[1] ‘Investigation by statistics watchdog after ESFA lists “unreliable” providers’, FE Week, 5 April 2019

[2] National achievement rates tables 2017 to 2018, Department for Education

[3] Letter – Ed Humpherson to Paul Offord (FE Week), 17 July 2017

[4] National Achievement Rates Tables: 2017/18, Department for Education

[5] DfE Data Dissemination Discovery Report, July 2018


Compliance check of Phonics Screening Check and Key Stage 1 Assessments in England

Dear Mike

Compliance check of Phonics screening check and key stage 1 assessments in England  

We recently carried out a compliance check against the Code of Practice for Statistics[1] on Phonics screening check and key stage 1 assessments in England[2]. I am pleased to confirm that these statistics can continue to be designated as National Statistics.

We met with your lead statistician, who was very open and engaged. She told us about forthcoming plans for the statistics and also about your organisation-wide project (DfE dissemination discovery project) to present all DfE’s statistics in a more innovative way, including a greater focus on interactive maps.

Value:

  • We were encouraged to hear about your plans to improve the statistics to give greater insight into attainment in the phonics check. For example, your plans to introduce time series back to 2012 for more measures will be particularly valuable for users, given that the time series for other key stage assessments only begin in 2016 due to changes in methodology.
  • Users have shown an interest in attainment in the phonics check by different characteristics over time. Disadvantaged children’s attainment is of particular interest, and it was positive to hear that the team plans to give this more focus in future publications.
  • We welcome the fact that, for the first time since the introduction of the phonics check in 2012, it has been possible to gauge the potential impact of teaching phonics on children as they move through school by comparing their phonics check results with their key stage 2 (KS2) reading attainment. The KS2 assessments release[3] included information on this and, as discussed with the lead statistician, we think it would also be useful to see this referenced in the phonics statistics.

Quality:

  • Some academies can submit data through a different local authority to the one in which the school is geographically based. This can lead to quality assurance issues, in particular when validating data against the appropriate school cohorts, as there is then a separation between the local authority responsible for monitoring the phonics checks and the local authority submitting the data. We would encourage your statisticians to work more closely with colleagues across local authorities and within the Standards and Testing Agency (STA) to ensure appropriate quality assurance can be carried out for all data. A good, ongoing working relationship between the teams would also ensure that the impact on the statistics of any other change to the format or running of the check could be considered well in advance.

Trustworthiness:

  • The expected standard threshold for the phonics check is not disclosed to schools in advance. However, it has remained the same since 2012, and the statistics draw attention to this point. The publication includes a graph showing the distribution of marks, which shows a noteworthy change in the distribution at the expected standard every year. Given that these statistics are not published at school level, it is not clear what the incentive would be for schools to unduly influence outcomes, but the distribution does raise questions about whether knowledge of the expected standard threshold can affect the results. We were reassured to learn that your statisticians had investigated this phenomenon and did not identify evidence of schools changing their behaviour to influence results. You continue to publish this graph and the expected standard threshold to enable users to understand the full distribution of marks. Education statistics producers elsewhere in the UK face similar issues and there might be value in sharing experiences with them.
  • We are content with your arrangements for regularly reviewing the pre-release access list, and that there is a clear process in place both for the maintenance of the current list and also for requests to be added to the list.

We look forward to hearing more about DfE plans on the changes in the dissemination of its statistics. We will follow up with the statistics team on the detail behind the points in this letter.

I am copying this letter to Rebekah Edgar, Acting Deputy Director, Education Data Division.

Yours sincerely

Mark Pont
Assessment Programme Lead


[1] https://osr.statisticsauthority.gov.uk/code-of-practice/

[2] https://www.gov.uk/government/statistics/phonics-screening-check-and-key-stage-1-assessments-england-2017

[3] https://www.gov.uk/government/statistics/key-stage-2-and-multi-academy-trust-performance-2018-revised