Ed Humpherson to Veronica Hawking: NHS waiting lists in England

Updated on 25 January 2024 to correct factual error regarding who publishes official statistics on NHS waiting lists in England: incorrectly listed as DHSC, changed to NHS England

Dear Veronica,

NHS waiting lists in England

Thank you for your letter regarding comments made by the Department of Health and Social Care (DHSC) on NHS waiting lists in England. I am replying as the Head of the Office for Statistics Regulation, on behalf of Sir Robert Chote.

As you note, in an article in the Times on 4 January 2024, a DHSC spokesperson said: “Cutting waiting lists is one of the government’s top five priorities, and despite the impact of strikes, we have cut the total waiting list and the number of individual patients waiting for treatment.”

NHS England publishes monthly official statistics on NHS waiting lists in England. This includes data on ‘incomplete pathways’, which are waiting times for patients still waiting to start treatment at the end of the month, and the number of unique patients on NHS waiting lists.

At the time of speaking, the available official statistics on NHS waiting lists in England showed the total number of incomplete pathways and unique patients on NHS waiting lists fell between September and October 2023. The latest data published on 11 January show that the total number of incomplete pathways and unique patients on NHS waiting lists also fell in November 2023.

It would better support understanding, and avoid potentially misleading people, if DHSC had been clear about the time period to which it was referring. This is particularly important because the fuller time series of waiting lists data, as shown in the chart below, shows an upward trend in waiting lists over the last year; the statement could therefore be perceived as selective use of data.


Consultant-led Referral to Treatment Waiting Times Data 2023-24, November 2023

The chart above shows the estimated number of incomplete pathways and the estimated number of unique patients waiting for consultant-led treatment with the NHS in England, from November 2022 to November 2023. The lines in the chart show a steady increase in both from November 2022 until reaching a peak in September 2023, and then falling.

Those speaking on behalf of Government should make it easy for people to understand and verify the data used to inform the public. Taking a transparent approach by default ensures that confidence in public messaging is not undermined.

Yours sincerely,

Ed Humpherson

Veronica Hawking to Sir Robert Chote: NHS waiting lists in England

Dear Sir Robert,

I am writing on behalf of 38 Degrees and our supporters to raise concerns about a potential misrepresentation of NHS waiting list data by the Department of Health and Social Care.

In a statement in today’s Times, a spokesperson for the Department of Health and Social Care said “cutting waiting lists is one of the government’s top five priorities, and despite the impact of strikes, we have cut the total waiting list and the number of individual patients waiting for treatment.” [1]

This statement is in response to an open letter published a year to the day after the Prime Minister made getting waiting times down one of his five priorities. At the time when the Prime Minister made that promise, 6.06 million patients were waiting for 7.20 million procedures. [2] In October 2023 – the latest data available – 6.44 million patients were waiting for 7.71 million procedures. [3]

We believe this spokesperson may be referring to the fact that between September and October 2023, the number of patients fell from 6.50 million to 6.44 million and the number of procedures fell from 7.77 million to 7.71 million. [4] However, we believe it is misrepresentative to claim that the government has cut waiting lists, especially considering that these figures are still higher than they were when the Prime Minister made his promise that they would fall, on 4 January 2023.

We raise this matter in light of other concerns raised regarding the accuracy of government statements, including the claim that ministers have “cleared” the asylum backlog, which you are currently investigating. [5]

We therefore ask that you investigate the Department of Health and Social Care’s statement to The Times and offer your guidance on whether it is misrepresentative.

We look forward to your response on this matter.

Yours sincerely,


Veronica Hawking,



[1] The Times, Thursday 4 January 2024, Page 6, “Celebrities urge PM to cut wait lists”
[2] NHS England: RTT Overview Timeseries Including Estimates for Missing Trusts Oct23
[3] Ibid
[4] Ibid
[5] The Standard: Statistics watchdog to scrutinise Sunak’s claim the asylum backlog has been ‘cleared’, Wednesday 3 January 2024


Ed Humpherson to Tony Dent: Household Costs Indices

Dear Tony,

Household Costs Indices

Thank you for your letter dated 14th December 2023, relating to the publication of the Household Costs Indices (HCIs), and more widely relating to the UK’s suite of inflation statistics. I hope you and your colleagues had a good festive break and I wish you all a Happy New Year for 2024.

It is good to hear of strong engagement with these statistics and that the new HCIs received a warm welcome at the Royal Statistical Society/Resolution Foundation event on 12th December.

You highlighted the demand for HCI statistics and asked when National Statistics status may be considered. Having delivered the first quarterly release of these statistics, ONS faces competing priorities and user needs. These include whether the new HCIs should be published monthly, the timing of the publication and when the statistics should be put forward for formal accreditation (previously referred to as National Statistics badging). My team is in ongoing discussions with the ONS Prices team about these developments, the competing priorities and suitable timing for ONS to seek accreditation for these statistics. We will ensure that we seek your views once we start the assessment to consider accreditation of the HCIs.

In response to your concerns that CPIH is compromised by inadequacies, we would note that there is ongoing debate about the optimal methodology for capturing housing costs, and ongoing work within the ONS to improve the data that is received to capture these costs as fully as possible, both for rents and for owner-occupied housing. From a regulatory perspective, we are aware that the ONS is keen to seek accredited official statistics status for the new private rental series and, by extension, the continued accredited statistics status of CPIH. With this in mind, alongside the ongoing debate about how best to capture housing costs, we plan to commence an assessment of the new Price Index of Private Rents and the owner occupiers’ housing element of the CPIH in the spring. We will seek your views as part of this assessment.

We are pleased to see the improvements that ONS is making to these statistics. We believe that, in publishing the impact analysis of including the new private rental series, ONS was transparent about the limitations of the previous data and methods, which were the best available at the time. The latest rents data are more comprehensive and detailed than the historic data and, combined with methodology changes, are enabling more robust estimates of rents to be produced.

Thank you again for your correspondence. We look forward to further discussions around UK official statistics in 2024, both relating to inflation and more generally.

Yours sincerely

Ed Humpherson

Ed Humpherson to Rob England: Withdrawn Asylum Applications 

Dear Mr England 

Withdrawn Asylum Applications

Thank you for contacting us. We have looked into your concern.   

The Home Office has confirmed that the total number of withdrawn asylum cases is correct but recognises that there may be an error where withdrawn asylum applications may not have been correctly classified into the two potential sub-classifications of withdrawal. The Home Office has included a note about this alongside the statistics while it continues to investigate the extent of the impact. While we recognise that this potential data quality issue may be disappointing to some users, and the Home Office could have made this issue more prominent for users, we are satisfied that the Home Office has plans in place to ensure this is resolved for future publications.   

We would also like to clarify the definitions of the two types of withdrawn asylum applications. An application is described as a ‘non-substantiated withdrawal’ if the applicant fails to cooperate with the process to examine and decide the asylum claim within a reasonable time period. This is the group referred to in the BBC article you shared with us and discussed at the Home Affairs Select Committee, and which has been described as ‘missing’. By contrast, where the applicant actively chooses to withdraw the application, this is classified as ‘other withdrawals’.  

The 17,316 figure is the total of both types of withdrawn asylum applications. This figure therefore includes those classified as other withdrawals. As these applications are withdrawn by the applicant, it would be incorrect to describe these applicants as missing because once their application has been withdrawn, they are expected to either leave the country or have an alternative agreement to stay. The error within the data, therefore, means that it is not at present publicly reported how many applicants would fall into the category described as ‘missing’ – that is, the number of applications that are classified as non-substantiated withdrawals.  

We will continue to engage with the Home Office to ensure this issue is resolved as soon as possible. 

Yours sincerely   

Ed Humpherson
Director General for Regulation 

Tony Dent to Ed Humpherson: Household Costs Indices

Dear Ed,

Following the publication of the quarterly HCIs on 5th December, I attended Tuesday’s informative discussion on those measures, hosted by the Royal Statistical Society in collaboration with the Resolution Foundation. The HCIs were almost universally welcomed by the various speakers, including Paul Lewis, the journalist and broadcaster, and Morgan Wild of Citizens Advice.

There is an evident demand for these statistics to be published monthly alongside the CPI, rather than quarterly as currently proposed. There is also interest in when they may be considered for National Statistics status. Although such issues are not under your direct control, Better Statistics would be interested to understand whether any discussions are in progress as to when such status might be considered. In our opinion the HCIs are a clear example of the better statistics we seek!

The position is somewhat compromised by the evident inadequacies of the CPIH, which is presently subject to considerable adjustment to correct for inaccurate rents data. This has served to re-ignite the controversy created by selecting ‘rental equivalence’ to measure inflation as experienced by owner occupiers. As evidenced by the recent errors, replacing the robust calculation of the costs actually experienced by owner occupiers with the less than robust rental costs of a fabricated ‘measure’ is not appropriate. I am sure you are aware that the recent amendments to CPIH have attracted widespread comment; for example, the Twitter account @TUCeconomics recently posted: “ONS ‘what if’ calculations suggest CPIH might have been 0.5 percentage points higher in recent months.” Further comments on this matter were recently posted by Shaun Richards on StatsUserNet – please see ‘The ONS View on owner-occupied housing is in disarray’.

We therefore echo Shaun Richards’ comment at the end of his post that, surely, the National Statistics status must now be removed from the CPIH? Do you have plans to reconsider that status?

In closing, I wish you and all at the OSR a Merry Christmas and Happy New Year. Whatever may happen as a result of the UKSA review, I hope we might have the opportunity for further discussion on the Code of Practice in the New Year. That is the key to ensuring badly needed improvement across all of our statistics.

With all good wishes,


Tony Dent,


Related Link:

Ed Humpherson to Tony Dent: Household Costs Indices (January 2024)

Maya Forstater to Ed Humpherson: ONS research into 2021 England and Wales Census data on Gender Identity

Dear Mr Humpherson

The ONS has released the final report on its investigation of the quality of the 2021 census’s gender-identity data together with a statement saying it has “confidence in the gender identity estimates at national level”. The report offers the following conclusions in support of this.

  • On collecting and processing the data: “This strand of our research … provided no evidence that the design of the question or the statistical processing of the collected data had an adverse effect on the quality of the published statistics”.
  • On respondent error and misunderstanding: “There are patterns in the data that are consistent with some respondents not interpreting the question as we had intended… However, for the reasons outlined in this report, we cannot say whether the census estimates are likely to be an overestimate or an underestimate of the true value, given other sources of uncertainty, not least the potential impact of question non-response. Therefore, the overall impact on the data of any misinterpretation of the question cannot be determined.”

We find that the ONS’s investigation has been inadequate and its conclusions are not supported for five reasons.

  1. The investigation does not address the core issue concerning consistency or not between an individual’s answers to the sex question and gender-identity question.
  2. Implausible explanations for answering ‘No’ to the question about whether sex and gender identity match are accepted ahead of more likely interpretations involving misunderstanding.
  3. Given the small numbers of transgender people and the likely level of misunderstanding the statement that the ONS has “confidence in the gender identity estimates at national level” is not supported.
  4. The classification of people by the terms “trans man” and “trans woman” by the ONS is not in line with the Census (England and Wales) Order 2020.
  5. The design of the question relies on the whole population making a declaration based on the idea of gender identity in order to estimate a minority that identify as transgender.

Furthermore the focus in the ONS statement on the numbers being plausible at the national level implies (but does not make explicit) that it is not confident they are plausible at the subnational level.

Attached to this letter are further details on each of these concerns.

Our conclusion is that data on gender identity (including the sex of the people identified as transgender) is not fit for purpose. It was driven by the adoption of concepts and questions promoted by lobby groups that seek to replace sex with gender identity.

The lack of clarity and certainty about the meaning and interpretation of the sex question and the gender identity question also has knock-on effects for interpreting how the sexual orientation question, the sex and the gender identity question interact.

This investigation by the ONS is inadequate and undermines confidence in national statistics. It lays the groundwork for further erosion of clarity on sex, and the wider adoption of a gender identity question, and use of associated data, that has been demonstrated to be unreliable.

The ONS has proved itself unwilling to accept clear indications that the gender-identity question produced unreliable answers, and has adopted a “self-identified” approach to the definition of sex even after the High Court judgment in 2021. We therefore have no confidence that the ONS will adequately address the need for clarity and accuracy about sex and transgender identity in its development of the harmonised questions on sex and gender identity.

We call on the Office for Statistics Regulation to take regulatory action in order to secure public confidence in national statistics, and to prevent the faulty question being replicated.

  • The question should be officially retired, with an apology and an explanation to discourage others from using the same wording.
  • A warning should be put on the data and it should not be designated as national statistics. The OSR must determine whether the national headline figure and the sub-national figures should have national statistics designation.
  • The ONS definition of “sex” for the 2021 census should be corrected to reflect the guidance given to respondents following the FPFW challenge, and the question of whether actual sex should be routinely and clearly collected for population demographics should be reviewed.

The ONS should also:

  • publish the free text answers broken down by sex
  • publish details on the number of people in the telephone survey who did not confirm their answers to the gender-identity question
  • analyse the pattern of households which they classify as having more than one trans member (which may indicate a misunderstanding of the question)
  • investigate whether it is possible to cross-check sex and self-identified gender for this section of the population with other data sources such as administrative data from the NHS and the birth and gender reassignment registers.

We would like to know whether the response from the ONS was seen or agreed by the OSR before the ONS sent it and whether it has been discussed by the UKSA board. Is the ONS report the view of the National Statistician? Has it been seen by the Methodological Assurance Review Panel or any other methodological group?

We are publishing this letter and we request that it and any response are also published on the UKSA website and linked to in a way that makes it findable to users of the gender-identity data.

Yours sincerely

Maya Forstater
Executive Director

Helen Joyce
Director of Advocacy

Michael Biggs
Advisory group member


Related Link:

Ed Humpherson to Maya Forstater: ONS research into 2021 England and Wales Census data on Gender Identity

Ed Humpherson to Maya Forstater: ONS research into 2021 England and Wales Census data on Gender Identity

Dear Maya

ONS research into 2021 England and Wales Census data on Gender Identity

Thank you for your letter regarding the research on the quality of gender identity data collected in the 2021 England and Wales Census published by the Office for National Statistics (ONS) and for setting out your views of this research.

As you know, the Office for Statistics Regulation (OSR) is carrying out a review of these statistics and we published an interim report in October 2023. Our interim report makes clear that we plan to review and engage with users on ONS’s research before publishing a final report. We will consider the points set out in your letter and look forward to engaging with you to inform our final report.

You asked whether ONS’s research was seen or agreed by OSR before it was published. It would not be appropriate for OSR to agree or approve research carried out by statistics producers, particularly research which forms part of a live review. ONS did share a late version of the summary report with us, and we responded by confirming our expectations that ONS should reach a clear view of the performance of the Census question. However, we did not approve or sign off the final version of the ONS report in any way. Our forthcoming report will address to what extent we consider the published research meets our expectations in line with the Code of Practice for Statistics.

More information about our work in this space can be found on our website.

Yours sincerely

Ed Humpherson
Director General for Regulation

Ed Humpherson to Stephanie Howarth: Welsh Government, 20 mph speed limit

Dear Stephanie,

Improving transparency of Welsh Government’s 20-mph speed limit data

I am writing today to outline my concerns with the transparency of sources supporting claims on the implementation of the Welsh Government’s 20 mph speed limit, particularly in a promotional leaflet sent to households in Wales.

While the claims in the promotional material cited the sources for the figures quoted, I consider that improvements could have been made to aid transparency and better support users.

The source used for the claim that ‘most journeys will be around one minute longer’ was an Explanatory Memorandum to the Restricted Roads (20 mph Speed Limit) (Wales) Order 2022, from June 2022. This document was a Regulatory Impact Assessment: it set out the likely economic costs and benefits of the then proposed legislation, the rationale for the proposal and the objectives of the policy, and detailed how the policy would be monitored. The document itself did include some high-level information on how the increase in overall journey times was derived, and the uncertainty and limitations of the estimate were acknowledged. I consider, however, that it would be challenging for a reader to unpick this detailed document to find and understand the data and calculations used to support this claim, and that it would not be reasonable to expect people who read the leaflet to do so.

In the course of engaging with your team on this matter, they helpfully shared a technical note on the journey time calculations within the Regulatory Impact Assessment. This more clearly sets out how the claim was derived, in a way that would allow scrutiny from readers, but the technical note was not publicly available for citizens to access.

In short, I do not consider that the presentation of this analysis in the leaflet fully aligns with our expectations for Intelligent Transparency.

I do recognise that other, more detailed data are available for people to understand the impact on journey times, through the monitoring report from phase 1 of the 20mph trial, and that Transport for Wales will be developing its plans for future reporting, including an interim monitoring report expected in June 2024. I also understand that you now intend to publish the technical note, which I welcome.

To better support users in future, I recommend that:

  • As part of planned monitoring, the Welsh Government and Transport for Wales should continue to review the evidence on journey times and cite the most appropriate evidence source when communicating the impact on journey times.
  • Analyst teams in the Welsh Government should continue to support and work with communications and policy colleagues in the preparation and presentation of data and statistics.
  • Transport for Wales, in conjunction with Welsh Government, should consider how it can make improvements to data and statistics on the implementation of the 20mph speed limit, including considering the accessibility of data and supporting information. Transport for Wales should seek feedback from users as it develops its new outputs.

Thank you for engaging with us on this matter.

I am copying this letter to Geoff Ogden, Transport for Wales.

Yours sincerely,

Ed Humpherson
Director General for Regulation

Ed Humpherson to Philipp Tanzer: Sex difference in online harassment

Dear Philipp,

Differences in online harassment by sex

Thank you for contacting us with your concerns about the claim that “women are 27 times more likely than men to be harassed online”. As you pointed out, this statistic has been used several times in the House of Commons and House of Lords in discussions about the Online Safety Bill.

The claim is based on an external research report rather than official statistics. While this falls outside our formal remit, we have investigated the claim because, as a general principle, we consider that high profile numerical statements should be supported by sound evidence and clearly identified sources.

The original source of the statistic is a 2015 UN Broadband Commission report, Cyber Violence Against Women and Girls: A Worldwide Wake-Up Call. However, the source most commonly cited is a 2017 report and resource pack, Her Net Her Rights, produced by the European Women’s Lobby, which references the UN report.

We were not able to properly scrutinise the statistic because it is our understanding that the UN report was withdrawn by the UN Broadband Commission shortly after publication, due to concerns about the quality of the analysis in the report. The UN Broadband Commission replaced the report with a short executive summary with the key findings of the research, but this does not contain the ‘27 times’ claim.

Sex and gender based differences in online harassment are well-researched topics. A range of other data sources are available which can be scrutinised. In addition to the three research studies that you highlighted, we found two studies that are more specific to the UK context: Ofcom’s annual Online Nation report, which contains the results of its Online Experiences Tracker survey, based on a nationally representative sample; and the Victim Commissioner’s 2022 The Impact of Online Abuse: Hearing the Victims’ Voice report, which contains the results of a survey commissioned to contribute to the development of the Online Safety Bill, based on a self-selecting sample.

Yours sincerely,

Ed Humpherson
Director General for Regulation

Ed Humpherson to Max Hill: CPS Statistics

Dear Max

Statistics on Rape and Sexual Offences

At the Office for Statistics Regulation, our role is to investigate any concerns about the quality, good practice, and comprehensiveness of official statistics. While the Crown Prosecution Service (CPS) is not an official statistics producer, it publishes quarterly bulletins of data tables and summaries of main trends as part of its ongoing commitment to transparency on prosecution performance.

We have been contacted about CPS’s statistics on Rape and Sexual Offences in its quarterly data summaries. The complainant highlighted three issues regarding how the data are collected, presented and explained by CPS.

The first issue is that it is not clear in the main section of the bulletin exactly what is being counted when it refers to rape cases; this is only explained towards the end of the publication. The caveats and limitations of the data are not prominent, which does not support appropriate use of the data. To enhance clarity, CPS could include a definition above the table that breaks down the rape data and outcomes, explaining that the figures refer to rape-flagged cases.

The second issue is that CPS is publishing management information on conviction rates that differs from the conviction rates data published by the Ministry of Justice (MoJ). The discrepancy is due to a difference in methodologies: MoJ counts the number of prosecutions by offence/principal offence, whereas CPS counts the number of cases with a rape flag. To support understanding, we would encourage CPS to give greater prominence in the quarterly releases to the statement that explains that these data are sourced from management information and signposts readers to official statistics on criminal justice outcomes published by MoJ and the Home Office.

The third issue raised with us is that CPS’s case flagging system may lead to inflated figures for the number of charges and convictions for rape. This is because where a defendant is charged with rape and sexual assault, but the rape charges are then dropped or the defendant is convicted only of sexual assault, they are still included in the charges and conviction data for rape.

Analysts in CPS informed us that the system was not designed to capture this information but recognised that it was a limitation of the data. The purpose of the flagging system is to track the whole lifecycle of cases so that victims in cases such as rape, and other flagged cases, are fully supported throughout. This methodological limitation is crucial for interpreting the end data and must be made clearer to users through a prominent caveat alongside the data.

It is never our aim to inhibit transparency and we welcome the fact that CPS is putting these data into the public domain to inform public debate. However, the distinction between official statistics and other data may seem artificial to many users of these data. Releasing data without sufficient context and clarity may limit the extent to which they can usefully inform public debate and does not meet our expectations for intelligent transparency.

We therefore strongly recommend that CPS takes steps to voluntarily apply the Code of Practice for Statistics. This would demonstrate to users that CPS is committed to the highest principles and practices of trustworthiness, quality and value embedded in the Code.


Yours sincerely


Ed Humpherson
Director General for Regulation