Ed Humpherson to Maya Esslemont and Anna Powell-Smith: Modern slavery data

Dear Maya and Anna,

Modern slavery data

Thank you for your letter dated 14 June and for sharing the new report published by After Exploitation titled ‘A can of worms: Challenges and opportunities in gathering modern slavery evidence’. Your letter and report highlighted concerns raised by practitioners regarding the National Referral Mechanism (NRM) Statistics and the wider state of modern slavery data.

We are grateful for you taking the time to meet with us to discuss the findings of this report. In the interests of transparency, we have set out the steps we have taken in response to your concerns.

Statisticians in the Home Office are working to address the quality and value points raised in our compliance check of the NRM and Duty to Notify statistics, and we will continue to support them in doing so. To inform this work, we have shared your report and highlighted the feedback about what practitioners want from the NRM statistics. We have also passed on your contact details so that you are included in their wider user engagement aimed at improving the value of the NRM statistics.

We recognise that you have previously raised concerns about the Home Office’s use and release of data and statistics on modern slavery. We have encouraged statisticians in the Home Office to engage closely with policy and communications colleagues to ensure that public statements are clear on whether they are sourced from official statistics or other forms of evidence.

We will continue to monitor the use of NRM statistics in public discourse and will intervene where the statistics have been used in a way that undermines confidence in the official statistics.

Yours sincerely,

Ed Humpherson


Related link:

December 2022 – Ed Humpherson to Maya Esslemont and Anna Powell-Smith: Modern slavery data

Ed Humpherson to Jonathan Marron and Jeanelle de Gruchy: ‘The economic and social cost of harms associated with gambling in England’ (2023)

Dear Jonathan and Jeanelle,

OSR response to the OHID report ‘The economic and social cost of harms associated with gambling in England’ (2023).

I am writing to you following our recent engagement regarding a case raised with the Office for Statistics Regulation (OSR) about the 2023 report ‘The economic and social cost of harms associated with gambling in England’ by the Office for Health Improvement and Disparities (OHID). This report updates an earlier version published by Public Health England (PHE) in 2021.

As the regulator for official statistics, we ensure that official statistics are produced with trustworthiness, quality and value, in accordance with the Code of Practice for Statistics. While the report draws on some official statistics, the research and data it presents do not themselves constitute official statistics. Our findings and recommendations are therefore provided on an informal, advisory basis, focusing on the importance of the clarity and transparency of this information.

We are grateful for the time taken by your team to meet with us to discuss the concerns raised and opportunities to enhance clarity and understanding in future reports. In the interest of transparency, we have set out the recommendations we made to OHID in this letter and the actions your team have committed to take in response to our findings.

Given the importance and sensitivity of this subject, which has seen limited research, we commend the efforts made in producing this original report. It offers valuable insights into the economic and social costs of gambling-related harms, significantly contributes to addressing gaps in data and knowledge and reflects OHID’s commitment to advancing understanding in this complex field. We also recognise the report’s transparency in outlining methodological decisions and limitations. Given the lack of available data on gambling-related suicides at the time of publication, OHID’s engagement with experts and inclusion of a range of estimates in the 2023 update demonstrates a thoughtful approach to handling uncertainty.

For the value of the data to be enhanced, we recommended additional clarity around the strength of the evidence used to estimate the costs. For example, including more detailed information in Table 1 (page 7) about the number of studies supporting each sub-domain’s estimate, and indicating the level of confidence in these studies, would help users better assess and understand the data. We appreciate that OHID has been receptive to this feedback and has agreed to make these improvements in future updates.

We also recommended that OHID is more mindful of the language used in subsequent updates. Clear and careful phrasing will help ensure that the findings are communicated sensitively. With these adjustments, we believe the report will provide even greater value to its readers.

We appreciate that you have been open to challenge and have shown a commitment to assessing and updating this report, revisiting the data to ensure the robustness of the methodology and calculations. This reflects a commendable approach to enhancing public understanding and the value of this research.

In this spirit, we note that you were approached by a member of an interest group in September 2023 and committed to addressing their feedback. However, a response has not yet been provided. We would encourage you to maintain transparency and timeliness in your communications, as this is essential for building trust and demonstrating your commitment to ongoing dialogue and improvement. Promptly addressing this matter will further enhance the credibility of your work and will prevent any doubt being cast on the actions you are taking.

We recognise your efforts to use statistics with full context and clarity to bolster public trust, and we are eager to assist you in furthering this commitment through the implementation of our recommendations.

 

Yours sincerely,

Ed Humpherson

Jeremy Balfour MSP to Office for Statistics Regulation: Transport Scotland and Sustrans Walking and Cycling Index

Dear OSR,

Transport Scotland and Sustrans Walking & Cycling Index

For a few years, constituents have raised concerns with me about the Sustrans Walking and Cycling Index (formerly known as Bike Life). In Scotland, the city reports and the aggregated report for the whole of Scotland appear to be partly or fully funded by Transport Scotland.

A constituent has written to me saying they have raised some of their own concerns about the Index directly with the OSR. The constituent has provided positive feedback about the promptness and helpfulness of OSR’s response and is reassured that the OSR is informally looking into the Walking and Cycling Index.

However, the issues are so significant that I feel it necessary to add further concerns and follow up directly with the OSR.

There are multiple concerns; as examples, three themes that keep coming up are:

    1. Data presentation: For example, the use of text and graphic design to present comparable stats for levels of walking and cycling in a manner that creates the impression that levels of cycling are proportionately greater than they are, which can misinform funding priorities.
    2. Omission of balancing data: For example, modelling is used to quantify the health benefits of cycling in preventing serious long-term health conditions and early deaths, and then to quantify the saving to the NHS. However, this is not balanced by official data on actual injuries and fatalities from cycling, and modelling of the related cost to the NHS. To pick one example, government figures show 255 people were killed or seriously injured while cycling due to “poor or defective road surface” between 2017 and 2022, and this has happened recently in the Lothian region. Nor does it consider the health implications of family and professional carers finding it harder to visit vulnerable people due to new traffic restrictions. By omitting this balancing data, the Index misinforms policy makers and elected representatives on how best to allocate budgets to maximise health benefits and minimise negative impacts. In Edinburgh, for example, spend on new cycle lanes has dwarfed spend on surface maintenance, and in the foreword to the Edinburgh Index, the Transport and Environment Convener uses the report as a platform to justify more of the same approach.
    3. Selective use of questions and case studies: This has the impact of creating a misleading narrative about impacts for disabled people. Active travel is important for some people living with disabilities, and more needs to be done to improve the accessibility of pavements, crossings and the like. However, in Scotland Sustrans is in an unusually strong position, with the ability to control funding to councils for active travel schemes. Sustrans has been central to allocating funds for schemes that, even if they sometimes make life easier in some locations for some disabled people, often reduce car access and parking to a point that has serious negative impacts for others, including carers. This research does not include questions that allow disabled people to voice their experiences of these negative impacts, and it appears that, as with all other case studies, case studies featuring disabled people generally reinforce Sustrans’ preferred narrative around active travel. A constituent has shared examples of active travel campaign groups being invited to submit case studies and participate in photo shoots for the report, and of Sustrans staff or prominent activists being featured.

The data from these reports is used in papers and consultations in a way that embeds the statistics within policy making and decisions to allocate nine-figure budgets in Scotland. Various organisations in the disability sector have raised concerns about projects funded by these budgets, including at COP26 in Scotland, where the issue of “ecoableism” was covered in relation to some active travel schemes having negative impacts on disabled people.

Sustrans is a registered lobbyist with the Scottish Parliament, with over 100 lobbying events recorded over the last six years. It therefore seems inappropriate that Transport Scotland would fund such high-profile research through a lobby group.

I have also just learned from a constituent that Sustrans is included in legislation as a provider of Official Statistics to the Scottish Government, which magnifies my concern.

The Code of Practice for Statistics sets the standards that producers of Official Statistics should meet to give confidence that published government statistics have public value, are high quality, and are produced by people and organisations that are trustworthy.

Therefore, I would be grateful to have further information on:

  1. What action the OSR is taking with Transport Scotland and Sustrans around the Walking and Cycling Index 2023 which was launched a couple of months ago and will be regularly used as a reference by policy makers for the next two years?
  2. What action will be taken for the next wave of research in 2025 for publication in 2026?
  3. What the OSR’s position is on registered active lobby groups also being providers of Official Statistics and whether this aligns with the Code of Practice aim that organisations producing Official Statistics are viewed as trustworthy?
  4. Is this situation of a lobby group also being a provider of Official Statistics unique to Scotland?

I would welcome your response.

Kind regards,

Jeremy R Balfour MSP

Ed Humpherson to Mr Jeremy Balfour MSP: Sustrans’ Walking and Cycling Index

Dear Mr Balfour,

Thank you for your correspondence regarding the Walking and Cycling Index produced by Sustrans. You note concerns have been raised with you by a constituent who has also raised their concerns directly with the Office for Statistics Regulation.

Below I hope to address the questions you posed.

Is this situation of a lobby group also being a provider of Official Statistics unique to Scotland?

Sustrans is a registered charity and one of a few charities that appear on the Official Statistics Order.

Organisations that have specific business or policy agendas can be included on the Order – other examples include commercial organisations such as the Student Loans Company and Skills for Care, and other interest groups such as Meat Promotion Wales and Skills Development Scotland. We recognise that many charities undertake lobbying activities as part of their work, but we do not consider that their organisational objectives (to promote a particular policy perspective) mean that their statistics cannot be produced in line with the Code of Practice for Statistics. It is compliance with the Code which matters, to which I now turn.

What the OSR’s position is on registered active lobby groups also being providers of Official Statistics and whether this aligns with the Code of Practice aim that organisations producing Official Statistics are viewed as trustworthy?

We expect all organisations who produce official statistics to act in a way which demonstrates their trustworthiness in line with the Trustworthiness pillar of the Code. At an organisational level, this requires that the organisation act with integrity in the handling of data and is transparent about its plans and decision making in relation to its statistics. Complying with this pillar of the Code will ensure that there is a clear distinction between the professional autonomy and judgement of the statistical professionals in the organisation, and the organisation’s overall purpose and interests. This distinction between professional judgement and overall purpose is particularly important for Government statistics, where statisticians work in organisations headed by Ministers, who have clear and stated policy objectives. Statisticians must be able to produce data and statistics free from the particular interests of the Government of the day.

In this sense, our view is that compliance with the Code should insulate the statistical outputs from the wider interests of the organisation.

In addition, being on the Order enables us as the regulator to take greater action to ensure official statistics are produced in line with the Code. We would not have the same leverage otherwise.

For all these reasons, OSR does not have a position on the type of organisation that can, or cannot, produce official statistics.

What action the OSR is taking with Transport Scotland and Sustrans around the Walking and Cycling Index 2023 which was launched a couple of months ago and will be regularly used as a reference by policy makers for the next two years?

The crucial question, then, is compliance with the Code of Practice.

Sustrans is listed as a producer of official statistics. However, only statistics produced by Sustrans on behalf of the Scottish Ministers are specified as official statistics. The Walking and Cycling Index therefore does not constitute official statistics, but it is published as a statistical publication and is a high-profile output intended to inform the public. To support the public good, we believe that all statistics benefit from being produced in line with our Code of Practice for Statistics. While Sustrans follows good practice in many areas, we consider that there are improvements which could be made to its statistical outputs.

As with all statistical outputs, hearing feedback from users of the data is an important part of how producer organisations improve their statistics. We have raised the concerns regarding the presentation of data with Sustrans, which has agreed to act on this feedback in future publications. Regarding the survey design, Sustrans works with methodologists at NatCen (the survey supplier) when developing or revising survey questions, with questionnaires being finalised after consultation with survey suppliers and the participating city partners. We consider that Sustrans’ approach to question design could be more fully documented and made accessible alongside the reports, so that any user of the statistics could better understand its processes and decision making.

For impact data, which are derived from models, Sustrans provides information on the models used in its Data sources and methodologies document which signposts to further details, including to a user guide on the Sport England MOVES tool used to estimate savings to the NHS. There are opportunities for Sustrans to enhance the methodology information by setting out why it has chosen the methods it has and by including any caveats directly within the report.

We have shared these identified improvements with Sustrans and Transport Scotland. We have also recommended that Transport Scotland have greater involvement in the production of these statistics going forward, to help support better understanding and application of the Code principles.

What action will be taken for the next wave of research in 2025 for publication in 2026?

I cannot be specific in advance on what action, if any, might be taken for the next wave of research in 2025. However, we have recommended to Sustrans that it publish a development plan so that this will be available to users of the research.

We will continue to engage with Sustrans and Transport Scotland over the coming months on their plans for future reporting.

If you would like further clarity on OSR’s position regarding official statistics production and our remit as the statistics regulator, do please contact my private office.

I am copying this letter to Sustrans and Transport Scotland to place our analysis and recommendations formally on the record.

Yours sincerely

Ed Humpherson

Sir Robert Chote to Rt Hon Richard Holden: Party spending claims

This letter was sent from Sir Robert Chote, Chair of the UK Statistics Authority, and is also available on the Authority website.

Dear Mr Holden,

Thank you for your letter of 6 June regarding the Labour Party’s analysis of Conservative Party commitments and its own plans.

In my recent letter to political parties, I asked that parties and candidates use statistics appropriately and transparently during this general election campaign and made clear why these expectations are in the interests of the public and those campaigning. These expectations were echoed in a statement published by the Office for Statistics Regulation regarding the claim that “a Labour government would mean £2,000 of tax rises per working household”. Many of the principles set out in that statement apply also to the claim you highlight, that the Conservatives have “£71 billion of unfunded spending plans”.

This figure derives from Labour’s 25 May document, Conservatives’ Interest Rate Rise, which sets out their costings of nine future ‘policy decisions’ and refers to roughly £71 billion of net extra spending in fiscal year 2029-30. In another document, Tory Manifesto Costings, published on 13 June, the Labour Party claimed that Conservative manifesto plans would amount to net extra spending of roughly £71 billion over the next five fiscal years combined and “raise people’s mortgages by £4,800” cumulatively over that period.

Future costings are always subject to uncertainty and dependent on choice of methodology. To help people understand the assumptions that have gone into costing models, it is essential that the underlying calculations, data sources and context are provided alongside the figures. When distilling these claims into a single number, there should be enough context to allow the average person to understand what it means and how significant it is. Omitting this information can damage trust in the data and the claims that these data inform.

To safeguard trust in official statistics, we encourage political parties to present statistical claims clearly and transparently, so that the public can test the arguments and descriptive statements that candidates make about them.

Yours sincerely,
Sir Robert Chote

NEU to Ed Humpherson: Comparisons of the proportion of good and outstanding schools over time

Dear Mr Humpherson,

I am writing because I am concerned about the misuse of Ofsted statistics by the Department for Education and by the Secretary of State for Education Gillian Keegan, both in a press release and on social media.

In a press release titled “First step towards introducing the Advanced British Standard” and in a post on Gillian Keegan’s account on X (formerly Twitter), the DfE and Gillian Keegan claimed “89% of schools being rated good or outstanding by Ofsted, up from just 68% in 2010” and “89% of schools are now rated good or outstanding, up from just 68% under Labour”, respectively.

This statistic is misleading, as it suggests to the reader that school-level Ofsted ratings are comparable between these two time periods.  Due to changes in the school inspection framework, there is a substantial difference in the meaning of a ‘good or outstanding’ school between these periods.  Suggesting that these statistics are comparable violates several principles of the Code of Practice for Statistics, including T3.8, which requires that policy, press, and ministerial statements meet basic professional standards of statistical presentation, and several points of the quality and value pillars, including Q1.4, Q1.5, Q1.7, Q3.3, and V3.2, which concern the comparability of statistics.

The statistics which these statements refer to were reported by Ofsted in the annual reports for 2010 and 2023 and refer to the proportion of schools rated good or outstanding at their last inspection as of August 31st of each year.  Between 2010 and 2023 there have been several changes to the inspection framework; the most notable was in September 2012 under the coalition government.  This introduced the current system of judgements where schools are rated from 1 to 4, with the grades corresponding to ‘Outstanding’, ‘Good’, ‘Requires Improvement’, and ‘Inadequate’.  Prior to this change, the grades corresponded to ‘Outstanding’, ‘Good’, ‘Satisfactory’, and ‘Inadequate’.

This change led to a redefinition of the second and third categories; while the second category retained the ‘Good’ descriptor, the actual criteria to meet this category changed due to the reclassification of ‘Satisfactory’ to ‘Requires Improvement’.

The impact of these changes can be demonstrated by comparing Ofsted grade descriptions for inspections from January 2012 to those for inspections from September 2012.  Prior to the framework change, the 3 (satisfactory) grade descriptor for overall effectiveness included details on several areas of school performance and was similar in length to the descriptors for good and outstanding schools.  It read, in part: “pupils and groups of pupils have a generally positive experience at school and are not disadvantaged as they move on to the next stage of their education, training, or employment”.

The September 2012 description of a grade 3 “requires improvement” school was simply: “The school requires improvement because one or more of the four key judgements requires improvement (grade 3) and/or there are weaknesses in the overall provision for the pupils’ spiritual, moral, social, and cultural development”.  Within the individual key judgements, the ‘requires improvement’ grade description is that a given area “requires improvement as it is not good”, but does not provide further detail, with the one exception in leadership and management:  “Leadership and/or management require improvement because they are not good but are demonstrating the capacity to secure improvement in the school”.

The good, outstanding, and inadequate descriptions are more detailed.  In some cases, the description for a good grade in September 2012 overlaps with the description for a satisfactory grade in the January 2012 document.  In at least one instance standards for a September 2012 good school are lower than those of a January 2012 satisfactory school.  In the September 2012 document a good grade for pupil attainment requires that “Taking account of their different starting points, the proportions of pupils making and exceeding expected progress compare favourably with national figures.  Where the proportion making expected progress overall is lower than that found nationally, it is improving over a sustained period”, while in January 2012, even a satisfactory grade required that “Pupils are progressing at least as well as all pupils nationally given their starting points.”  Similarly, in January, a satisfactory description read that “In exceptional circumstances, where attainment, including attainment in reading in primary schools, is low overall, it is improving over a sustained period”, while a good description in September 2012 made no mention of this being exceptional and simply read “Where attainment, including attainment in reading in primary schools, is low overall, it is improving at a faster rate than nationally, over a sustained period”.  Obviously, in this case, elements of the previous good and satisfactory descriptors have been combined to create a new, broader good category, while the new requires improvement category is simply characterised as lacking good traits.  This leads to the classification of schools that would be graded satisfactory under the previous framework to be graded good under the new framework, inflating the proportion of good schools.

This is evidenced by the graph below, which was compiled from two sources of Ofsted data on the proportion of schools at each grade level at their most recent inspection as of 31 August of a given year.

 

The increase in the proportion of schools rated good or outstanding between 2010 and 2023 is primarily due to an increase in the proportion of schools rated good and a decrease in the proportion rated requires improvement between 2012 and 2016, corresponding to the adoption of ‘requires improvement’ in place of ‘satisfactory’ as the grade 3 descriptor for overall effectiveness.  This shift took several years to become fully evident, as not all schools are inspected each year; depending on their previous grading and a risk assessment, they are generally reinspected every 2-5 years.  Once most schools had been inspected under the new framework, around 2016, the proportion of schools rated good or outstanding remained relatively stable.  Throughout the period of interest, the proportion of schools judged inadequate remains virtually unchanged at 2–4%.  The proportion of schools judged outstanding also remains largely stable at about 20%, although in recent years this proportion has fallen to 16% after the policy of exempting outstanding schools from reinspection was ended.  This has corresponded with a slight increase in the proportion of schools graded good, but little impact is evident in the proportion of schools rated requires improvement or inadequate.

This provides additional support for the argument that the increase in the proportion of ‘good or outstanding schools’ is because the meaning of ‘good’ changed in 2012, rather than because of “our fantastic teachers and the evidence-based reforms we’ve taken”, as Gillian Keegan claims in her post on X.  She goes on to claim that the Conservative government is “delivering high and rising standards in our schools”.  This evidence suggests that, in the case of ‘good’ schools, the government has in fact lowered, not raised, standards, thus artificially inflating the proportion of schools rated good or outstanding.  I therefore request that you investigate these statements and provide guidance on whether they are misleading, and whether Ofsted statistics of this sort can be compared over time.

Yours sincerely

General Secretary

National Education Union

Ed Humpherson to NEU: Comparisons of the proportion of good and outstanding schools over time

Dear Daniel,

Comparisons of the proportion of good and outstanding schools over time 

Thank you for your letter of 18 April regarding the comparison of the proportion of good and outstanding schools over time by the Department for Education (DfE). You raised several points about the comparison between 2023 and 2010, and whether this should be considered misleading.

We agree that there are a number of factors that have likely influenced the number of good and outstanding schools over time. These include changes made to the inspection frameworks as outlined in your letter, as well as other factors such as the impact of Covid-19 and the exemption of outstanding schools from inspections between 2012 and 2020.  

It is our view that the proportion of good and outstanding schools can be a useful indicator of school performance over time. However, as with any high-level comparison, the nuanced nature of any changes may not be fully reflected when it is used across an extended time series.

We have reviewed the information provided in Ofsted’s methodology information for state-funded schools inspections and outcomes. This document notes that “The overall effectiveness judgement has remained broadly comparable across the different frameworks since 2005” but also that “A range of factors affect both how and when we inspect and limit the comparability of inspection outcomes over time”. It is our view that these messages could appear to be conflicting to users. 

We welcome the updated line Ofsted has now published in the methodology information and hope that this will support future communications of these figures when comparisons are made over long periods. If you have any additional concerns or if you would like to discuss this matter further, then please contact us.  

Yours sincerely  

 

Ed Humpherson
Director General, OSR 

 

Ed Humpherson to Phillip Oppenheim: The UK’s standing in the tech and life sectors.

Dear Phillip,

Thank you for raising your concerns around statements made by the Chancellor of the Exchequer, Jeremy Hunt MP, on the UK’s standing in the tech and life sectors, as well as economic growth.

The Tech sector now sits with the Department for Science, Innovation and Technology (DSIT), which informed us that the press release was produced by Burlington, a PR agency, in collaboration with the Digital Economy Council (DEC) and Dealroom. The underlying data can be found on Dealroom via this link, based on the “Ecosystem value” column. We consider that the specification of the claim could have been made clearer, as market valuation is not necessarily synonymous with economic activity, yet the two have been used interchangeably by the Chancellor.

HM Treasury (HMT) has explained to us that the Chancellor’s claim that the UK has the “best Life Sciences sector in Europe” was based on the equity finance raised by the life science industry, in which the UK was third behind the USA and China. The underlying data can be found in the Life Science Competitiveness Indicators (LSCI) 2022 data table, in Tab 27. However, we note that the UK’s performance varies on other indicators, and it does not appear to be the best in Europe across all categories. Again, we consider that the statement would benefit from being clear on the specific measure being used to make the claim.

You also raised concerns around the Chancellor’s claim that the UK was “the fastest-growing large European country, not just since the pandemic but since Brexit and since 2010”. This statement was unclear, containing ambiguity around ‘large’, the time references, and the data source. However, assuming ‘large’ refers to G7 membership, Brexit refers to the year 2016, and the pandemic refers to the year 2020, the claim can be verified using Organisation for Economic Co-operation and Development (OECD) GDP data. Please note that the OECD’s data are provided as GDP levels, given in constant prices, OECD base year (2015).

           Since 2010   Since Brexit (2016)   Since COVID (2020)
France     13.8%        7.0%                  9.0%
Germany    17.7%        5.9%                  5.0%
Italy      3.2%         5.4%                  12.3%
UK         20.9%        7.6%                  13.4%
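For readers checking the figures, each entry in the table is a simple percentage change in constant-price GDP levels between the base year and the latest year. A minimal sketch of the calculation (the index levels used here are illustrative, not the OECD values):

```python
def growth_since(base_level: float, latest_level: float) -> float:
    """Percentage change in GDP between a base year and the latest year."""
    return (latest_level / base_level - 1) * 100

# Illustrative constant-price GDP index levels (base year = 100);
# these numbers are for demonstration only, not OECD figures.
print(f"{growth_since(100.0, 120.9):.1f}%")  # prints 20.9%
```

The same calculation applied to each country and base year yields the percentages in the table above.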

HMT informed us that claims on international comparisons are usually based on quarterly GDP data from the relevant national statistical institute for each country; these were subsequently referenced in the Spring Budget 2024 Data Sources document.

Transparency and clarity support public confidence in statistics and minimise the risk of misinterpretation of statistics and data. We are continuing to work with HMT to ensure future communications are clear and comply with our expectations of intelligent transparency.

Once again, thank you for contacting the Office for Statistics Regulation.

Yours sincerely, 

Ed Humpherson
Director General for Regulation


Related Links:

Phillip Oppenheim to Ed Humpherson: The UK’s standing in the tech and life sectors. (January 2024)

Phillip Oppenheim to Ed Humpherson: The UK’s standing in the tech and life sectors.

Dear Sir/Madam

I would appreciate you checking a claim made frequently by government ministers and the Chancellor about Britain’s status in tech and life sciences

I attach my research which is part of a book in preparation – I think that this definitely proves that the release sent out by a government department and the claims repeated by the Chancellor for nearly a year are false

I have written to the Chancellor twice since March last year, once recorded, with no response – my local MP promised to chase twice, but so far, nothing

Many thanks

Phillip Oppenheim


Dear Sir/Madam

I have sent you an email re a claim about UK tech which I would appreciate your checking

The second issue I would appreciate your checking is below

Claim by Chancellor Jeremy Hunt: “If you look at the fundamentals of the British economy we have had our setbacks, like everyone else, but we are the fastest-growing large European country, not just since the pandemic but since Brexit and since 2010”

The Times, Jeremy Hunt: ‘Britain doesn’t need to be famous for pessimism’, September 19th, 2023, repeated frequently since

– Taking 2010 as the base, the year the Conservatives came to power, the UK’s economic growth was 21.4%, marginally ahead of Germany at 20.3%, well ahead of France at 16.7% and Italy at just 4%

– But since the 2016 Brexit referendum, the UK has grown by 7.3%, less than France at 7.8% but more than Germany at 5% and Italy at 5.2%, so Hunt is not correct.

– Britain’s growth from the pandemic to Q3 2023 was just 1.4% – better than Germany’s 0.3% but worse than France’s 1.7% or Italy’s 3.4%. So again, Hunt is incorrect – Britain has not been the fastest growing large European economy since the pandemic, despite having the worst performance during the pandemic, when GDP fell by 11% compared to 9% in Italy, the next worst performer. (In fact, since the pandemic, Britain’s economic growth has been the second worst in the G7 group of large economies – well below Japan (2.4%), the eurozone (3%), the EU (3.4%), the G7 as a whole (4.7%), the OECD (6%) and the United States (7.4%).)

I look forward to your response

Phillip Oppenheim


Related Links:

Ed Humpherson to Phillip Oppenheim: The UK’s standing in the tech and life sectors. (April 2024)

Ed Humpherson to Veronica Hawking: NHS waiting lists in England

Updated on 25 January 2024 to correct factual error regarding who publishes official statistics on NHS waiting lists in England: incorrectly listed as DHSC, changed to NHS England

Dear Veronica,

NHS waiting lists in England

Thank you for your letter regarding comments made by the Department for Health and Social Care (DHSC) on NHS waiting lists in England. I am replying as the Head of the Office for Statistics Regulation, on behalf of Sir Robert Chote.

As you note, in an article in the Times on 4 January 2024, a DHSC spokesperson said: “Cutting waiting lists is one of the government’s top five priorities, and despite the impact of strikes, we have cut the total waiting list and the number of individual patients waiting for treatment.”

NHS England publishes monthly official statistics on NHS waiting lists in England. This includes data on ‘incomplete pathways’, which are waiting times for patients still waiting to start treatment at the end of the month, and the number of unique patients on NHS waiting lists.

At the time of speaking, the available official statistics on NHS waiting lists in England showed the total number of incomplete pathways and unique patients on NHS waiting lists fell between September and October 2023. The latest data published on 11 January show that the total number of incomplete pathways and unique patients on NHS waiting lists also fell in November 2023.

It would have better supported understanding if DHSC had been clear about the time period it was referring to, to avoid potentially misleading people. This is particularly important because the fuller time series of waiting lists data, as shown in the chart below, shows an upward trend in waiting lists over the last year; the statement could therefore be perceived as a selective use of data.


Consultant-led Referral to Treatment Waiting Times Data 2023-24, November 2023

The chart above shows the estimated number of incomplete pathways and the estimated number of unique patients waiting for consultant-led treatment with the NHS in England, from November 2022 to November 2023. The lines in the chart show a steady increase in both from November 2022 until reaching a peak in September 2023, and then falling.

Those speaking on behalf of the Government should make it easy for people to understand and verify the data used to inform the public. Taking a transparent approach by default ensures that confidence in public messaging is not undermined.

Yours sincerely,

Ed Humpherson