The quality of police recorded crime statistics for England and Wales

Published:
16 May 2024
Last updated:
23 May 2024

What we found

Police forces

Police forces have made significant improvements to crime recording, but there are common challenges to ensuring the quality of recorded crime data

Through our discussions with police forces and HMICFRS, we identified several common themes and features of good crime recording. We also gained insight into the barriers and challenges to recording crime accurately and consistently.


Police forces are recording crime more accurately now than in 2014

His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS) has a statutory responsibility to inspect the effectiveness, efficiency and legitimacy of police forces. HMICFRS has carried out regular inspections of the ‘crime data integrity’ of each police force in England and Wales since 2014. It introduced a rolling programme of crime data integrity inspections following a 2014 inspection of all police forces that found substantial under-reporting of crime.

HMICFRS inspects the crime data integrity of each police force roughly every three to five years. The inspections are widely seen as a measure of the extent to which forces are complying with Home Office Counting Rules.

HMICFRS examines whether the crimes reported to the police are recorded when they should be (the ‘crime recording accuracy’). To measure crime recording accuracy, HMICFRS takes a sample of incidents, based on the opening codes where they would expect to find a crime, to check if a crime has been recorded. It does this for three offence groups – violence against the person, sexual offences, and all other offences (excluding fraud, as this is recorded only by the City of London Police). It weights the results and calculates the recording accuracy for each offence group as well as the recording accuracy for all crimes (excluding fraud).

HMICFRS then gives a graded judgement on crime data integrity based on the force’s crime recording accuracy data and several other criteria, including other dip samples of crimes, the timeliness of crime recording and how forces manage and oversee crime recording. This is reported under the ‘Recording data about crime’ area in inspection reports.

HMICFRS’s inspections show that crime recording accuracy nationally has improved in the last ten years. In 2014, HMICFRS estimated that 80.5% (± 2.0%) of all crimes (excluding fraud) that were reported to the police in England and Wales were being recorded. As summarised in HMICFRS’s 2023 Police performance: Getting a grip report, by the end of its 2021 to 2022 inspection programme, HMICFRS estimated that crime recording accuracy had improved to 92.4% (± 0.3%) for all crime (excluding fraud), a statistically significant change.

The picture is more mixed when it comes to individual police forces, with some forces recording crime more accurately than others. Table 2 shows the crime recording accuracy figures and the integrity grading of crime data for the 23 forces assessed by HMICFRS in its 2021 to 2022 inspection programme (which ran from 2021 to March 2023). Most forces (17 out of 23) were found to record more than 90% of all reported crime, with six forces recording more than 95% of all reported crime.

HMICFRS’s inspections of crime data integrity are the best available source of information on how well forces record crime. However, as they are based on a subset of crimes recorded in the most recent three months and take place every three to five years, they only provide a snapshot of crime recording accuracy at a single point in time.

 

Table 2. Crime recording accuracy figures from HMICFRS’s 2021 to 2022 inspection programme

| Police force | Crime recording accuracy (%) | 'Recording data about crime' grading |
| --- | --- | --- |
| Avon and Somerset Police | 91.4 | Requires improvement |
| Cambridgeshire Constabulary | 93.2 | Good |
| Cheshire Constabulary | 93.1 | Adequate |
| Cleveland Police | 96.4 | Good |
| Derbyshire Constabulary | 92.5 | Requires improvement |
| Devon and Cornwall Police | 84.0 | Inadequate |
| Dyfed-Powys Police | 91.6 | Adequate |
| Gloucestershire Constabulary | 86.0 | Inadequate |
| Greater Manchester Police | 90.6 | Adequate |
| Hampshire and Isle of Wight Constabulary | 96.7 | Good |
| Humberside Police | 92.5 | Adequate |
| Kent Police | 96.7 | Outstanding |
| Leicestershire Police | 95.5 | Outstanding |
| Lincolnshire Police | 88.2 | Requires improvement |
| Merseyside Police | 93.8 | Good |
| Metropolitan Police Service | 91.7 | Adequate |
| Northumbria Police | 92.6 | Adequate |
| Nottinghamshire Police | 86.4 | Requires improvement |
| South Yorkshire Police | 96.4 | Good |
| Staffordshire Police | 88.4 | Requires improvement |
| Sussex Police | 85.6 | Inadequate |
| Thames Valley Police | 94.9 | Good |
| West Midlands Police | 95.5 | Good |

Source: HMICFRS inspection reports.

Notes: Not all police forces underwent a crime data integrity inspection in the 2021 to 2022 inspection programme. The ‘Recording data about crime’ grading is based on crime recording accuracy data and a range of other criteria, including other dip samples of crimes, the timeliness of crime recording, and how forces manage and oversee crime recording.

 

Individual police forces have improved their crime recording accuracy to varying extents. Some forces, such as Kent Police and West Yorkshire Police, have maintained high crime recording standards in recent years, whereas others have not sustained improvements over time. In certain forces, crime recording standards have declined. For example, Sussex Police and Devon and Cornwall Police had overall crime recording accuracies of 94.6% and 93.4% in 2016 and 2017, respectively, but these dropped to 85.6% and 84.0% in the 2021 to 2022 inspection programme. Due to changes to the Home Office Counting Rules in 2023, it is not possible to compare findings from HMICFRS’s 2023 to 2025 inspection programme and later programmes with earlier inspection findings.

Given this variation in crime data integrity between forces and over time, HMICFRS has said, in its methodology documentation for its crime data integrity inspections, that it will continue to inspect forces ‘to ensure standards are maintained and victims receive the service they deserve’. HMICFRS plays a vital role in holding forces to account for their crime recording standards, and we consider it is essential that it continues to carry out regular audits of crime data integrity.

The reintroduction of regular external auditing of crime recording in 2014 has been a key driver of improvements to crime data integrity. These improvements contributed to the rise in the number of reported crimes recorded by the police between 2015 and 2020.

The nine police forces that we spoke to confirmed that HMICFRS inspections of crime data integrity had been instrumental in bringing about improvements to their crime recording standards. In some cases, it has led to systemic change; for instance, several forces told us they overhauled their crime recording processes and practices following an inspection. Table 3 shows how crime recording accuracy has changed over time for those nine forces.

 

Table 3. Crime recording accuracy figures from HMICFRS’s inspections for the nine police forces we spoke to

| Police force | Crime recording accuracy (%), 2016 to 2020 inspections | Crime recording accuracy (%), 2021 to 2022 inspections | % point change | Statistically significant change |
| --- | --- | --- | --- | --- |
| Cleveland Police | 83.4 (±1.9) | 96.4 (±2.0) | 13.0 | Yes |
| Dyfed-Powys Police | 87.8 (±1.7) | 91.6 (±2.7) | 3.8 | No |
| Essex Police | 95.8 (±1.5) | – | – | – |
| Gloucestershire Constabulary | 81.6 (±1.9) | 86.0 (±2.8) | 4.4 | No |
| Greater Manchester Police | 85.5 (±1.9) | 90.6 (±2.8) | 5.1 | Yes |
| Hampshire and Isle of Wight Constabulary | 91.3 (±1.4) | 96.7 (±1.5) | 5.4 | Yes |
| Kent Police | 83.6 (±1.9) | 96.7 (±1.9) | 13.1 | Yes |
| Lancashire Constabulary | 84.3 (±1.9) | 93.3 (±1.5) | – | – |
| Metropolitan Police Service | 89.5 (±1.6) | 91.7 (±2.4) | 2.2 | No |

Notes:

  • Crime recording accuracy: figures in brackets are 95% confidence intervals from HMICFRS inspections.
  • Statistically significant change: to determine whether a change was statistically significant at the 95% level, we checked for overlapping confidence intervals. If the confidence intervals of the two crime recording accuracy estimates do not overlap, we assumed that the change is statistically significant.
  • Essex Police: HMICFRS did not inspect Essex Police’s crime data integrity in the 2021 to 2022 inspection programme.
  • Lancashire Constabulary: HMICFRS did not inspect Lancashire Constabulary’s crime data integrity in the 2021 to 2022 inspection programme. The higher figure comes from a re-inspection in May 2019 and is included to show the force’s improvement in crime recording accuracy.
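The confidence-interval overlap rule described in the notes above can be sketched in a few lines of Python. This is an illustrative check only (the function names are our own), using figures from HMICFRS's inspections of two of the forces:

```python
def interval(estimate, margin):
    """Return the (lower, upper) bounds of a 95% confidence interval,
    given a point estimate and its margin of error (the ± figure)."""
    return (estimate - margin, estimate + margin)

def significant_change(before, before_moe, after, after_moe):
    """Treat a change as statistically significant (at the 95% level)
    only if the two confidence intervals do not overlap."""
    lo1, hi1 = interval(before, before_moe)
    lo2, hi2 = interval(after, after_moe)
    return hi1 < lo2 or hi2 < lo1

# Cleveland Police: 83.4 (±1.9) -> 96.4 (±2.0)
print(significant_change(83.4, 1.9, 96.4, 2.0))  # True: intervals do not overlap

# Dyfed-Powys Police: 87.8 (±1.7) -> 91.6 (±2.7)
print(significant_change(87.8, 1.7, 91.6, 2.7))  # False: intervals overlap
```

Note that this rule is conservative: non-overlapping intervals do imply a statistically significant difference, but overlapping intervals do not necessarily rule one out.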


Key features of good crime recording

Among the forces we spoke to, those that have overhauled their crime recording processes have significantly improved their crime recording accuracy and other aspects of crime data integrity. Our findings may not be generalisable to all police forces, as they are based on a sample of nine forces, but we think that they give a good indication of what is working well. This review has given us greater confidence in the quality of the underlying recorded crime data overall.

Strong data leadership and governance

We found that there has been a positive shift in the culture around crime recording in police forces since 2014. Forces now appear to take crime data integrity very seriously and are more committed to ensuring that they meet the national standards of crime recording. The data culture in a police force is set by the senior leadership, in particular the chief constable, who plays a critical role in driving improvements. Our conversations with chief constables highlighted that many forces are starting to treat their data as a strategic asset. Chief constables, force crime registrars and other police staff that we spoke to strongly emphasised the importance of accurate and reliable recorded crime data for effective police operations and delivering a good service to victims. Police forces use recorded crime data to allocate resources, monitor and improve performance, and analyse patterns of crime.

In general, the police forces that are recording crime well have clear data governance arrangements in place. Most forces we spoke to now have a crime data standards board or crime data integrity board, with a representative deputy chief constable, assistant chief constable or superintendent. The boards enable oversight and scrutiny of changes to the Counting Rules and data quality issues and provide a channel for escalating concerns to senior officers. They have been effective in driving quality improvements, such as changing recording processes to improve the recording of certain crime outcomes. Senior officer representation on the boards supports cultural change; it highlights to officers and staff that accurate crime recording is a priority for the force.

We also found that good data leadership and governance minimises the risk of manipulation or gaming of recorded crime data. Police forces are now much stricter about having a clear separation between crime recording teams and performance teams. One force described this separation as a ‘sterile corridor’ between the two areas. As highlighted in the PASC Inquiry’s report, police forces used to be more target-driven, and crime recording was sometimes linked to performance targets. This created perverse incentives for officers to record fewer crimes. Having an independent force crime registrar who sits outside the performance monitoring process increases the integrity of the recorded crime data.

Investment in people, processes and systems

Police forces told us that investing in the people, processes and systems for recording crime is critical to ensuring compliance with the Home Office Counting Rules and generating high-quality recorded crime data.

One key aspect of this investment is training and guidance. Many forces that we spoke to have strengthened their training programme and developed new guidance on crime recording for police officers and staff to build knowledge and tackle poor recording practice. Such training has been effective in addressing issues with the recording of certain crime types, such as burglary, theft from the person and domestic abuse offences. Regular training and clear guidance support more-accurate and more-consistent crime recording.

One impactful way to improve crime recording accuracy is to adopt a ‘centralised’ model of crime recording. This is where a dedicated crime management unit within the force is responsible for carrying out data quality checks and finalising crime records. This model is seen as best practice by many of the forces that we spoke to. For example, one force told us it recently moved to centralised crime recording because it saw that it was working well in other forces. Several forces explained that they used to record crime centrally but moved away from this model due to budgetary constraints; they have since re-introduced it or would like to re-introduce it.

Centralised crime recording has several advantages. Police forces with crime management units have greater control over data quality than police forces that rely on frontline officers to enter data. There is a strong focus on ‘getting the data right the first time’, so that fewer data quality checks need to be carried out on the information that is entered on the system. Centralisation minimises differences in the interpretation of the Counting Rules within the force. A dedicated team of tens or hundreds of experienced crime recorders (depending on the size of the force) is more likely to apply the rules consistently than thousands of frontline officers who are less familiar with the rules and have a range of other responsibilities. Centralisation can also have operational benefits. For instance, it can support forces to provide a better service to victims and conduct investigations to a higher standard.

However, centralisation is not the only model for recording crime, and it is not the only factor that determines whether forces record crime accurately and consistently. For example, the Metropolitan Police has had a crime management unit for many years, but its crime data integrity has not improved recently; it is currently rated as ‘Adequate’ by HMICFRS. Even with a crime management unit, police forces may not be able to validate all crime records due to the volume of crimes recorded and resource constraints. Some police forces have introduced ‘point of call’ crime record validation, where a crime is recorded from an incident as close as possible to the call to the force, but they told us that this requires significant resource.


Case studies of improvements to crime recording

The case studies below outline the steps that three forces took to improve their crime recording standards.

Gloucestershire Constabulary

Until recently, Gloucestershire Constabulary had poor crime recording standards. The force was consistently rated as Inadequate by HMICFRS, with a 2021 inspection estimating that crime recording accuracy was 86.0%. A 2023 inspection estimated that the force now has a crime recording accuracy of 97.6%, although the figure is not directly comparable with earlier estimates due to the 2023 changes to the Home Office Counting Rules.

The force told us that it put in place a range of measures to improve its crime recording standards. One of the most effective changes has been the significant increase in the size of its crime management unit. The force developed the unit into a Crime Standards Bureau, which allows it to record crimes as close as possible to the ‘point of call’. The force delivered extensive training to new staff recruited to the Crime Standards Bureau and developed a rolling programme of refresher training events.

The force currently has a team of two auditors, who provide an additional layer of oversight and stress testing, including on crime recording, and is training a further three members of staff to become qualified auditors. The auditors are situated within the force’s Governance and Compliance department, which is independent from the rest of the organisation, in line with National Police Chiefs’ Council best practice.

Cleveland Police

Cleveland Police is currently rated as Good for crime recording by HMICFRS. The most recent inspection of crime data integrity, carried out for the 2021 to 2022 inspection programme, estimated that crime recording accuracy was 96.4%. This was a significant improvement from a 2018 inspection, which estimated that crime recording accuracy was 83.4%.

To achieve this improvement, the force completely overhauled its crime recording teams, processes, governance and training. It told us that, once it had a process and management structure in place, crime recording standards started to improve substantially.

The force established a ‘Gold’ governance group that is attended by senior officers. This group evolved into a crime governance group, which meets bimonthly, and a tactical group, which meets monthly. Crime data integrity is addressed at every crime governance group meeting, and there is a clear channel to escalate crime recording issues. The force developed a performance dashboard to monitor inspection compliance and findings from internal crime record audits carried out by the force crime registrar. As part of the organisational restructuring, a new set of standard operating procedures for crime recording was developed by a consultancy. The force has also reintroduced a crime management unit.

Greater Manchester Police

In December 2020, Greater Manchester Police was placed into ‘special measures’ by HMICFRS, following an inspection that identified a range of concerns, including poor crime recording. The inspection found that crime recording accuracy was just 77.7%. Since then, the force has significantly improved its crime recording standards. An inspection carried out in 2021 to 2022 found that crime recording accuracy had increased to 90.6%. As a result of this and other improvements, HMICFRS removed the force from special measures in October 2022.

HMICFRS found that the force has improved the oversight and scrutiny of its crime recording processes. Strong leadership and governance arrangements played a key role in this.

The force strengthened its governance of crime recording by establishing a crime data integrity board, which is chaired by an assistant chief constable. It meets monthly to discuss all aspects of crime data integrity and compliance with the Home Office Counting Rules and National Crime Recording Standard. The force also set up a data standards board, which is chaired by a superintendent. This board looks holistically at data quality issues across the force, including for recorded crime data.

The force has invested significant resource in additional data quality checks and has rolled out additional training for staff on crime recording standards and processes. In August 2023, the force established a crime management unit, which will likely further improve crime recording standards.

One of the main barriers to further improvement is the force’s crime recording IT system, which the force told us does not currently meet its recording needs.


There are common challenges to ensuring the quality of recorded crime data

Through our engagement and desk research, we identified some common challenges to further improving crime recording and the quality of recorded crime data, in particular to achieve consistency of data across police forces.

These challenges can be attributed to the complexity of police crime recording in England and Wales – there are 44 police forces that potentially manage their crime recording in different ways. The quality of police recorded crime data is influenced by many factors, including the decisions made by police officers and staff when recording a crime; changes to the Home Office guidance on when a crime should be recorded; the different systems and versions of IT systems used to record crime; and the extent of quality assurance applied to the data.

These factors may interact differently across police forces and over time. Police force-level quality issues may, depending on the extent of differences between police forces, impact the overall police recorded crime data across England and Wales. For instance, in the year ending December 2023, the crimes recorded by the Metropolitan Police accounted for around one-sixth of all crimes (excluding fraud and computer misuse) recorded by the police in England and Wales. Any quality issues in the Metropolitan Police’s data will have a disproportionate influence on the national statistics.

Due to the nature of the police recorded crime data – they are administrative data primarily collected for operational purposes – it is impossible to eliminate all data inaccuracies and inconsistencies across police forces, but there are ways in which they are being minimised. We give some examples below.

Differences in the interpretation of the Counting Rules

As with any set of rules, there can be a degree of subjectivity in interpreting the Counting Rules. The introduction of the National Crime Recording Standard in 2002 has led to more-consistent crime recording across police forces, but there remain differences in the interpretation of the Counting Rules both within and across police forces.

Most of the forces that we spoke to told us that they see the complexity of the Counting Rules as the main cause of differences in interpretation. HMICFRS said it agreed that there are grey areas in the rules that may affect how crimes are recorded. The Counting Rules are a balancing act – they must be simple enough to be understood by police officers and staff, but not so simple that they reduce the level of quality assurance that forces apply to the data, which affects data quality.

Differences in the interpretation of the Counting Rules can vary by crime type and are often due to a lack of understanding of the offence. Police forces and other stakeholders gave several examples of this:

  • Domestic abuse offences are vulnerable to differences in interpretation. Several forces told us that not all police officers understand how to apply the domestic abuse ‘flag’ to a crime record. Flags are added manually to a crime record to add context about the crimes. Other flags, such as those for online crime, are also often applied inconsistently within and across police forces.
  • There are multiple categories of stalking offences, which have different maximum sentence lengths (six months or ten years). If a police force records the incorrect crime, it is allocated to the wrong officer; for offences with the longer sentence length, the expectation is that a detective investigates. This is the subject of a current super-complaint from the Suzy Lamplugh Trust.

As highlighted earlier, several police forces have rolled out training to improve the understanding of the Counting Rules and recording of specific crime types, including domestic abuse-related offences. Having centralised crime management can also support more-consistent crime recording, as fewer staff members need to know the rules in depth.

Minimising differences in the interpretation of the rules across police forces can be challenging. One effective approach is to share knowledge and learning. All force crime registrars are part of a regional group, which acts as a forum for discussing interpretations of the Counting Rules. The regional groups are proactive in trying to ensure that the rules are applied consistently. One such discussion we were told about was on malicious communications offences. A regional group discussed the lack of a legal definition of ‘grossly offensive’ language and agreed on a common approach for all police forces in the group.

Force crime registrars can also ask the National Crime Registrar (a Home Office employee) for advice. All queries and answers about the Counting Rules are posted on an online knowledge sharing platform hosted by the Police Digital Service, and regional force crime registrar representatives are alerted when a query is posted.

Changes to the Counting Rules

The Counting Rules are regularly reviewed and updated. In 2023, they underwent a major review as part of a wider review of police productivity in England and Wales.

The review led to some changes. The main change was a reversal of a previous change made in 2017. Before that change, only the most serious crime was recorded for incidents that involved more than one type of crime. This was called the principal crime rule. The 2017 change mandated that, where a crime of stalking or harassment was disclosed, it should be recorded in addition to the most serious crime. The 2023 change reintroduced the principal crime rule for all offences, except for modern slavery offences and passport application fraud, which are still exempt. The police still investigate all offences involved in an incident, but not all offences are recorded.

The review also led to changes to how burglary offences and public order offences are recorded. In particular, Section 5 (Public Order Act 1986) offences are now no longer notifiable. The second phase of the review, currently underway, is looking at the framework used to record the outcomes of crime investigations.

We asked police forces how these changes have affected crime recording practices. We found that, overall, the changes have been well received by chief constables, frontline staff and the majority of force crime registrars, who see them as sensible and long overdue. All forces that we spoke to said they had been well informed about the proposed changes and had opportunities to contribute to their development, for example, through attending the regional force crime registrar meetings. The National Crime Registrar is continuing to work with police forces to ensure that the changes have no unintended consequences.

Some police forces that we spoke to did highlight concerns about the impact of the 2023 changes on data quality. For example, one force mentioned that the nuance of domestic abuse offending patterns may be lost, as now only one crime must be recorded per incident. The Domestic Abuse Commissioner has expressed similar concerns publicly.

Changes to the Counting Rules are an inevitable aspect of crime recording in England and Wales. The Home Office updates the rules to ensure that they remain relevant and fit for purpose. It told us that many of the changes made in recent years sought to improve the utility of the data or reflect changes in legislation (by incorporating new offences).

Changes do not always support more-accurate crime recording. We were told that the principal crime rule change in 2017 led to over-recording of stalking and harassment offences as the incident rule was difficult to manage, and this distorted the level of offending. The reversal of this change should support more-accurate data on the volume of these offences handled by the police. The Home Office, HMICFRS and police forces told us that the reversal has led to a decrease in the number of malicious communications offences linked to harassment and stalking recorded by police forces.

One consequence of regular changes to the Counting Rules is that it makes it difficult to determine whether a change in the number of crimes recorded by the police is genuine or whether it is due to a change in crime recording practices. This issue affects everyone who uses police recorded crime data: it makes performance monitoring more difficult for police forces; it complicates explaining trends in police recorded crime statistics for ONS and the Home Office; and it hinders HMICFRS’s ability to monitor improvements, as it is no longer possible to directly compare the findings from audits under the old rules and those carried out under the 2023 rules. In other words, regular changes to the Counting Rules reduce the value of the police recorded crime data for a range of users.

Continued improvements to crime recording processes and practices since 2014, variation in police activity and increased confidence among victims to report crimes to the police add additional uncertainty to interpretation of police recorded crime trends. The Crime Survey for England and Wales estimates are not subject to such changes over time, which is why they are a more reliable measure of long-term trends in crime.

It is good that ONS published a summary of the 2023 changes to the Counting Rules in the Crime in England and Wales statistical bulletin. This information could be expanded to help users understand the full set of changes to police crime recording processes. ONS and the Home Office should continue to work together to monitor and explain the impact of the Counting Rules changes on the statistics to users.

Crime over-recording

Several stakeholders that we spoke to, including some police forces and HMICFRS, explained that the pressure to secure a positive outcome in inspections has led some police forces to adopt an approach of ‘better record a crime in case HMICFRS fails us’. This risk-averse behaviour can lead to over-recording of crime. For instance, one force said it started ‘recording everything’ after receiving a poor crime data integrity grading. This substantially increased the force’s crime recording accuracy, but also led the force to record crimes that were later disproven and had to be cancelled on the crime recording system. One of the National Crime Recording Standard’s basic principles is that, once a crime has been recorded by the police, it will remain as a crime unless there is ‘additional verifiable information’ that proves it did not happen. We were told that over-recording is more common for some crime types than others, such as stalking and harassment, domestic abuse and malicious communications.

Strict adherence to the Counting Rules can make a force crime registrar reluctant to authorise the cancellation of crimes, and this can also lead to over-recording of crime. This has become more common for rape cases in recent years, as outlined in the Operation Soteria Year One report. Operation Soteria is a collaborative programme bringing together police forces with academics and policy leads to use evidence and new insight to enable forces to transform their response to rape and serious sexual offences.

Operation Soteria examined the quality of police recorded rape data. The Year One report explains that HMICFRS found that police forces were under-recording sexual offences, including rape, and cancelling crimes that should not have been cancelled. Now, police forces are recording rape more accurately and consistently, but they are also more cautious. Police staff members often do not request the cancellation of a rape because the process is perceived to be too onerous. Furthermore, the force crime registrar may not authorise the cancellation because the bar to cancel a crime is so high. Several police forces that we spoke to gave examples of rape offences that they had recorded but that they thought had not occurred and were not cancelled. The 2023 review of the Counting Rules changed the nature of ‘additional verifiable information’ that is needed for a force crime registrar to cancel rape offences, which should minimise over-recording.

It is difficult to quantify the scale of over-recording. HMICFRS examines over-recording in its crime data integrity inspections and shares its findings with police forces, but these findings do not influence the overall grading. Forces are only penalised for under-recording. HMICFRS told us that it estimates that several forces are currently over-recording crime in some way.

Variation in crime recording IT systems

Police forces are responsible for procuring their own crime recording IT system. We found that variation in crime recording IT systems is a barrier to standardising crime recording practices and improving data quality across police forces in England and Wales.

There are currently seven different crime recording IT systems in use across police forces in England and Wales, with Niche (26 forces), Athena (nine forces) and Connect (four forces) being the most common. Even if police forces are using the same system, they may be running a different version of the system. We were told that at one point there were over ten different versions of Niche in use. Different crime recording systems may work in different ways, and police forces may have customised their system according to their specific operational needs. For example, police forces have their own set of crime opening codes, which are used to indicate the nature of the incident, and different systems may have different ways of applying flags to crime records.

We found that each system has specific data issues and challenges. Several forces we talked to are, or were, recording crimes in systems that are old and expensive to upgrade. For example, the Metropolitan Police explained that it had been using its CRIS system since the 1990s and only rolled out a new system (Connect) in 2023. The age of the system can affect data quality. Another force told us that its old crime recording IT system was a major factor in its historical inaccurate crime recording.

Police forces highlighted some data challenges with more-modern crime recording systems. For instance, Athena tends to auto-complete certain fields with incorrect information and Connect is not an intuitive system to use. One force that had recently transitioned to Niche said that it had been having issues with mapping data onto the new system.

To add to this complexity, police forces regularly change their crime recording systems. This can have a negative impact on data quality while staff adjust to the new system. Occasionally, changing systems can cause significant disruption in sending data to the Home Office. For example, after implementing a new crime recording IT system, Greater Manchester Police was unable to submit police crime data to the Home Office for the period July 2019 to March 2020. To address this problem, the force told us it had to implement costly and labour-intensive workarounds. Similarly, Devon and Cornwall Police was unable to submit data to the Home Office for the period October 2022 to December 2023 because it changed its crime recording IT system. Ultimately, these disruptions impact the value of the statistics – missing data mean that the statistics are not providing a full picture of crime recorded by the police in England and Wales.

Systems can also limit opportunities for data quality improvements. For instance, some forces have not rolled out certain National Data Quality Improvement Service (NDQIS) tools for flagged data collections due to challenges of integrating the tools with their crime recording IT system.

Some forces have control over their own data systems and can make changes to their crime recording IT system relatively quickly, whereas other forces rely on external suppliers to make the changes for them, which can take longer and be costlier. One way in which forces are minimising this risk is by working together to manage the IT supplier. For instance, the chief constables of all nine forces using Athena are managing the IT supplier as a group. This coordination ensures that the forces receive the same upgrades and supports more-consistent crime recording.

However, we found that the sharing of knowledge about crime recording IT systems between forces could be improved. For example, we were told about one force that was unwilling to share a coding script for extracting data from Niche with another force that had recently moved to Niche. This meant the force had to commit resource to developing the script from scratch. This duplication of work could have easily been avoided if the forces involved were more open to sharing and learning from each other.

We are not confident that the Home Office understands the strengths and limitations of the different crime recording IT systems in use by police forces in England and Wales, or how variation in those systems may be impacting the quality of police recorded crime data. To strengthen its oversight of police force data quality, the Home Office should work with police forces to gain this understanding.

Recommendation 1:

To promote more-consistent and more-efficient use of crime recording IT systems, police forces should work more collaboratively and improve knowledge sharing about systems.

Recommendation 2:

To strengthen its oversight of police force data quality, the Home Office should work with police forces to gain an understanding of the strengths and limitations of the different crime recording IT systems, and how variation in systems impacts data quality.

Variation in quality assurance arrangements

While we did not review in depth each force’s quality assurance process for recorded crime data, we found that the stages are broadly similar. Usually, they include:

  • Automated checks of information entered by police officers on mobile data terminals (for forces that use these).
  • Manual reviews of crime records carried out by a dedicated crime management unit or data quality team before the crime record is finalised.
  • Audits of crime records carried out by the force’s crime registrar to monitor crime recording accuracy and compliance with the Home Office Counting Rules and the National Crime Recording Standard.

In general, the internal audits follow a similar format to HMICFRS’s inspections of crime data integrity. Forces take separate samples of violence against the person offences, sexual offences and all other offences. Crime registrars also carry out regular thematic audits of specific crime types such as theft against the person offences or audits of flagged data collections, such as domestic abuse-flagged crimes. These audits provide vital information on crime recording accuracy and can highlight areas of poor recording practice.

To ensure that police forces are carrying out internal audits in a consistent way and to a high standard, the Home Office developed a nationally recognised Data Quality Assurance Manual. The manual sets out a minimum standard framework that outlines the processes and activities that forces should have in place, or consider implementing, to improve data quality to be in line with the National Crime Recording Standard and the Home Office Counting Rules.

The Data Quality Assurance Manual recognises that there may not be a ‘one size fits all’ approach to quality assurance, and that processes may be tailored to forces’ different needs and local improvement activity. The manual encourages police forces to take a risk-based, proportionate approach to auditing recorded crime data to ensure that quality assurance and audit activity are focused on those areas with the greatest concerns or risks to quality. We support this proportionate approach to quality assurance.

The target audience of the Data Quality Assurance Manual is force crime registrars. Most force crime registrars that we spoke to said that they find the manual helpful and consult it regularly, especially for planning their crime record audit programme. The manual is currently being reviewed by the Home Office to ensure the information is up to date and fit for purpose.

We found that the standard of quality assurance applied at the first two stages (automated checks and manual reviews of crime records) is more variable across police forces. It is up to forces to decide the extent to which they check and validate crime data.

The quality assurance arrangements in individual forces are usually shaped by operational priorities and resourcing. One force told us it is not possible to conduct continuous quality assurance on crime data as they are entered onto the system, so it must choose where to focus its efforts. We heard an example of two police forces, using the same crime recording IT system, that have adopted a different approach to quality assuring flagged data collections; one force carries out regular checks on the flags that have been applied, while the other does not. Differences in the crime recording IT system may also influence the level of quality assurance that is applied, for example, if more automated checks can be built into some systems. Often, police forces must pay IT suppliers extra to build in additional checks.

Many police forces are starting to invest in data professionals. For instance, several police forces now have a Chief Data Officer. These individuals have a wider remit than the force crime registrar and focus on wider data quality issues in the force, including those affecting recorded crime data. We welcome the strengthening of data quality capabilities by police forces.

To support the consistency of quality assurance arrangements across police forces, we encourage police forces to improve knowledge sharing on quality assurance, to learn from each other and promote best practice.

Recommendation 3:

To promote best practice around quality assurance of recorded crime data, police forces should improve knowledge sharing on the checking and validation of crime records.

Automated data quality checking and data cleaning

An increasing number of police forces, including a couple of forces that we spoke to, have introduced ‘robotic process automation’ tools as part of their quality assurance process to enhance recorded crime data quality. For example, some forces use a tool which identifies crime records that need to be manually reviewed and checks whether crime outcomes have been applied correctly. These tools can also check for missing information and fill in the gaps if these data are available in other systems. They are being applied to a wide range of policing datasets, including incident data and personnel data.
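The kind of automated check described above can be illustrated with a minimal sketch. The record structures, field names and outcome codes below are hypothetical assumptions for illustration only; real force systems and rulesets differ.

```python
# Hypothetical record structures; real force systems differ.
crime_record = {"crime_ref": "CR123", "offence": "burglary",
                "victim_dob": None, "outcome_code": "15"}
incident_record = {"incident_ref": "INC456", "crime_ref": "CR123",
                   "victim_dob": "1985-03-02"}

VALID_OUTCOME_CODES = {"1", "15", "16", "18"}  # illustrative subset only

def quality_check(crime: dict, incident: dict) -> list:
    """Flag issues and fill gaps from a linked incident record where possible."""
    issues = []
    if crime["victim_dob"] is None:
        if incident.get("crime_ref") == crime["crime_ref"] and incident.get("victim_dob"):
            # Missing information is available in another system: fill the gap.
            crime["victim_dob"] = incident["victim_dob"]
        else:
            issues.append("missing victim_dob: route for manual review")
    if crime.get("outcome_code") not in VALID_OUTCOME_CODES:
        issues.append("invalid outcome code: route for manual review")
    return issues

print(quality_check(crime_record, incident_record))  # [] (gap filled, outcome valid)
print(crime_record["victim_dob"])                    # 1985-03-02
```

In practice, records the tool cannot resolve automatically would be routed to a dedicated analyst, consistent with the monitoring arrangements forces described to us.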

Although this type of tool is automated, police forces told us that the outputs are regularly monitored and audited by a dedicated analyst and that there are manual contingency plans in place in case the tool produces unusual results. Some of the forces we spoke to told us they have plans to further develop their automation tools to detect other crime recording issues.

We welcome the use of these automated tools as they can enhance the level of quality assurance that is applied while reducing the burden on police staff. However, there is a risk that forces become too reliant on automated tools and fail to identify and address the root causes of the data quality issues. Where possible, forces should use these tools to identify systemic issues and improve the design of crime recording systems, or to tackle cultural issues linked to crime recording, to ensure that the data are as accurate as possible.


The Home Office

The Home Office’s quality assurance processes are well established, but it should strengthen its oversight of police force data quality

The Home Office collates the recorded crime data from the individual police forces in England and Wales, and ONS then publishes these data as the police recorded crime official statistics.

In the next two sections we examine Home Office and ONS oversight of data quality and their communication of the quality of the statistics. We found that both the Home Office and ONS have strengthened their quality assurance arrangements in recent years, and that data quality is explained more clearly to users than it used to be. We also found gaps in the Home Office’s and ONS’s understanding of police force data quality and in the published information about quality.


Home Office’s quality assurance processes have been strengthened with the creation of the Home Office Data Hub

The Home Office has two main quality assurance processes for the police recorded crime data: a monthly process, which involves a series of logic and consistency checks; and a more-detailed quarterly reconciliation process for the data sent to ONS for publication, which involves a series of further checks. If errors are identified during these processes, police forces are asked to resubmit data once the errors have been corrected. These quality assurance processes also check the quality of the crime outcomes data, a related dataset submitted by police forces on the actions they have taken in response to the crimes recorded.

Police forces submit data to the Home Office automatically (via the Home Office Data Hub, a case-level policing and crime database, which takes direct extracts from forces’ crime recording systems) or manually (via spreadsheets). Currently, 40 out of 43 territorial police forces in England and Wales are on the Data Hub. One of the main reasons a force is unable to submit data via the Data Hub is because it is currently changing or has recently changed its crime recording IT system and it cannot submit the data in the right format.

The creation of the Data Hub has improved police recorded crime data quality in two ways. The provision of record-level data allows the Home Office to carry out more-thorough quality assurance of the data. It also reduces the risk of human error and other errors in manual spreadsheet-based calculations. The Home Office is limited to carrying out trend and consistency checks on aggregate-level data if police forces submit data manually.

The police forces that we spoke to were positive about the quarterly data reconciliation process. They told us that it works well and that they have a good relationship with the Data Hub team. The process is regularly reviewed and updated to reflect changes to the Counting Rules. For example, the Home Office introduced additional checks for burglary offences, which were changed as part of the recent review of the Counting Rules, to ensure that forces are recording them correctly. This information can help forces identify and address poor recording practice.

In the last few years, the Data Hub team has automated the extraction of data by implementing Reproducible Analytical Pipeline (RAP) principles. This has freed up time for more quality assurance, further enhancing the quality of the data. We welcome the Home Office’s adoption of RAP principles in its work.


The Home Office has a limited understanding of how police forces quality assure their own data

The Home Office told us that it expects police forces to quality assure their recorded crime data, but it does not monitor the nature or extent of the quality assurance applied. The data reconciliation process only asks forces to confirm if the data they submit are accurate and suitable for publication. Because of this knowledge gap, we consider that the Home Office has insufficient oversight of police force data quality, which poses a significant risk to the quality of the statistics.

The Home Office, as the organisation that collates and supplies data to ONS, is responsible for understanding how police forces manage the quality of their recorded crime data. This includes forces’ quality assurance arrangements. We expect the Home Office analytical team to work with police forces to build its knowledge of police forces’ quality assurance arrangements.

Once the Home Office has gained a good understanding of police forces’ quality assurance arrangements, it should develop a plan for how it will support greater consistency of quality assurance across police forces. As explained earlier, we found that the standard of checking and validation of crime records varies across police forces. More-consistent quality assurance would enhance the quality of the police recorded crime statistics.

To support this work, the Home Office analytical team should consult the National Police Chiefs’ Council (NPCC), which is working to introduce more-standard approaches to data quality, including data validation, across police forces. Annex A gives an overview of the national policing coordination bodies and groups leading this work.

We recognise that this is a significant task. It will require resource and ongoing engagement with police forces. However, we see it as critical to strengthening oversight of police force data quality. By demonstrating that it understands, and has confidence in, the quality of police force data, the Home Office also promotes public trust in the data and crime recording processes. Restoring trustworthiness is particularly important given the concerns about under-recording of crime and manipulation of crime data by police forces in the early 2010s that led to the removal of the accreditation of the police recorded crime statistics.

The Home Office needs to engage with the Quality Assurance of Administrative Data (QAAD) framework, our regulatory standard for the quality assurance of administrative data. The framework provides a toolkit for statistics producers in making judgements about the continued suitability of administrative data for producing statistics. It was developed by the UK Statistics Authority following the Public Administration Select Committee's 2014 Inquiry into police recorded crime statistics, which found that there was a lack of regulatory oversight of the quality of the police recorded crime statistics and other official statistics based on administrative data. Therefore, we see applying the QAAD framework as essential to enhancing oversight of the quality of the police recorded crime data.

According to the QAAD toolkit, the police recorded crime statistics require a comprehensive level of assurance, due to the high risk of quality concerns and the high public interest in the statistics. At this level of assurance, the toolkit expects the statistics producer to investigate the administrative data quality assurance arrangements, identify the results of independent audit and publish detailed documentation about the assurance and audit. Recommended activities for the 'Quality assurance principles, standards and checks applied by data suppliers' practice area include describing data suppliers' principles, standards (quality indicators) and quality checks. For the police recorded crime statistics, that covers both the Home Office's own quality assurance arrangements and police forces' quality assurance arrangements.

Recommendation 4:

As a first step to greater assurance of the quality of police recorded crime data, the Home Office should gain a better understanding of police forces’ quality assurance arrangements.

Recommendation 5:

The Home Office should then develop a detailed plan on how it will support greater consistency of quality assurance across police forces. The Home Office should use our Quality Assurance of Administrative Data (QAAD) framework to guide this work and ensure that all the relevant quality areas are covered. Stakeholders, such as the National Police Chiefs’ Council, should be consulted as part of this work.


The use of information on crime data integrity should be maximised

The Home Office’s quality assurance processes check a range of data quality dimensions, including completeness, consistency and validity, but they do not check the accuracy of the recorded crime data. Using the data that are sent to it, the Home Office cannot check whether police forces have recorded the crimes reported to them when they should. This requires an audit of crime records, and only HMICFRS performs this function. Therefore, the Home Office and ONS are reliant on HMICFRS for monitoring and reporting crime recording accuracy.

As explained earlier, HMICFRS’s inspections of crime data integrity are based on a sample of crimes recorded in the most recent three months and carried out roughly every three to five years. For most police forces, the inspection reports are the only source of information about crime recording accuracy. Some police forces voluntarily share the findings of their internal crime recording audits with the Home Office. These findings give the Home Office a more-current picture of crime recording accuracy in those forces. However, in general, the Home Office and HMICFRS do not know how accurately police forces are recording crime in between inspections.

Recommendation 6:

To develop the most comprehensive and up-to-date picture of crime data integrity in police forces, the Home Office, HMICFRS and ONS should work together and use all available data, including HMICFRS inspection findings, HMICFRS management information and Home Office intelligence.


Governance groups support consistency of crime recording

The Home Office coordinates two governance groups that are responsible for ensuring consistency in crime recording in England and Wales.

The National Crime Recording Strategic Steering Group meets regularly to review the Home Office Counting Rules and make recommendations for changes. The group reports directly to ministers. The steering group is attended by the National Crime Registrar, the Home Office Crime Statistics Programme Director, the ONS Heads of Centre for Crime and Justice, representatives from police forces (force crime registrars) and the National Police Chiefs’ Council, and representatives from the Ministry of Justice and the Crown Prosecution Service.

The National Crime Recording Technical Working Group is a subgroup of the steering group. The working group considers tactical aspects of crime recording, such as how to implement changes to the Counting Rules. It is attended by Home Office statisticians, ONS statisticians, representatives from police forces (a representative from the regional force crime registrars network) and HMICFRS.

There are several cross-force steering groups for specific data quality initiatives and tools. For instance, there is an NDQIS steering group.

There are also a range of national policing coordination bodies and groups that aim to improve the quality of policing data to meet the operational needs of police forces. These are summarised in Annex A.


The National Data Quality Improvement Service is leading to quality improvements for certain crime types

Police forces had raised concerns with the Home Office about the variable data quality of the flagged data collections, due to the inconsistent application of flags by police officers and staff. Flags are added to crime records to provide context about the crimes. Flagged data collections include knife crime, domestic abuse-related crime, and online crime.

To improve the quality and comparability of knife crime data, in 2020 the Home Office established the National Data Quality Improvement Service (NDQIS). NDQIS uses a computer-assisted classification tool to review crime records held by the police. It scans data fields, including free text fields, and examines them using a simple ruleset and dictionary of key words (such as ‘knife’ and ‘stab’) to determine whether an offence involved a knife or sharp instrument. Each crime record processed by the tool is then allocated to a category:

  • High confidence – if the tool is certain that a knife or sharp instrument was involved, a knife crime flag is automatically added to the record.
  • Low confidence – if the tool is unsure if a knife or sharp instrument was involved, the record is marked for manual review.
  • Rejected – if the tool is certain that no knife or sharp instrument was involved, the record is rejected.
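The classification logic described above can be sketched as follows. The keyword sets, negation rule and thresholds here are hypothetical assumptions; the actual NDQIS ruleset and dictionary are not public, and the real tool scans structured fields as well as free text.

```python
import re

# Hypothetical keyword sets; the real NDQIS dictionary and ruleset are not public.
HIGH_CONFIDENCE_TERMS = {"stabbed", "knifepoint", "machete"}
AMBIGUOUS_TERMS = {"knife", "blade", "sharp"}
# Simple negation rule, e.g. "no knife", "without a knife".
NEGATION_PATTERN = re.compile(r"\b(no|not|without)\s+(\w+\s+){0,2}(knife|blade)\b")

def classify_record(free_text: str) -> str:
    """Allocate a crime record to a confidence category based on key words."""
    text = free_text.lower()
    words = set(re.findall(r"[a-z]+", text))
    if NEGATION_PATTERN.search(text):
        return "rejected"           # tool is certain no knife was involved
    if words & HIGH_CONFIDENCE_TERMS:
        return "high confidence"    # knife crime flag applied automatically
    if words & AMBIGUOUS_TERMS:
        return "low confidence"     # record marked for manual review
    return "rejected"

print(classify_record("Victim was stabbed during the robbery"))  # high confidence
print(classify_record("Suspect seen carrying a knife"))          # low confidence
print(classify_record("Search found no knife on the suspect"))   # rejected
```

Only the 'low confidence' category generates manual work, which is how a tool of this kind reduces the quality assurance burden on police staff.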

A similar tool has been rolled out for the domestic abuse and child sexual abuse collections and is in the process of being rolled out for the online crime collection. The Home Office told us that 41 out of 43 territorial police forces in England and Wales are currently using the knife crime tool, 30 forces are using the domestic abuse tool, and 29 forces are using the child sexual abuse tool. The Home Office is also developing a tool for hate crime-flagged offences and has plans for other flagged offences.

We reviewed the new knife crime tool and its impact on the statistics in 2022. We found that the tool is increasing the accuracy, consistency and comparability of flagged crime data between police forces and therefore has improved the quality of knife crime statistics published by ONS. The Home Office told us that it thinks the tool will be more impactful for some crime types than others. For example, it predicts that the quality of the data on online crimes will benefit most from the tool due to the poor application of this flag by police forces.

The stakeholders we spoke to were positive about the tool’s impact on data quality and the reduction in the burden of quality assurance. However, we also received feedback from some police forces and other policing stakeholders on the tool. One of the main limitations of this tool is that these flagged collections make up a relatively small proportion of all crimes, so the quality improvements are narrow in scope for the moment. Another limitation is that the tool is reliant on certain information being recorded by police forces; for example, if a crime record has not been generated when it should have been, it will not be checked by the tool. Developing the NDQIS tool continues to be time consuming and labour intensive, and it can take police forces some time to onboard the tool. Therefore, while the quality improvements can be significant, they are incremental.

Our 2022 review recommended that the Home Office publish an NDQIS development plan to alert users and other stakeholders about current and future developments. It is disappointing that this has not been published.

The future governance of the NDQIS programme is currently under consideration as part of a broader strategy that is looking at the governance of police IT systems. Given this uncertainty around governance, it is more appropriate for ONS to communicate developments. To inform users about the NDQIS programme and its impact on the quality of the police recorded crime statistics, ONS should publish regular updates about current and future developments.

In addition, ONS needs to better document the methods used and the limitations of the tools. ONS has published a one-off methods article about the knife crime tool, but this contains little information about the strengths and limitations of the tool, and it does not cover the tools developed for the other flagged data collections. Several police forces highlighted the lack of publicly available information about NDQIS.

Recommendation 7:

To inform users about the National Data Quality Improvement Service (NDQIS) programme and its impact on the quality of the statistics, ONS should publish and regularly update information about developments and methods, including the strengths and limitations of the tools.

NDQIS is not the only data quality improvement initiative for recorded crime data. A range of other work aimed at improving data quality is happening within police forces and nationally, focusing on the data about the nature and circumstances of the crime, such as person and location data. Annex B gives a few examples of this work, including the crime data quality assessments carried out by the Home Office’s Police National Database team.

The Office for National Statistics

ONS publishes clear information on quality, but it relies on the Home Office to quality assure data

ONS’s quality assurance process is limited to consistency checks

ONS took over responsibility for producing and publishing the police recorded crime statistics from the Home Office in 2012. Responsibility was transferred by the Home Secretary to promote greater public trust and demonstrate the independence of the statistics.

ONS is two steps removed from the underlying police force data. It relies on the Home Office to collate and quality assure the recorded crime data from police forces.

The Home Office sends ONS aggregate-level data. Because ONS does not have access to the record-level data, the quality assurance it applies to the data is limited to consistency checks. These involve comparisons of data between time periods to look for inconsistencies. Unusual changes, such as large increases or decreases in the volume of certain crimes, are raised with the Home Office, which may in turn query them with police forces. This allows ONS to gather information to contextualise and explain the changes to users of the statistics.
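A consistency check of this kind can be sketched with a minimal example. The counts and the 25% tolerance below are illustrative assumptions, not ONS's actual figures or thresholds.

```python
# Illustrative aggregate counts by offence group; thresholds are assumptions,
# not ONS's actual tolerances.
previous_quarter = {"violence": 51200, "robbery": 1900, "theft": 24800}
current_quarter = {"violence": 52100, "robbery": 3100, "theft": 24300}

def flag_unusual_changes(prev: dict, curr: dict, tolerance: float = 0.25) -> dict:
    """Return offence groups whose counts moved by more than the tolerance."""
    flagged = {}
    for offence, prev_count in prev.items():
        change = (curr[offence] - prev_count) / prev_count
        if abs(change) > tolerance:
            flagged[offence] = round(change, 2)  # raise with the Home Office
    return flagged

print(flag_unusual_changes(previous_quarter, current_quarter))
# → {'robbery': 0.63}, i.e. robbery up ~63%; this would be queried before publication
```

Flagged changes are not necessarily errors; the point of the check is to gather context so that genuine changes can be explained to users of the statistics.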

In addition to the data, the Home Office submits a quarterly quality report that sets out the checks and quality assurance that the Home Office has carried out. This report includes information about how many forces are live on the Data Hub, which police forces resubmitted data to correct errors discovered during the reconciliation process, and specific issues with forces’ data. Common data issues identified in this report include data provision, undercounts and missing crime outcomes data. ONS told us that the quality report is helpful for writing the quality narrative in the statistical bulletin; for instance, the bulletin highlights which forces were not able to provide data to the Home Office.

The Home Office quality assures the draft statistical bulletin and data tables that ONS produces. It carries out a series of sense checks, format checks and consistency checks to ensure that there are no errors in the data or interpretation of the data. ONS signs off the final statistics for publication.

ONS told us that it receives more information on quality from the Home Office than it used to, and that communications with the Home Office have improved substantially since the Data Hub was established. These are positive changes. But ONS’s oversight of the data quality remains limited. For instance, ONS told us that it would like to better understand the Home Office quarterly data reconciliation process. Furthermore, like the Home Office, ONS has a limited understanding of how police forces quality assure their recorded crime data. To strengthen its oversight of data quality, ONS should work together more closely with the Home Office and share more knowledge about data quality.


ONS publishes clear information on quality, but it should be expanded to cover all areas of quality

As the statistics producer, it is ONS’s responsibility to publish information on the quality of the police recorded crime data. ONS’s user guide to crime statistics contains clear and detailed information about many aspects of the quality of the statistics, including the accreditation status of the statistics; the roles and responsibilities of the different organisations involved in the collection of data and the compilation of the statistics; crime recording standards and practices; and the Home Office’s and ONS’s own quality assurance arrangements. ONS updates the user guide annually to ensure that the information remains relevant for users.

However, there are gaps in the quality information. In particular, ONS does not provide sufficient assurance for users about police forces’ quality assurance arrangements and the strengths and limitations of different crime recording IT systems used by police forces. As explained earlier, we want the Home Office to strengthen its oversight of these areas of police force data quality. As the Home Office builds its understanding of police force quality assurance arrangements and crime recording IT systems, ONS should communicate this understanding to users, to give them a full picture of police force data quality.

ONS should explain the main risks to police force data quality, how these are mitigated by police forces, and any impact on the statistics. ONS may want to consult the Scottish Government’s User Guide to Recorded Crime Statistics in Scotland as an example of proportionate information about police force quality assurance arrangements that is structured by the QAAD practice areas. The information about systems should cover the nature of any systems issues, such as those experienced by Greater Manchester Police and Devon and Cornwall Police, and the actions police forces are taking to resolve them.

In addition, ONS needs to explain the specific changes that police forces have made to improve their crime recording standards. While the crime statistics bulletins and user guide contain prominent caveats about crime recording improvements since 2015 and their impact on the number of crimes recorded by the police, these improvements should be contextualised. It is not good enough to say that ‘crime recording processes and practices have improved’.

Recommendation 8:

To communicate, and assure users about, all aspects of the quality of police recorded crime data, ONS should expand its published information on quality to cover:

  • police forces’ quality assurance arrangements.

  • the strengths and limitations of the different crime recording IT systems used by police forces.

  • the nature of crime recording improvements made by police forces since 2014.

To further support user understanding, it would be helpful if ONS included up-to-date and more accessible information about HMICFRS’s inspections of crime data integrity in its user guide. The user guide contains a table with the crime recording accuracy of all police forces inspected in the 2016 to 2020 programme of crime data integrity inspections, but if users are not explicitly directed towards it, they are likely to miss it. Also, the table does not contain the latest inspection findings, and therefore does not reflect the current state of crime recording accuracy across police forces.

In the last few years, ONS has developed a new data quality framework to inform users about the quality of the crime statistics for different offence types, and which of the two sources (police recorded crime statistics or the Crime Survey for England and Wales statistics) is thought to provide the most reliable measure. ONS publishes reliability ratings for 30 offence types in its user guide, with police recorded crime being the preferred source for 20 offence types (see Annex C for a full list). Of those 20, only three have a ‘good’ reliability rating – homicide, robbery (crimes against individuals and households) and robbery (crimes against businesses and organisations). The rest are rated ‘moderate’ (11 offence types) or ‘poor’ (six offence types).

We welcome this open assessment of the quality of the statistics, and it is good that ONS included the reliability ratings in its Crime trends in England and Wales article. However, ONS needs to better explain the criteria it uses to determine the reliability ratings and how often it reviews the ratings. In addition, we encourage ONS to make this information more prominent, for instance by including it in the Crime in England and Wales statistical bulletin, to help users interpret trends in certain crime types.

Recommendation 9:

To enhance the value of quality information, ONS should explain the data quality framework it uses to assess the reliability of police recorded crime statistics for different offence types.


ONS evaluates the consistency and comparability of police recorded crime statistics with other crime statistics

Comparisons with Crime Survey for England and Wales (CSEW) statistics

Comparisons between the police recorded crime statistics and CSEW statistics can reveal disparities in trends in both data sources and data quality issues. To compare trends, ONS creates a comparable subset of crimes recorded by the police and those measured by the CSEW and calculates the ratio between the volume of crimes.

These comparisons have been particularly helpful for interpreting trends following periods of change to police crime recording practices and processes. For instance, a 2013 analysis of trends in comparable crime identified a divergence between the police recorded crime statistics and CSEW statistics. ONS found that, between 2006 to 2007 and 2011 to 2012, the ratio of police recorded crimes to CSEW crimes decreased year on year, from 0.87 to 0.70. This suggested that police in England and Wales were recording only around 70% as many crimes as were captured in the CSEW. ONS hypothesised that the divergence may have been due to a decline in crime recording standards across police forces. ONS’s analysis was a key piece of evidence that supported our decision to remove the National Statistics accreditation for the police recorded crime statistics for England and Wales.

ONS repeated the comparative analysis in 2023, using data for the year ending March 2023. The analysis again identified a divergence between police recorded crime statistics and CSEW statistics, but in the opposite direction. ONS found that the ratio of police recorded crimes to CSEW crimes has increased over time – from 0.68 in the year ending March 2013, to 1.32 in the year ending March 2018, to 1.93 in the year ending March 2023. This suggests that police in England and Wales are now recording roughly twice as many crimes as those captured in the CSEW.
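The calculation behind these comparisons can be sketched as follows. This is a minimal illustration of the ratio described above, not ONS’s actual code; the crime volumes below are placeholder figures chosen only to reproduce the published ratios.

```python
# Minimal sketch of the comparable-crime ratio described above.
# Not ONS code; volumes are illustrative placeholders, only the
# ratio formula reflects the published method.

def recorded_to_csew_ratio(police_recorded: int, csew_estimate: int) -> float:
    """Ratio of police recorded crimes to CSEW-estimated crimes
    for a comparable subset of offences."""
    return police_recorded / csew_estimate

# A ratio below 1 (e.g. 0.70 in 2011 to 2012) means the police recorded
# fewer comparable crimes than the CSEW measured; above 1 (e.g. 1.93 in
# the year ending March 2023) means they recorded more.
print(round(recorded_to_csew_ratio(1_930, 1_000), 2))  # → 1.93
print(round(recorded_to_csew_ratio(700, 1_000), 2))    # → 0.7
```

A ratio close to 1 would indicate that the two sources are measuring a similar volume of comparable crime; sustained movement away from 1 in either direction is what prompts the kind of divergence analysis ONS carried out in 2013 and 2023.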

ONS has considered several possible reasons for the divergence, such as the impact of third-party reporting of crimes to the police, introduced in 2015, which may not be covered by the CSEW. It concluded that the increased focus on improving crime recording by police forces is likely to have had an effect, but that it is unlikely to fully explain the divergence. ONS is currently carrying out further work on the divergence, which is looking at other factors, such as the potential impact of lower response rates in the CSEW since the return to face-to-face interviewing after the Covid-19 pandemic. Another possible factor is over-recording of crime by police forces, which we discussed earlier in this report.

It is important that ONS determines the nature and drivers of the divergence, as it may provide insight on quality issues with both data sources. HMICFRS should support ONS in this work by carrying out and sharing analysis of the scale of crime over-recording by police forces, as a possible contributor to the divergence.

Recommendation 10:

ONS should work closely with HMICFRS, the Home Office and, where necessary, police forces, to establish the drivers of the divergence between the police recorded crime statistics and Crime Survey for England and Wales statistics.

Comparisons with other data sources

For certain crime types, ONS validates trends in the police recorded crime statistics against other data sources. For instance, ONS compares trends in the number of offences involving knives or sharp instruments recorded by police with trends in NHS provisional data on hospital admissions. For this comparison, ONS uses the Hospital Episode Statistics for England, published by NHS Digital, and the Patient Episode Database for Wales, published by Digital Health and Care Wales. Short-term trends in the two sets of statistics do not align. For instance, in the year ending December 2023, there was a 7% increase in knife-enabled crime recorded by the police compared with the year ending December 2022, whereas the number of admissions for assault by a sharp object decreased by 2% over the same period.

It would be helpful for users if ONS explained what these discrepancies say about the quality of the police recorded crime statistics. Also, we encourage ONS to be clear about the limitations of the hospital admissions data.

Academic researchers have also used hospital data to validate trends in the police recorded crime statistics and CSEW statistics. For example, Cardiff University’s Violence Research Group publishes a very helpful annual Serious violence in England and Wales report that compares emergency department data on violence-related attendances with ONS’s statistics on violent crime. The latest report, using data for 2023, found that short- and long-term trends in violence-related attendances are broadly similar to trends in the CSEW violence estimates. However, trends in violence-related attendances and police recorded violence statistics do not align. Such analyses can provide independent evidence on the quality and value of police recorded crime statistics.
