Methods changes and official statistics: what do producers need to demonstrate and document?

All forms of data collection and methods, such as editing and imputation, statistical disclosure control, and grossing and weighting, may at some point need to change: to make improvements, or because the existing methods are no longer fit for purpose or, in some cases, affordable. The Code of Practice for Statistics encourages innovation and continuous improvement, such as taking advantage of developments in technology.

The Code also recognises that sometimes producers may have to take decisions that might affect the quality of the data or the usefulness of the statistics. For example,
reducing a survey sample size due to cost constraints might result in less granular data than users would want.

It is essential to be open and transparent in your approach to changing methods. User engagement to inform decision making, and clear communication of what decisions have been taken and why, are the guiding principles that must apply in these situations. Publishing the outcomes of the change is critical to building confidence.

This note sets out examples of the principles in the Code of Practice that producers need to adhere to in order to remain compliant with the Code. It also includes examples of the kinds of materials that can be used to document this adherence. An important early decision concerns the scale and nature of the changes being considered: be proportionate in applying these principles by determining the materiality of the change. Identify whether there are fundamental changes to the methods that could affect the statistics – these will need to be understood and explained to users.


Code principles that apply here:

  • T1 Honesty and integrity
  • T2 Independent decision making and leadership
  • T4 Transparent processes and management
  • T6 Data governance

The change of methods needs to be managed in a trustworthy way. The risk to mitigate here is the perception that decisions have been unduly influenced by political or
commercial pressures. This applies to both the overarching decision to change methods, as well as any consequences that flow from it.

For example, moving a survey from face-to-face to online data collection may require questions to be cut – it is essential that the process to decide what to cut be conducted in a trustworthy way to maintain the confidence of people providing their data and the users of the information.

Ways to show your trustworthiness include:

  • Ensuring that statisticians are responsible for final decisions about planned changes and can voice any concerns openly
  • Being transparent about what changes are proposed, why they are deemed necessary and how decisions will be made
  • Being transparent about how you have come to your decisions, how you have involved users and, importantly, explaining what you decided not to do and why
  • Being transparent if there are limitations arising from the change in methods
  • Supporting analysts’ professional development around new methods


Code principles that apply here:

  • Q1 Suitable data sources
  • Q2 Sound methods
  • Q3 Assured quality

The proposed change of method will need to demonstrate that it still yields data of appropriate quality. The risks to mitigate here are that the new method cannot deliver data of a suitable quality, or that the quality of the data is unknown. In some instances, an improvement in one area might involve a trade-off, with some aspects measured less well than previously. Being open and transparent about such decisions, and about your rationale, is essential to helping users understand and accept the changes.

Some steps that producers can take here include:

  • Consulting with methodological experts (such as the GSS Quality Centre and Methodology Advisory Service in the Best Practice and Impact Division of ONS, research agencies, and academics) about the kinds of methods proposed and what impact they are likely to have on data quality
  • Drawing on published literature
  • Conducting pilots, parallel running, and other kinds of testing, ideally enabling the old and new method to be compared directly
  • Publishing results of testing and evaluation, and explaining how the results have informed decisions about methods changes
  • Collaborating with other producers or government departments that have undertaken a similar change, or sharing knowledge with others considering changes


Code principles that apply here:

  • V1 Relevance to users
  • V4 Innovation and improvement

The new methods will still need to yield statistics that meet users’ needs. The risks to mitigate here are that users lose statistics they need, feel they have been ignored, or only find out that changes have affected the statistics after they have happened.

Steps producers can take here include:

  • Understanding who your different users are and establishing ways to engage with them about the proposed change and any impact it might have on the statistics
  • Involving users in decisions about the methods change from an early point in the process so they can make a meaningful contribution
  • Publishing information about what decisions have been taken, and why, as a result of your consultations with users, and what the impact of the change has been on the statistics
  • Considering publishing statistics based on new collection methods as experimental while they are still under development