Demonstrating Accountability


We wanted a fuller understanding of how the Code is applied within producer bodies and across the statistics system, and how the NS designation fits within this.

We asked: how can statistics producers show the public that they are applying high professional standards and that the public’s trust is merited?

  1. Do you apply the Code of Practice standards similarly to both NS and OS?
  2. How does the designation make a difference in your statistical practice?
  3. How does the designation help or hinder?
  4. How would you like to see the designation of National Statistics be: a) presented, b) used, c) promoted?


Producers told us

The designation doesn’t make a difference in deciding how to apply the Code of Practice for Statistics. Producers said that they apply it equally to their official statistics and National Statistics. Some producers treat all of their published statistics as official statistics, while others, who produce a wide range of research, analysis and management information, have elected to apply the Code pillars voluntarily to these outputs.

Some producers have recently sought NS designation; it tends to be requested by new producers of official statistics, or by existing producers for significant new products. For long-standing producers of official statistics, the designation has become less relevant to their day-to-day work, and they saw it as a legacy of earlier years and outdated. The process of gaining the designation is seen as resource-intensive and not worth the effort required. That said, some producers found the designation useful in showing that they had been held to account and seen to work to the standards of the Code.

At times the designation was reported as being used to protect resources – the argument being that since the statistics were designated, they needed to continue. The designation was also seen to hinder innovation. Neither of these perspectives is consistent with the Code of Practice, which includes a principle on innovation and improvement. It calls on producers to evaluate regularly, in discussion with users and other stakeholders, whether some statistics should continue in their current form or even be stopped. The NS designation should signal that these principles and practices are being followed, rather than acting as a symbol that limits such practice and constrains development – yet it is not always understood in this way.

The name ‘National Statistics’ was described as unclear and unhelpful, not well understood by users and, in some parts of the UK, needlessly divisive. An alternative label would be preferred, with some suggesting ‘official statistics’. Producers said that it was important to them that their official statistics are valued. They were also keen for a clearer way of indicating when statistics are experimental, with some preferring the term ‘statistics in development’. A complication has emerged as some producers are describing some of their non-official statistics as ‘experimental’ – the experimental nature of statistics is not restricted to official statistics, since other outputs, such as research, can also go through development.

We queried whether the nature of statistical practice within a producer body can be discerned from the combined body of regulatory judgement through assessment and other reviews of compliance. Producers were positive about this approach, as better reflecting their practice across all their statistics and taking seriously their commitment to applying the Code’s standards.

The designation was also seen as having become irrelevant given the urgent response required by the coronavirus pandemic – the statistics being produced and used on a daily basis were not National Statistics. That response typified the nature and benefits of real-time data in supporting government action and public insight.


What next?

These reflections reveal starkly that the NS designation is not fully serving its purpose as a clear signal of the standards attained by a producer’s most important statistics. It is not inspiring producers to attain the highest standards of practice. It does serve a useful purpose in guarding against interference, but at the same time it is seen as a barrier to innovation and improvement.

We propose to investigate alternative modes of designation, including looking further at the issues around the designation applying to the producer, through an evaluation of their statistics and statistical practice. We will also consider the naming of the statistics being designated and the use of the experimental statistics label. Proposals will be developed and tested with stakeholders.



Our exploratory review on the NS designation showed that the public and users in our focus groups and roundtable discussions expect statistics producers to be held to account. They welcomed the role of the regulator and the scrutiny of National Statistics. There was a strong sense, however, that a public commitment or legal requirement to comply with the Code is not sufficient to reassure them that the standards are being applied.

We asked: How can statistics producers show they are held to account?

  1. How do you show you are accountable?
  2. How do you show that your statistics have been independently assessed and reviewed?
  3. Would you be content to add a statement such as ‘regulated by OSR’ to the top of all official statistics releases?
  4. What would you like to see OSR, as the statistics regulator, do more or do differently?
  5. How can official statistics producers demonstrate compliance with the Code, and how can their accountability be shown?


Producers told us

As noted in the section on Provenance, producers felt that it was important that they are held to account and that they are open about the reviews that have taken place or about their own breaches of the Code. They found real benefit in having external, independent review of their statistical practices. Improvements had been made to statistics that may not have been made otherwise. Producers said that it was helpful to highlight the role of OSR as their regulator. Producers see the Code as important for all analysts producing official statistics and said it is being applied across their statistical practice.

Some producers highlighted ways in which they reassure users about the governance around statistics production, including statements setting out their adherence to the Code of Practice, the visibility of the lead statistician, and engagement with users to hear their views. They also talked about making clear the type of release – for example, whether National Statistics, official statistics, management information or experimental statistics – as well as the importance of being clear about other types of data and analysis from departments, such as research, financial and operational information.

Producers saw value in reviews that concentrate on the main principles of the Code and its broad messages. They wanted clarity over how to ensure that the Code is being applied across all official statistics and how it applies, or can be applied, to other data they are handling such as management information. Voluntary application of the Code had been found by some producers to be a useful way to show how the Code pillars guide their statistical practice in these kinds of settings.

An assessment of compliance with all principles and practices of the Code was seen as helpful in providing a deep dive on focused issues, and useful when the learning is transferable to other statistics. There were concerns about the process: that judgements could be inconsistent, particularly in the first round of assessments by the UK Statistics Authority; that it can be burdensome for producers; and that judgements could be influenced by a few dominant user voices. More recent assessments were seen as more proportionate and constructive, particularly where invited by the producer.

De-designation – the removal of the NS designation, mainly for reasons associated with the quality of the statistics – was seen as disproportionate by some producers, as it could appear to be a broad indictment of a producer’s statistical practice rather than addressing specific issues with a particular National Statistic. However, it was also described as a motivator to secure investment and rectify issues, and can be a useful tool for communicating changes in quality.

Producers identified other ways that they are held to account, with some using peer review, other producers evaluating the effectiveness of the presentation of statistics, as well as reviews by internal audit and external experts. Self-assessment, with producers conducting their own reviews of their application of the Code of Practice, was suggested as a way for raising awareness and skill in meeting the standards of the Code.


What next?

Assessments (including re-assessments of existing NS) always identify areas for improvement, showing at one level that continued full compliance is not a realistic goal. Nevertheless, the comments received in this review suggest that producers are applying the Code fully across all official statistics. The producers in these discussions stated a commitment that applying the Code matters and that achieving high standards of statistical practice in demonstrating Trustworthiness, Quality and Value is important.

It is inevitable that statistical practice will vary as teams change, data systems evolve or age, policies develop and new methods are introduced. A focus on demonstrating the Code’s pillars – living up to its standards in all aspects of statistical practice – is a more positive approach to delivering statistics that serve the public good than a regulatory model that concentrates on failures to achieve full compliance.

We will develop a set of ways to demonstrate the accountability of official statistics producers and the role of OSR as the regulator. A number of suggestions made by producers during this review point the way forward, including: statements of compliance published by OS producers; prominent markings showing that a producer is regulated by OSR; and schemes to encourage or enable internal and/or external peer review.

