Findings

Developing and testing new designation messaging

Workstream 1 in our designation refresh project examined the communication of key concepts related to the designation of official statistics and OSR’s role. It involved testing alternative labels for ‘National Statistics’, the concepts and language of ‘independent review’, a compliance statement, an accountability statement, an improvement notice, and alternative terms for ‘experimental statistics’. This testing was carried out in partnership with ONS Digital Communications, who helped develop plain language and design badges for testing.

Testing of the designation messaging was conducted with four groups:

  • OSR – a workshop was held with members of OSR, from a range of roles, to review the test material and to help develop a range of alternative labels for testing with other groups
  • Users of statistics – six one-to-one interviews were conducted by the ONS user research team
  • External stakeholders – eight one-to-one interviews were conducted by the OSR Policy & Standards lead and OSR statistical regulators. These stakeholders are known to OSR from our work and have an active interest in official statistics
  • Analysts workshop – 21 analysts from a range of government departments participated in four discussion groups, running through the test material with OSR regulators

We subsequently translated the main phrases into Welsh and tested them with Welsh language speakers in ONS and Welsh Government (see Annex).

During each section of the testing, upon showing a phrase or statement, we asked participants the questions below, as well as inviting any general reflections:

a. What do you understand from this statement?

b. Are there any parts of the text you don’t understand or where you need more information?

c. How does this statement make you feel about government statistics?

d. How confident or comfortable would you be using statistics with this statement?

The rest of this report sets out the material that was tested, our results and our conclusions.

Clarifying the meaning of ‘National Statistics’

Our designation review found that the label ‘National Statistics’ is unclear and can be interpreted to mean different things by different users of official statistics. We tested a range of items to see what wording would more clearly convey to stakeholders that National Statistics status refers to a sub-set of official statistics that have been subject to further assurance:

  1. OSR accredited statistics
  2. Certified official statistics
  3. Code accredited statistics
  4. Accredited official statistics
  5. Accredited government statistics
  6. Accredited statistics
  7. National Statistics
  8. Other – please feel free to suggest alternative labels

“Official” and “accredited” were popular word choices; they were found to be clear and to build confidence. Analysts, users and stakeholders said they felt it was important to emphasise the connection with official statistics. They also said that it was good to show that a body is checking on producers.

Conclusion 1: We can clarify the meaning of the designation for users and other stakeholders by saying ‘National Statistics are accredited official statistics’ on the OSR and UKSA websites and in all designated statistics.


Designation language

In our ideas paper, ‘How to make the designation useful’, we introduced the concept of ‘independent review’ to help clarify the meaning of the National Statistics designation, and OSR’s role in the process:

‘Designated statistics have been independently reviewed as serving the public good’

This was in response to the findings in the designation review that it would be helpful to reiterate the meaning of designation and to emphasise its purpose: to signal compliance, to protect against interference, and to inspire producers to improve their practice and enhance the public value of their statistics.

To ensure that the language of ‘independent review’ is appropriately understood by stakeholders and effectively explains OSR’s regulatory role, we tested a short statement (a) and a longer version (b) below (the material in square brackets would be added in, tailored to reflect the specific situation):

a. “The Office for Statistics Regulation (OSR) independently reviews whether official statistics comply with the Code of Practice for Statistics and show trustworthiness, quality and value.”

b. “The Office for Statistics Regulation (OSR) independently reviews whether official statistics comply with the Code of Practice for Statistics and show trustworthiness, quality and value.

We do this through a tailored assessment process [link to OSR page explaining assessment process] where we evaluate the extent to which statistics meet the standards set out in the Code. If, following our independent review, we judge that official statistics comply with the Code, they meet the legal requirement to be classified as National Statistics and we give them the following text/badge. [to be inserted]

The Code sets the standards for producing all official statistics at every stage of development. Chief Statisticians and Heads of Profession for Statistics ensure that their analysts apply the Code to all official statistics and promote the sharing of good practice.”

Participants found both statements to be helpful in clarifying the role of OSR and were reassured to know that there is a statistics regulator. Participants strongly endorsed emphasising the independent nature of our reviews.

Conclusion 2: We can simplify our explanation of OSR’s regulatory role in confirming Code compliance by referring to ‘independent review’.

Compliance statements

Subject to any feedback we receive on this findings report, we anticipate adopting the language of ‘independent review’ when explaining our regulatory decision to confirm the National Statistics designation/accreditation for official statistics. Accordingly, we tested the following statement with participants:

“These official statistics were independently reviewed by the Office for Statistics Regulation in [month/year]. They comply with the Code of Practice for Statistics, showing trustworthiness, quality and value.”

Participants were positive about this compliance statement. They emphasised that it would be important to have the link to the Code of Practice for Statistics website so people could see for themselves what the standards are. Some also emphasised the importance of further re-testing of compliance, as this is a decision at one point in time. We will ensure our assessment process is strengthened to monitor continuing compliance. We are also reviewing our National Statistics list so that it more clearly communicates the range of reviews that OSR has conducted on official statistics.

Conclusion 3a: Using the language of independent review in the compliance statement reassures users about Code compliance by producers.

We were also keen to see whether users would be reassured if producers provided their own statement confirming that they have applied the Code of Practice for Statistics in the production of their official statistics. We tested this statement, which would be included by producers in their non-accredited official statistics:

“We work to the standards of the Code of Practice for Statistics. We have reviewed these official statistics through self-evaluation/peer review [delete as appropriate]. We are awaiting independent review by the Office for Statistics Regulation [include only if a review is agreed and planned with OSR].”

This statement received some negative feedback, with participants somewhat sceptical and wanting more information about the nature of the self-assessment and peer reviews. They asked for producers to provide links to information setting out how they had reviewed the application of the Code.

Conclusion 3b: Users were not reassured by the proposed statement for commitment to compliance for non-accredited official statistics – they require evidence of how producers apply the Code.

We propose that OSR provide guidance to producers on publishing their own statements of compliance, including describing their approach to applying the Code pillars, trustworthiness, quality and value, and evidence of the steps they are taking to improve their statistics.

Improvement notice

Our ideas paper in October 2021 emphasised the importance to users of being clear about when and why the designation has been removed from a set of official statistics. To meet this need we tested this proposed statement for flagging where OSR had concluded that the Code is no longer being met in some specific area (equivalent to de-designation):

“These statistics were independently reviewed by the Office for Statistics Regulation (OSR) in xx/202x and do not currently comply with the Code of Practice for Statistics. The main areas that need to be addressed are…. [to be described by the producer as relevant]”

The statement received a mixed reaction. Analysts described it as somewhat passive and negative. The users interviewed were more in favour of the statement but felt it needs to be clear that the statistics are good enough to use while needing improvement. It could be a more active statement, with producers setting out what they are doing to improve and being clear about limitations. It would also help if the statement were tailored to individual situations.

While we recognise the negative tone of the statement, the improvement notice is expected to be used rarely, and only in situations in which users should be alerted to significant areas of non-compliance.

Conclusion 4: We propose that an adapted version of the improvement notice, setting out the issues and how they will be addressed, will be helpful to users.

Reflecting the feedback from testing, we propose that the following statement be included on statistics when OSR has concluded that the Code is no longer being met:

“The Office for Statistics Regulation (OSR) suspended accreditation of the [name of statistics] on [month/year], having concluded that they no longer comply with the Code of Practice for Statistics. The main areas to be addressed are…. OSR will review the statistics once these areas have been addressed to reassess compliance with the Code.”

[OSR will highlight the areas to be addressed in its letter to the head of profession for statistics. The producer should describe its activities to address the problems, with an action plan, in the statement alongside the statistics]

We will consider in our assessment refresh project how we review and report on the progress made by the producer.

Accountability statement

Our early exploratory review, published in March 2020, proposed that it would be beneficial for producers to display a statement making clear that they are accountable to a regulator. The importance of demonstrating accountability was reinforced in our subsequent designation review consultation findings. We tested this proposed accountability statement:

Short version:

“We are regulated by the Office for Statistics Regulation”

Fuller version:

“We are regulated by the Office for Statistics Regulation (OSR). OSR sets the standards of trustworthiness, quality and value in the Code of Practice for Statistics that all producers of official statistics should adhere to.”

Participants were positive about the proposed statement, preferring the fuller version. They highlighted that the statement should be specific about accountability for the production of statistics, rather than for all areas of the organisation’s activities. An explanation of what it means to be regulated by OSR, such as how often we verify that a producer complies with the Code, should be easy for people to find on our website.

Conclusion 5: The accountability statement provides reassurance to users that producers are held to account.

We will encourage producers to provide the accountability statement with a link to the OSR website. OSR will provide users with a straightforward explanation of our independent review approach. We have amended the statement to reflect that the remit is statistical practice. We are also proposing that producers provide additional guidance on how to raise issues about standards of practice:

“Our statistical practice is regulated by the Office for Statistics Regulation (OSR). OSR sets the standards of trustworthiness, quality and value in the Code of Practice for Statistics that all producers of official statistics should adhere to.

We will do our best to respond to any comments about how we meet these standards. If you have any remaining concerns, you are welcome to contact OSR by emailing regulation@statistics.gov.uk or via the OSR website.”

Experimental statistics

Our exploratory review report recommended that the term ‘experimental statistics’ should be changed to a clearer label. This view was supported in the designation review consultation findings. In this test we asked participants which of a range of terms they found clearest:

a. Proposed definition:

“These official statistics are newly developed or going through development and testing with users. This is to ensure they meet user needs and quality standards in line with the Code of Practice for Statistics.”

b. List of possible labels:

  1. “Official statistics in development”
  2. “Official statistics under development”
  3. “Official statistics in testing”
  4. “Statistics in development”
  5. “Statistics under development”
  6. “Statistics in testing”
  7. “Experimental statistics”
  8. Other – please feel free to suggest an alternative

Participants were supportive of changing the experimental statistics label and found the definition clear, though they suggested some minor grammatical improvements to the first sentence. They tended to prefer including ‘official’ in the label, and ‘development’ over ‘testing’. It was suggested that producers give more information on what is being done to develop the statistics. Analysts asked for more guidance on how and when to use the label. There was a strong view that the label should not be used to flag poor quality.

Conclusion 6: Users found the label ‘official statistics in development’ clear in signalling this key aspect of the statistics. 

Reflecting the feedback from testing, we propose this revised statement:

“These official statistics are in development and will be tested with users, in line with the standards of trustworthiness, quality and value in the Code of Practice for Statistics.”

We are working jointly with the GSS to review the guidance on official statistics in development. The guidance sets out when and how producers should use the label, and emphasises explaining the development plan to users and showing how they can be involved in testing.

Summary of conclusions

1. Clarifying the meaning of ‘National Statistics’:

We can clarify the meaning of the designation for the public by saying ‘National Statistics are accredited official statistics’ on the OSR and UKSA websites and in all designated statistics.

2. Independent review:

We can simplify our explanation of our regulatory role in confirming Code compliance by referring to ‘independent review’.

3. Compliance statements:

Using the language of independent review in the compliance statement reassures users about Code compliance by producers. Users were not reassured by the proposed statement for commitment to compliance for non-accredited official statistics – they require evidence of how producers apply the Code.

4. Improvement notice:

An adapted version of the improvement notice that sets out the issues and how they will be addressed will be helpful to users.

5. Accountability statement:

The accountability statement provides reassurance to users that producers are held to account.

6. Experimental statistics:

Users found the label ‘official statistics in development’ clear in signalling this key aspect of the statistics.

