So, in shaping designation to meet the challenges of a data-abundant world, we are proposing to demystify the difference between designated statistics and other official statistics.
We are looking for a regulatory approach that is in line with the requirements of the Statistics and Registration Service Act, focusing on the compliance of official statistics with the Code. And, given the massive breadth of available data, we want to make it unmistakeable to users which statistics are official statistics, and which of these have been independently reviewed.
We have some specific proposals to simplify the communication around official statistics, to:
- modify the badge for designated statistics to focus on the tick, together with a note to flag the date of the designation and its meaning
- ask producers to include a statement on all official statistics to make clear that they are regulated by the Office for Statistics Regulation
- rename ‘Experimental Statistics’ as ‘Official Statistics in development’
- when removing the designation, both remove the tick and explain in what way the official statistics no longer meet the specific Code pillar
We also see opportunities in five areas:
- Producer process and culture
- Producer quality assessment
- Assessment process
- Designation communication
- Designation inventory
While the designation is the outward sign that the statistics have been verified as complying with the Code, it is not the starting point. When focusing on applying the Code in official statistics production, we suggest an ABC of designation as a simple reminder:
- apply the Code
- be sure by checking through self-evaluation and peer review
- then confirm through designation – the outward sign for users that the statistics have been independently reviewed as complying with the Code
Without all three stages, producers may miss out on both enhancing their practice and building the confidence of their users (see annex 1 for more information).
Organisations can also gain by embedding a culture of Trustworthiness, Quality and Value. That means not just seeing the Code as applying to the organisation’s official statistics, but building a shared understanding and set of core values that are in line with the Code. We suggest there are organisational benefits in applying the Code pillars in an integrated and coherent way, and in sharing the learning and expertise among analysts, teams and across the organisation, such as with policy, communications and operational colleagues, and with other partner organisations.
We offer the Code Maturity Model as an optional organisational approach to applying the Code (see annex 2 for more information). Its strength is in enabling a strategic way of integrating activities across the organisation under the Code framework. This means recognising how each area contributes to meeting the Code pillars, to gain more in combination. The central message is the power of self-evaluation to inform practice both individually and organisationally.
Users told us that the information of particular interest to them was about the quality of the statistics. Despite the often large amounts of information provided by producers, users said that it is hard to find the information they want, and it can be harder still to get a sense of the level of quality. To help address this challenge, we have devised a tool (see annex 3) to assist producers in giving a summary of their view. It is just a prototype at this stage and is going through further development and testing, but we think it has promise. It uses a three-star rating scheme and focuses on some key indicators of quality. If you are interested in being involved in the development, please do let us know (email firstname.lastname@example.org).
Our ambition for more comprehensively reassuring users about the compliance of official statistics with the Code of Practice raises the potential for larger numbers of assessments to be conducted to designate statistics. It also emphasises the need for us to establish an effective and comprehensive means for public engagement to inform our thinking, as well as our regulatory judgements.
We are looking to ensure our regulatory approach is both proportionate and consistent with the Statistics and Registration Service Act, to give reassurance of the standards being achieved and kept. We have adapted our assessment process over the years, partly responding to feedback but also embedding our philosophy of trustworthiness, quality and value. Most recently we have seen through the pandemic that we can work in a flexible and pragmatic way and keep true to our regulatory standards. One development we introduced to respond to the new statistics published by producers was our rapid reviews. This is a useful model for us to consider as we adapt our assessment process further.
We also believe that there is an opportunity for us to undertake more deep-dive reviews, such as our quality review in HMRC, with an organisational focus, as well as our programme of systemic reviews which look across producers on particular topics. Other areas that are ripe for this kind of approach include public engagement, and statistical methods where we can partner with external and GSS experts.
So, a significant area of development for OSR will be in how to implement a new model of assessment and designation once approved by the Authority Board. We would welcome hearing from users and producers on their views about a regulatory approach that reflects the requirements of the Act to ensure official statistics comply with the Code.
This area covers how to clearly present the status of designated statistics following confirmation of compliance. It involves not only the labelling and any related symbol, but also how OSR summarises its findings, such as any improvement areas. It also covers how designated statistics can be easily recognised by users, including the meaning of the designation and its date. This information needs to be understandable in a variety of settings and outputs, such as dashboards, tabular outputs, online narratives and detailed reports. Solutions that are flexible and adaptable are needed.
A useful but perhaps underrated tool published by OSR is the National Statistics List. We see great potential for further developing this inventory to provide a comprehensive collation of the statistics reviewed by OSR. We will continue to maintain the designated statistics list, updating it with details about the confirmation of continued compliance. We also propose listing other statistics that we have reviewed during the full range of our regulatory activities.