Making Sense of the Code

We at OSR are often focused on all things government – how and why departments produce and publish official statistics and then use them in the public domain. It’s why OSR exists and why there is that one lovely book to rule us all – the Code of Practice for Statistics.

But what does the public think about official statistics? And the Code? It matters because the value of official statistics is that they provide everyone with the information needed to understand and improve our lives and society. So we partnered with Sense About Science, an independent campaigning charity that challenges the misrepresentation of science and evidence in public life, to meet with nine members of the public and hear their thoughts.

The focus group participants ranged from a PhD student and a food blogger to a retired GP and a Comms officer at a charity. They were chatty, engaged, curious and challenging – in other words, fantastic company with which to discuss statistics.

We asked them if they trusted statistics. ‘It depends’ was the common response – it depends on where they read the statistical claim, what the agenda is, what’s at stake, whether it has been peer reviewed, and what the source is.

We asked if they had seen bad examples of statistics and all nodded vigorously.

We moved on to the topic of the Code, asking what it is and why they think it exists. Participants considered it a mechanism for quality control and best practice. They think it exists ‘for public confidence and trust’ and ‘to hold people to account,’ and that ‘an organisation who signs up is effectively saying they’ve nothing to hide.’

Interestingly, one participant did not see the value of the National Statistics ‘kite mark’, stating that the statistics themselves might be perfect, but the interpretation and spin put on them is the problem.

The focus group participants reviewed the pillars (Trustworthiness, Quality and Value) and the principles under each within the refreshed Code. Overall they found the pillars a bit confusing and felt that Value, in particular, required further explanation.

Some words and concepts throughout needed unpacking, like ‘robust.’ Fair enough. Robust is a word woefully overused in the civil service, and what exactly does it mean: solid, good, meaningful?

They also felt the idea of ‘public interest’ required clarification. For example, sometimes ‘the public is interested’ is conflated with ‘the public interest’. One participant suggested ‘national interest’ or ‘public good’ would be more meaningful terms.

All in all, a great experience for us.

It was insightful to hear that, in general, the nine panellists do trust statistics, question what they read, and expect government to follow the Code. Their comments were invaluable and will result in a clearer document.

Thank you to Sense About Science for facilitating and to those in the focus group.