What do you think of when you come across some new information on social media? If you are like me and a bit of a sceptic (or should that read ‘sensibly cautious’?), your first thought is probably ‘What’s the source? Is it someone I trust?’

It can be the same with buying food in the supermarket. I’ll probably look at the packaging first and decide if it is a company I like. Then I might look at the ingredients and decide if I am OK with the amount of fat, sugar and additives, and see if it gives me the nutrition I need.

So, to start with, I am thinking about the provenance of the product, and my second step is to consider the fitness of the product to meet my needs.

And we need to take the same approach when we think about whether to use some data – find out about provenance and fitness for purpose.

In deciding the suitability of the source, you will want to think about:

  • Who collected the data?
  • Is it a trustworthy organisation?
  • Is it authoritative and independent?
  • Is it open and transparent?

We often take short cuts when we are making decisions – we look for clues to help us make quick choices. One of them can be recognising a familiar brand. In our recent exploratory review looking into the designation of National Statistics, we asked two focus groups how they decide whether to use some statistics – what influences their choice? It was interesting to see how they responded to statistics published by the Office for National Statistics or by other Crown bodies such as government departments. Their immediate reaction was to trust the data. In fact, they said the Crown symbol itself would make a good logo for official statistics. It was something they recognised and it gave them confidence.

The brand of an organisation publishing statistics can have a powerful impact on someone deciding to use the data. But it is important not to rely on brand alone – discerning users need easy access to information that helps show whether the organisation is trustworthy.

Users may quickly decide that the source is one they have confidence in – the critical information they want to know is about the quality of the data. Some questions they should ask to decide if the data are fit for a particular purpose are:

  • What are the data characteristics?
  • Are the data in their raw form or adjusted?
  • How representative and reliable are the data?
  • How well do the data match the concept I want to measure?
  • How relevant are the data to my purpose?

Again, prominent information that helps users understand quality is central to making the choice to use the data. The publisher of the statistics needs to explain the strengths and limitations simply, to help users decide whether the statistics are the right ones for them. And if the data turn out to be the only ones anywhere close to what they want to measure, this helps users form a good appreciation of the limits of what the data can tell them. Often, not just anything will do. Never forget – garbage in, garbage out!

Provenance and fitness for purpose rely on the trustworthiness of the producer and the ways that they work, the quality of the data and soundness of methods, and on the efforts taken to ensure the relevance of the statistics. In OSR we summarise these as Trustworthiness, Quality and Value – the three pillars of the Code of Practice for Statistics.

The Code sets the professional standards for all official statisticians – it is why users can have confidence that the statistics produced in government departments are independent of political interference. It is our job as the regulator to hold the producers of official statistics to account.

But what about non-official statistics? Do the same standards apply? In short – they can. We introduced a scheme in 2018 alongside the publication of the Code of Practice that we call ‘voluntary application’. It is where an organisation releasing non-official data elects to hold themselves to the three pillars of the Code of Practice.

We ask them to make a public commitment. They do this by setting out how they show their adherence to Trustworthiness, Quality and Value – what it looks like in practice – and we recommend they publish a statement of compliance. We maintain a list of adopters of the Code pillars on our Code of Practice website, with a link to the published statements of compliance that we have reviewed.

So far there are 24 sets of statistics listed on our website. And it is growing. We have an active community of practice in which organisations and analysts can share their experiences and learn from each other.

Some of the organisations are Crown bodies and release official statistics – these organisations produce a wide range of data and analysis, much of which is not released as official statistics. Voluntarily applying the Code is a powerful way of bringing the same, common-sense standards to all areas of these organisations.

Many organisations in the list are not government departments and do not publish official statistics. The list includes the Financial Conduct Authority, the Universities and Colleges Admissions Service (UCAS), the Scottish Fiscal Commission, the Greater London Authority, Ipsos MORI and the Social Metrics Commission.

Applying the three pillars of Trustworthiness, Quality and Value provides a powerful thinking tool for organisations. It helps them learn together about the pillars, reflect on the nature of their statistical or analytical practices and how they demonstrate the pillars, consider ways of improving or extending their practices, and then make the public commitment to continue working to these standards. It is a powerful statement – not just of intent, but of a determined desire to demonstrate adherence to the standards. In turn, it is a signpost to help users make good choices on whether to use the data.