A lot of business planning and strategic communication focuses on what’s in a strategy or plan. But it can be just as informative to describe what’s left out.
The background is that yesterday we launched our five-year business plan, which describes how we will deliver our part of the UK Statistics Authority’s overall strategy, “Statistics for the Public Good”. Over 180 people attended the launch event. The speakers were Sir David Norgrove (UK Statistics Authority Chair); National Statistician Sir Ian Diamond, who introduced the ONS’s own five-year business plan; the new head of UKRI, Dame Ottoline Leyser; and me.
The great thing about the tools that support virtual events – Zoom, Google Hangouts – is that you can gather a wider range of questions than is possible at a traditional in-person event. So I want to reply to an issue that came up in the chat function at yesterday’s launch, which I didn’t get to answer during the event itself: the issue of what’s left out. A couple of people asked it in slightly different ways. It’s a great question, because it shines a light on where really important choices have been made.
I’d highlight three choices within OSR’s plan. The first may be rather obvious. OSR is small. At present we number only 40 people, although the business plan envisages some growth in staff numbers. But we’re dwarfed by ONS and the Government analytical professions – several thousand people in total. So one very obvious choice is that we can’t do everything. We can’t review every statistic and dataset published by Government every year. Nor should we – it would be disproportionate and have diminishing marginal value. Instead, we drive improvement by advocating the Code of Practice for Statistics, including to people who do not produce official statistics; by encouraging producers to take responsibility for the integrity of their own National Statistics; and through our systemic reviews, which look across whole statistical areas. These approaches act as force multipliers, and they allow us to target our efforts at the really significant issues.
The second area of choice is that, while we will stand up for the appropriate use of statistics, we have decided that we should not be an arbiter of public debate. It is not our job to do what the media do and what fact checkers do so effectively – to review the quality of evidence and arguments put forward by politicians and others. We are not a commentator or referee. Instead, it is our job to ensure statistics and data published by Government serve the public good, and if not, to drive improvement. We can and do step in where we see a significant risk. But we’ve chosen not to expand our remit into doing constant review of what facts are used by whom. As I say, we have fact checkers in the UK already, and they do a brilliant job.
A further choice is that we do not want to take on a role of reviewing the quality of data and advice provided to decision-makers in Government. Our role is the public life of statistics and data – when they are published and used to support public understanding. Often, the same data are used to guide decision-makers and inform the public, especially in the space of economic statistics. And where data are published, we will consider them within our remit. But we come at this through a public lens. We are not proposing to be an all-purpose regulator of any analysis conducted internally within and for Government.
And there is a further consequence of this public lens. If management information held by Government is used in public debate, we expect it to comply with the same principles that apply to official statistics: equality of access; accessibility; clear explanation of sources; and so on. It’s all data presented by Government; the finer distinctions between categories (transparency reports or management information or social research) may not seem relevant to a member of the public. So if the data are published or used publicly, we expect basic standards of trustworthiness, quality and value to apply.
One final aspect: we haven’t been very specific about what statistics we will look at over the next five years. We have an annual planning process and an annual work programme that we publish. This sets out what we will focus on in the coming year. We chose not to do a detailed map of all our proposed outputs. The pandemic has taught us that we need to be agile and have the capacity to respond to emerging, unforeseen issues.
I’ve focused here on some of the key judgements we made. The business plan we published yesterday sets out what we do propose to do; rather than recounting everything that is in it, this blog highlights some of the choices behind it. I would welcome comments at email@example.com.
By making these and similar choices over the next five years, OSR will help to ensure that the UK has statistics that serve the public good.