MR Outline 01/14

Targets, or closely related performance measures, feature in many aspects of formulating and managing public policy. Official statistics, produced using data from administrative systems and other sources, are frequently used to monitor performance against such targets. Whilst targets are intended to stimulate changes in behaviours or activities, their existence can also create perverse incentives which may influence how data are recorded or reported. Such impacts may not be fully taken into account by the producers of official statistics, and the data used, for instance in measuring Payments by Results programmes, may not necessarily provide good approximations for statistical purposes.

The Code of Practice for Official Statistics requires that statistical producer bodies “only base statistics on administrative data where the definitions and the concepts are good approximations to those appropriate for statistical purposes”. This review will investigate the use of data and official statistics in monitoring a selection of targets and performance measures in the areas of health, education, welfare and criminal justice, and how targets, and progress towards them, are measured and reported. It will complement the related review on the audit of administrative data used in producing official statistics.

This review will serve two main purposes: to raise awareness of the issues and risks arising where statistics and targets come together; and to suggest ways in which both producers and users of these statistics can mitigate the main risks.

Background

Administrative sources are used in the production of statistics in almost every aspect of public policy for performance monitoring and to support public accountability. A well-recognised risk in setting targets is that they can generate perverse incentives and disincentives which distort performance.1 In such instances it is important that users of official statistics (and the underpinning data) be given advice about the context in which the data have been collected, to help them make informed use of the material.

Relevant aspects of the Code (2)

  • Principle 4: Statistical methods should be consistent with scientific principles and internationally recognised best practices, and be fully documented. Quality should be monitored and assured taking account of internationally agreed practices.
  • Principle 8: Official statistics, accompanied by full and frank commentary, should be readily accessible to all users.
  • Protocol 1: Effective user engagement is fundamental both to trust in statistics and securing maximum public value.
  • Protocol 3: Administrative sources should be fully exploited for statistical purposes, subject to adherence to appropriate safeguards.

Process and Methods

The team’s work on this review will involve selecting a range of different sets of statistics, presented as case studies, which primarily, but not exclusively, use administrative source data to monitor targets and / or in payments by results regimes. Using these case studies, the team will:

a) Use desk research to review and assess how statistical releases present information about the use of the statistics to monitor and report progress towards targets. We shall identify examples of good reporting on performance monitoring, as well as examples where statistical reports, or accompanying information, lack appropriate explanations of the safeguards used by statistical producer bodies to avoid distorted reporting or under-reporting.

b) Seek the views of experts and interested parties (including the National Audit Office and the relevant audit bodies of the devolved administrations, the RSS, academics and researchers, and lobby groups) on the extent to which they believe the statistics used to monitor targets are presented appropriately for all audiences, are of high quality and reliable, and give an accurate picture of performance.

c) Collate documentation from producers (such as internal and external quality reviews, user feedback from engagement activities, and development plans for statistics) to establish the producers’ views about the extent to which they have balanced different users’ needs and how they have satisfied themselves that the administrative data they use provide good approximations for statistical purposes.

Timetable

We are currently gathering and reviewing evidence, and plan to report to the Authority Board in summer 2014.

Your views

We would welcome your views on any of the issues to be covered in this review. Please send any comments to assessment@statistics.gsi.gov.uk by 14 March 2014 if possible, although late responses will still be welcome.


  1. See for instance Goodhart’s Law: “any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes”.
  2. https://uksa.statisticsauthority.gov.uk/assessment/code-of-practice/index.html