Using the Quality Assurance Matrix

Practice areas

In an Assessment of official statistics based on administrative data, the Authority will consider the quality assurance of the data across the four practice areas outlined in the QA Toolkit. These practice areas demonstrate the need for the quality assurance of statistics obtained from administrative sources to extend beyond the checks that statistical producers make on the data they receive. Producers should demonstrate knowledge of the operational context in which the data are recorded, and an understanding of the impact that the motivations of data inputters can have on the data.

They should also have good communication links with data supply partners and understand their partners’ data quality processes and standards.

Four practice areas associated with data quality

Operational context and admin data collection

  • environment and processes for compiling the administrative data
  • factors which affect data quality and cause bias
  • safeguards which minimise the risks
  • role of performance measurements and targets; potential for distortive effects

Communication with data supply partners

  • collaborative relationships with data collectors, suppliers, IT specialists, policy and operational officials
  • formal agreements detailing arrangements
  • regular engagement with collectors, suppliers and users

QA principles, standards and checks by data suppliers

  • data assurance arrangements in data collection and supply
  • quality information about the data from suppliers
  • role of operational inspection and internal/external audit in data assurance process

Producers' QA investigations and documentation

  • QA checks carried out by statistics producer
  • quality indicators for input data and output statistics
  • strengths and limitations of the data in relation to use
  • explanation for users about the data quality and impact on the statistics

The Quality Assurance Matrix

The Authority recognises the resource challenges faced by statistical producers and supports a proportionate and pragmatic approach to the assurance of administrative data in each area of practice. The QA Matrix encourages producers to consider the level of data quality issues that could affect their data, together with the level of public interest in the statistics.

The QA Matrix is not intended to be used as a checklist or a tick-box exercise. Work has been done to develop such lists, and they can be useful in developing quality assurance processes. However, over-reliance on a checklist may over-simplify issues and limit the ongoing development of quality assurance processes. The QA Matrix presents examples of the types of evidence that assessors would expect to see as part of an Assessment, to give statistical producers an understanding of the standard required at each level of assurance.

The assurance levels set out in the QA Matrix are not intended to be interpreted as a Red, Amber, Green (RAG) status as commonly used in project management. For example, level A3 does not necessarily mean that remedial action is needed; it may instead mean that the statistics attract ongoing higher public interest and higher quality concerns, so that a higher level of assurance will be needed on a continuing basis. Remaining at the A3 level of assurance over time does not represent a failing by the statisticians; it simply demonstrates that a higher level of assurance is necessary for a certain set of statistics.
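The matrix logic described above can be illustrated as a simple lookup from the two risk factors to an assurance level. The mapping below is a hypothetical sketch for illustration only: the specific rule (take the higher of the two factors) and the level labels are assumptions, and producers should consult the published QA Matrix in the QA Toolkit for the actual levels.

```python
# Illustrative sketch only: this (quality concern, public interest) ->
# assurance level rule is a hypothetical example, not the Authority's
# published QA Matrix.

LEVELS = {"low": 0, "medium": 1, "high": 2}

def assurance_level(quality_concern: str, public_interest: str) -> str:
    """Return an indicative assurance level (A1-A3) from the two factors.

    Assumed labels: A1 = basic, A2 = enhanced, A3 = comprehensive assurance.
    The rule used here (take the higher of the two factors) is an assumption
    for illustration; the published QA Matrix should be used in practice.
    """
    score = max(LEVELS[quality_concern], LEVELS[public_interest])
    return f"A{score + 1}"

# Raising either factor raises the level of assurance expected:
print(assurance_level("low", "low"))      # A1
print(assurance_level("medium", "low"))   # A2
print(assurance_level("high", "medium"))  # A3
```

Note that, as the text above stresses, a high level such as A3 is not a "red flag" to be worked down from; it is simply the standard of assurance appropriate to those statistics on a continuing basis.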

The QA Matrix is not exhaustive. It does not present all the possible types of work that statistical producers could carry out to assure the quality of their data. In many cases they will be undertaking other types of quality assurance work which they should share with assessors. Similarly, statistical producers might recognise that not all elements suggested at a certain level of assurance will be appropriate for their statistics, and have appropriate reasons for making such a judgment. Ultimately it is for each statistical producer to decide how it meets the standard and present this evidence to the Authority.

A key element of practice emphasised in the QA Toolkit is that if flaws are found in administrative data, statistical producers should: evaluate the likely impact on the statistics; establish whether the issue can be resolved, or whether there is any other action they can take to mitigate the risks; and determine whether the level of impact is such that users should be notified. It is recognised that issues discovered through quality assurance are often complex and will require time, staffing and financial resources to address. Statistical producers are required to maintain ongoing compliance with the Code of Practice. If, in the course of these investigations, a statistical producer discovers a systemic issue in the administrative data that has a substantial adverse impact on the statistics, we encourage the statistical Head of Profession to contact the Authority to discuss appropriate action.

The Authority recognises that there are certain circumstances in which regular, systematic external evaluation, audit or inspection of the underlying data is essential to increase both the quality of, and public confidence in, statistics produced from administrative data. For statistics requiring higher levels of assurance, these external evaluations should be regular and repeated. In the absence of regular repeated external scrutiny (for those sets of statistics for which it is appropriate) the statisticians should highlight the deficiency for users and investigate whether there are other sources that could be used.

Administrative data underpinning official statistics can be subject to, or feature in, various kinds of audit, depending on their operational context, for example: financial, clinical, and statistical audit in which a sample of existing cases is investigated. These investigations can be used to provide context, and in some cases corroboration of the data quality. It is good practice to investigate whether these other types of audits could provide information to support the statisticians’ quality judgments, and to use them if appropriate. It is also recognised that in some cases these types of corroborating information are not available.

The steps to be taken by statistical producers need to go beyond a narrow interpretation of ‘quality assurance’; they should also encompass the working arrangements and relationships with other agents, particularly data supply partners. These working arrangements can range from straightforward data transfers from a single data supplier to more complex large-scale systems. In addition, some official statistics are based on administrative data provided by another statistical producer body.

For statistics based on administrative data obtained from outside the Government Statistical Service, examples include:

  • Directly from the organisation that records the data – in this case the statistical producer should engage with this organisation directly and take into consideration the detail and nature of information received from the data supplier when deciding on appropriate quality assurance. The responsibility for producing information about the quality of the statistics lies with the producer. In the case of large numbers of direct suppliers, the statistical producer might explore other ways of engaging, such as holding meetings to discuss common quality issues and liaising with information governance groups.
  • Via intermediary organisations – for example local authorities collect data and pass these to the producer – in this case the statistical producer should: 1) have an understanding of the entire data cycle and the quality assurance processes carried out by the original data supplier; and 2) engage with intermediary organisations to understand their quality assurance processes and standards.

For statistics based on administrative data collected by another government department or official statistics producer body:

  • In this case both the statistical producer and the supplier department have a responsibility to understand and communicate about the quality of the data. The producer with responsibility for publishing the statistics has the ultimate responsibility for assuring itself about the quality of the underlying data and communicating this to users. However, the statistical Head of Profession for the supplier partner body has a responsibility to ensure that it shares its quality assurance information with the statistical producers, including clear information about any limitations or bias in the data, and takes action to address any queries or concerns raised by the statistical producer. In cases where producer bodies share data for statistical purposes, or both work with the same data supply partners, we encourage such producers to work together to develop a better understanding of the quality of the administrative data, sharing intelligence and insight about the data with each other and with users.