Effective decisions require reliable information. Management information (MI) is aggregate information collated during the normal course of business to inform operational delivery, policy development or the management of performance.
The Office for Statistics Regulation recognises that the quality management actions (investigate, manage, communicate) are ongoing activities that deliver data and statistics to the required standards of trustworthiness, quality and value.
Timeliness is essential in using MI, so our guidance focuses on a basic level of assurance: being aware of the potential quality issues that may affect data, particularly data collected for operational purposes such as transactions and recordkeeping.
The Office for Statistics Regulation defines four areas of Quality Assurance (QA) practice:
- Operational context and data collection
- Communication with data supply partners
- QA principles, standards and checks applied by data supply partners
- MI producer’s QA investigations and documentation
Here are some examples of the practices you can use for the quality assurance of MI, shown for the three quality management actions:
| Investigate | Manage | Communicate |
| --- | --- | --- |
| Identify implications of collection arrangements for accuracy and quality of MI | Document data provision arrangements | Outline collection process and operational context of MI's source data |
| Identify safeguards taken to minimise risks to data quality | Review data needs annually with supply partners and MI users | Seek the views and experiences of MI users about the data |
| Identify results of suppliers' QA checks | Agree timing, definitions and format of data supply | Tell data suppliers about identified errors and record their response |
| Identify results of audits or inspections conducted on the source data | Coordinate the sign-off of datasets from suppliers | Describe the implications of the suppliers' QA arrangements for producing the MI |
| Identify strengths and limitations of the data used to produce MI | Conduct regular QA checks on the MI's source data | Summarise own QA checks on the source data and the likely degree of risk to the quality of MI |
Four practice areas for the QA of data
Gain an understanding of the environment of the data sources and the nature of the collection processes. Look for factors which might increase the risks to the quality of the data, such as the effects of targets and performance management regimes, the number of data collector and supplier bodies, and the information governance arrangements.
Effective relationships with suppliers stem from a common understanding. This understanding can be set out in a service level agreement or memorandum of understanding, and can include change management processes to ensure that MI needs are considered when changes are made to the collection systems.
Gain an understanding of the validation checks conducted by suppliers and other partners, and find out the results of those checks. These can include operational inspections of the data records and internal or external audits of the processes and systems.
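To make this concrete, here is a minimal sketch (in Python, using pandas) of the kind of automated validation checks that might be run on a supplied extract. The field names (`record_id`, `case_count`, `period`) and the specific checks are illustrative assumptions rather than part of this guidance, and would need to reflect the actual data supply and any agreed tolerances.

```python
# A minimal sketch of validation checks on a supplied extract.
# Field names (record_id, case_count, period) are illustrative assumptions.
import pandas as pd

def validate_source_data(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable issues found in the supplied data."""
    issues = []

    # Completeness: identifiers and counts should always be present
    if df["record_id"].isna().any():
        issues.append("missing record identifiers")
    if df["case_count"].isna().any():
        issues.append("missing case counts")

    # Uniqueness: an identifier should appear only once per reporting period
    duplicates = int(df.duplicated(subset=["record_id", "period"]).sum())
    if duplicates:
        issues.append(f"{duplicates} duplicate records within a period")

    # Validity: counts should be non-negative
    if (df["case_count"] < 0).any():
        issues.append("negative case counts")

    return issues
```

Recording the output of checks like these alongside each supply gives you results to compare against the suppliers' own QA reports.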
Use consistency and sense checks to consider whether the MI is meaningful and whether you can explain changes in trends and any discontinuities; for example, changes in target definitions and their implications for the MI.
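As a minimal illustration of such a sense check, the sketch below (assuming pandas and made-up monthly figures) flags any period-on-period movement that exceeds a tolerance, so that unexpected changes in trend or discontinuities are investigated before the MI is used. The 25% tolerance is an arbitrary example, not a recommended threshold.

```python
# A minimal sense-check sketch: flag period-on-period movements larger than
# a chosen tolerance so they can be investigated (for example, a change in
# target definitions). The tolerance and figures below are illustrative only.
import pandas as pd

def flag_discontinuities(series: pd.Series, tolerance: float = 0.25) -> pd.Series:
    """Return the periods where the value moved by more than `tolerance`
    relative to the previous period."""
    relative_change = (series / series.shift(1) - 1).abs()
    return series[relative_change > tolerance]

# Made-up monthly totals with a sharp jump in April 2020
mi = pd.Series(
    [1020, 1005, 990, 1430, 1010],
    index=pd.period_range("2020-01", periods=5, freq="M"),
)
print(flag_discontinuities(mi))  # flags 2020-04 (sharp rise) and 2020-05 (sharp fall)
```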
It is important to explain to your users any significant implications that data quality issues have for your MI or statistics. If you cannot find answers to your questions, it is also helpful to explain why you are still content with the data.
QAMI Questions
The lists below propose questions for each QA practice area, together with a final list to help you think about the use made of the MI and what your users need to know about quality. Use them as prompts to help you find out more about the data sources and any associated quality issues. These questions have emerged from the work of the Office for Statistics Regulation in assessing statistics produced by Government bodies.
Operational context and data collection
1. What data are collected and how?
2. Do suppliers use the same collection and coding systems?
3. What is the implication of the collection process for data quality?
4. Have there been policy changes that have changed the data collection requirements?
5. Does the use of performance measures provide perverse incentives in the collection of the data?
Communication with data supply partners
1. Do I understand how other partners (such as official statistics producers) use the data?
2. Am I happy with the consistency and data definitions?
3. How will I know if my data suppliers change their systems or processes? Are there any implications of the changes for the statistics?
4. What are the unmitigated risks to the quality of the data supplied? Can we put any more processes in place to mitigate the risks?
5. How confident do I feel about the effectiveness of the way I communicate with my data suppliers and other partners (including official statistics producers)?
QA principles, standards and checks applied by data supply partners
1. Do I understand what quality assurance arrangements my suppliers have in place?
2. How do I know that the data suppliers are routinely applying the quality assurance set out in the service level agreement?
3. Is there potential for selection distortion or bias? What are the tolerance levels for errors and outliers?
4. Do particular suppliers have poor track records for completeness or timeliness of supply?
5. Am I aware of any third-party data quality assurance or audit functions that occur during the data collection process?
MI producer’s QA investigations and documentation
1. Are data suppliers required to report quality indicators to us? Do we receive the reports and what do they show?
2. What do system validation checks show?
3. Have I sense-checked the data and investigated any differences found?
4. Am I checking the right things at the right point? Do all the QA steps still add value?
5. What are the implications for use of the data based on my QA findings?
Use of the MI and what users need to know about quality
1. Do our users know why and how the data are collected?
2. What is the best way to communicate the data supply process to our users?
3. Have we explained the risks to data quality to our users?
4. Do we publish MI based on the data?
5. Is the MI used to produce official statistics?
This guidance was moved out of PDF format onto this webpage on 23 November 2020. No changes were made to the content.