Assessment of statistics on higher education, further education, apprenticeships and adult community learning in Wales

Published: 24 March 2026
Last updated: 24 March 2026

Findings

Leadership and integrity

In April 2025, Medr recruited a Director for Statistics and Analysis. As the organisation’s lead official for statistics, she is responsible for ensuring the statistics comply with the Code of Practice for Statistics (the Code) and is involved in all decisions around producing and developing statistics. She has established a good working relationship with the Welsh Government’s Chief Statistician and has sought advice on ways of working and on issues such as data sharing.

Because Medr has taken over responsibilities from both the Higher Education Funding Council for Wales (HEFCW) and the Welsh Government, the statistical team is made up of a mix of former HEFCW and Welsh Government staff. The role of the former Welsh Government statisticians has remained the same; they continue to be responsible for producing official statistics on further education, apprenticeships and adult learning.

In contrast, the role of former HEFCW analysts has changed; they are now responsible for producing the two higher education statistics. As HEFCW was not an official statistics producer, these analysts have had to learn the relevant processes and practices required for producing official statistics. It is good that the further education, apprenticeship and adult learning statistics team has supported and upskilled the higher education statistics team to help ensure that it is working in line with the Code.

Medr statisticians are linked into the government statistical community in Wales. They attend meetings for all official statistics producers in Wales, and there is a staff rotation agreement in place between Medr, the Welsh Government and the Welsh Revenue Authority.

The statistics teams are adequately resourced. While the further education, apprenticeship and adult learning statistics team has the same number of staff as in the Welsh Government, users told us they feel that the team works in a more proactive and more joined-up way with stakeholders within and outside Medr. Users were complimentary about the team’s knowledge and experience and welcomed the continuity from the Welsh Government.


Transparency and data management

The Medr statistics homepage includes a basic statistical publication schedule showing the date of the next publication and a link to the most recent publication. Publications are also announced on the Welsh Government’s statistics and research calendar. Medr said it will continue to announce publications through the Welsh Government calendar, as some users may not yet be aware of the transition to Medr and may still search for the statistics on the Welsh Government website. This is a sensible approach. We recommend that Medr adds a statement about the transfer of the statistics to its statistics homepage to highlight the change to users.

Until the Director for Statistics and Analysis joined Medr, the further education, apprenticeship and adult learning and higher education statistics teams led the engagement with Medr’s Senior Leadership Team (SLT). They worked with Medr’s SLT to raise awareness of the Code and Medr’s responsibilities as an official statistics producer, including expectations around pre-release access (PRA) and the use of official statistics in external communications. Medr’s pre-release access policy was developed following a session with SLT on the Code. Medr opted for 24 hours of PRA rather than the usual five days in Wales, and we heard that there have been no concerns with this approach within Medr. Medr publishes PRA lists for all statistics and reviews the list before every publication.

The statistical bulletins are clear about the status of the statistics. The bulletins for the four accredited official statistics transferred from Welsh Government to Medr draw attention to the removal of their accreditation and explain that OSR will assess Medr’s processes to determine whether the statistics meet the standards for accredited official statistics.

User satisfaction with the timeliness of the statistics varies across the different statistics. Several users we spoke to praised the frequency of the apprenticeships learning programmes started statistics. To support public monitoring and scrutiny of the Welsh Government’s progress towards meeting its target of delivering 100,000 apprenticeships, Medr started publishing monthly management information (MI) on apprenticeship starts in December 2025. These releases include headline figures for all apprenticeship starts and the more rigorous measure of progress towards Welsh Government’s target, and explain the difference between the MI and the official statistics. The series will end in June 2026, at which point the figures will give a reasonable indication of the final position for the Senedd term, which ends in May 2026. It is good that Medr recognised the user need for more-timely apprenticeships figures, and we support its transparent approach to releasing the MI.

The apprenticeships MI releases are presented impartially and objectively, with clear explanations to support users’ interpretation. However, we identified one high-profile instance of misuse of the statistics. In February 2026, a Welsh Government press release and a statement during First Minister’s Questions claimed that the Welsh Government had met its 100,000 apprenticeships target. These claims were based on the ‘all apprenticeship starts’ measure rather than the more rigorous measure, which showed only around 93,000 apprenticeship starts, and the distinction was not made clear. This approach was not consistent with previous reporting. In response, the chair of the UK Statistics Authority wrote publicly to the First Minister of Wales to highlight the Welsh Government’s lack of transparency and the risk that selective use of data poses to public confidence in the statistics.

Some users of the statistics on consistent performance measures for post-16 learning told us that the statistics are often published later than planned. For example, the 2025 release (covering the 2023/24 academic year) was published in May, later than planned, due to delays in finalising the post-16 data collection. The lack of timely consistent performance measures statistics has been raised publicly by others, including by Estyn, the education and training inspectorate for Wales, in its Further Education Early Insights report 2024/25: “…this issue is made more challenging as a result of external delays in the publication of validated post-16 consistent measures data for individual colleges and the sector as a whole”.

We welcome that Medr has published the statistics on consistent performance measures for post-16 learning covering the 2024/25 academic year in March 2026, back in line with the usual publication schedule. We consider that Medr should have been more transparent with users about the delay to the 2023/24 consistent performance measures statistics and the reason for the delay.

Several colleges that are both providers of further education data and users of the statistics on consistent performance measures for post-16 learning told us that the statistics are published too late to be useful for developing quality improvement measures for the following academic year. However, Medr told us that there is limited scope to improve the timeliness of the consistent performance measures statistics beyond the current timescales without increasing the burden on data providers. Because Medr takes a final ‘freeze’ of the Lifelong Learning Wales Record (LLWR) data at the end of December and needs time to quality-assure the data and prepare the statistics, February is the earliest that any statistics based on LLWR data can be published, assuming that the post-16 data collection is also available at the same time as the LLWR data.

There are also issues with the timeliness of the statistics on Welsh language in higher education. The statistics covering the 2022/23 and 2023/24 academic years were published in March 2025 and November 2025, respectively, over a year after the end of the reference period. Medr explained that two factors contributed to the delay: the introduction of Jisc’s Data Futures programme in 2023, which led to data quality issues and caused the Welsh Government to receive the 2022/23 data late; and the transfer of the statistics from the Welsh Government to Medr. Further detail on these data quality issues is provided later in this report.

The Welsh Government had intended to publish the 2022/23 statistics before the transfer, but this did not happen. Medr had to establish its own processes for handling the data and producing the statistics, which introduced a further delay to the 2022/23 statistics. Medr expects to publish the statistics covering the 2024/25 academic year in July 2026 and to follow a more-consistent publication schedule going forwards. It is good to see these planned improvements to timeliness, and we encourage Medr to continue exploring ways to further reduce the gap between the reference period and the publication of the Welsh-language statistics.

The publication of the 2025 further education, apprenticeships and community learning bulletin, covering the year 2023/24, was postponed shortly before the publication date due to a quality issue with the data on local authority community learning (see the data quality section for further details). Medr explained the delay in the bulletin once it was published, but the postponement was not announced promptly or prominently via Medr’s publication schedule, nor was a reason given at the time. As a result, users had no way of knowing that the statistics had been postponed. In future, Medr should always announce delays promptly and explain the reasons for them.

Medr has done a good job of quickly setting up the processes for accessing data and producing official statistics. The team told us that the initial challenges with the transition from the Welsh Government to Medr were technical, particularly with moving data across from Welsh Government. Medr has a data processing agreement with the Welsh Government that gives it access to all relevant data.

Medr has taken over responsibility for producing official statistics from the Welsh Government, but it is not yet the data controller for the LLWR and post-16 data collections. That will change when Medr’s full legislative powers come into effect in April 2026, although the LLWR data infrastructure and system will still be hosted by the Welsh Government. Discussions between Medr and the Welsh Government are ongoing around the longer-term future of the data collections.



Required policies and statements

The Standards for Official Statistics in the Code of Practice for Statistics edition 3.0, published in November 2025, require statistics producers to publish a number of organisational policy documents and statements relating to official statistics production. Many of these policies and statements are new requirements.

Medr produces some of the required policies and statements. For example, Medr has published a clear statement of compliance with the Code of Practice for Statistics that sets out how the statistics are produced in line with the core principles of Trustworthiness, Quality and Value (TQV). Each bulletin includes its own statement of compliance that outlines the key aspects of TQV. Medr has also published a revisions and corrections to statistics policy.

There are several policies and statements that Medr has not yet published, including a quality management approach, an annual statistical work programme and a public involvement and engagement strategy. As set out in our guidance for producers on moving to Code of Practice for Statistics edition 3.0, we expect producers to publish these documents within six months of the launch of Code 3.0. We recognise that Medr is still a relatively new statistics producer and needs more time to develop the remaining policies and statements.

Requirement 1: To be transparent about its approach to collecting data and producing and developing statistics, Medr should publish the following policies and statements by September 2026:

a) Release practice policy

b) Data management approach

c) Quality management approach

d) Annual statistical work programme

e) Public involvement and engagement strategy


Data quality

Below, we consider the quality and quality assurance (QA) arrangements for each of the three main data sources used to produce the statistics. We found many strengths, such as the robustness of QA arrangements and the close engagement with data providers. We also identified several areas for improvement, particularly in relation to the completeness of quality information.

Student record data

The implementation of the Data Futures programme has faced significant challenges and has impacted the timing and the quality of the data received by Medr (and previously Welsh Government). As we highlighted in our 2024 compliance review of Jisc’s higher education student statistics, Jisc worked closely with all higher education providers and statutory customers, including those in Wales, to support data submissions. It also rigorously quality-assured data to ensure that the statistics are fit for purpose.

Now that the Data Futures programme is in its third year, higher education providers have adjusted to the new data collection, and the major data quality issues have been addressed. However, the new collection continues to affect some Welsh higher education providers more than others. For example, we heard that the University of South Wales has found it challenging to implement the structure of the new data collection and to extract data using its IT system. The university was positive about the level of engagement and support it received from Medr and Jisc in addressing data quality issues and submitting data.

More broadly, there is regular engagement between the organisations involved in collecting and processing the student record data. The Medr and Jisc statistics teams meet weekly during data submission. In 2024, Medr set up a student record group with all Welsh higher education providers which meets several times a year to discuss quality issues and changes to the data collection. We endorse this proactive engagement with data providers.

Jisc’s QA arrangements for the student record data are robust, complex and continuously evolving. Jisc explained that there are two stages to the QA process: in-collection QA (data checking and validation carried out during the collection) and post-collection QA (data checking and validation carried out after the collection has finished). Higher education providers must submit and sign off their data by late October or early November. Due to the significant changes to the data model, post-collection QA for the 2022/23 and 2023/24 collections was extensive to allow Jisc to identify, investigate and resolve data quality issues.

Now that the data collection is more established, Jisc is finding fewer quality issues and has moved towards carrying out more in-collection QA. It has recently introduced changes to this stage of the process, including asking providers to share a wider narrative and context around the figures to help assess inconsistencies in the data. In addition, Jisc is part way through a programme to modernise its data publication, which involves building reproducible analytical pipelines (RAPs) to automate QA processes for all Jisc publications, including those based on the student record data. These improvements have enabled Jisc to identify and resolve data quality issues more quickly.

The Welsh-language data collection is part of the main student record data collection. Jisc collects data from all Welsh higher education providers on students’ Welsh language proficiency (in reading and writing) and the proportion of modules taught in Welsh. While Jisc manages the data collection and quality-assures student data, Medr is solely responsible for quality-assuring the Welsh-language element.

There is an additional QA process for the Welsh-language data, which is managed by Medr and carried out via Jisc’s Information Reporting Interface Service (IRIS). IRIS is a tool within Jisc’s data collection system that enables higher education providers to access, analyse and verify their data, particularly for end-of-year monitoring and funding calculations. All higher education providers receive an IRIS file when they submit data and cross-reference it with their internal data to check for inconsistencies. One university told us this process identifies different issues to Jisc’s QA process, such as discrepancies in the counts of modules available in the Welsh language.

Overall, users told us they are satisfied with the quality of the Jisc student record data and that the Medr statistics are produced to a level of quality that meets their needs. However, one expert user questioned the accuracy and reliability of data on Welsh-language speaking ability. When we spoke to users, only the statistics covering the 2022/23 collection had been published – the first since the rollout of Data Futures. Changes to the Welsh‑language speaking‑ability questions were particularly challenging for higher education providers to implement and negatively affected the quality of the data collected. It is possible that the ‘newness’ of the statistics based on the new questions may have shaped this user’s views on the quality of the statistics; nevertheless, given the expertise of this user, we gave significant weight to their view.

Medr said it has seen an improvement in the quality of Welsh-language data between the 2022/23 and 2023/24 collections and expects data quality to improve further as the new data collection becomes more established. It is good that it is continuing to monitor data quality and work with data providers to improve quality. However, issues remain. For instance, one university highlighted specific issues with collecting data on the proportion of Welsh-language teaching per module.

Despite assurances from Medr about continued improvement to the quality of Welsh-language data, we consider it is too early to judge whether these statistics meet the quality standards of the Code, and that further improvements are needed before we can make this judgement.

Requirement 2: To ensure public confidence in the Welsh language in higher education statistics, Medr should continue to work with higher education providers to improve data quality by March 2027.

Jisc’s published quality information about the student record data is comprehensive and covers all aspects of the data collection, data quality and QA. Its annually updated quality report provides an overview of data quality and explains the methods used to assess quality. Jisc also publishes a ‘known issues’ log which sets out each issue, whether it has been resolved or is still open and, where relevant, the university it applies to. This has included some Wales-specific quality issues, including those experienced by the University of Bangor and the University of South Wales. The level of transparency about data quality is exemplary.

Because Medr is the only organisation that publishes statistics on Welsh language in higher education, it is responsible for publishing information about the quality and methods of the Welsh-language element of the student record data. The Welsh-language in higher education bulletin and quality report include a helpful summary of the new data collection model and describe the most notable impacts on data quality. They are transparent about the break in the time series, the drop in data quality and the higher education providers that have been most affected by the new data collection. Prominent errors in the data are also noted.

However, the quality report lacks information about Medr’s QA arrangements for the data, including the IRIS process summarised above. Given the ongoing quality issues with the Welsh-language data, it is important that Medr explains to users how it has assured itself about the quality of the data. The quality documentation should also set out how Medr engages with and supports higher education providers to improve the quality of the data.

Requirement 3: To assure users about the quality of the statistics on Welsh language in higher education, Medr should describe its quality assurance arrangements for the statistics by March 2027.

Medr’s students in higher education bulletin and quality report contain limited quality information; they simply direct users to Jisc’s detailed documentation. To make the quality information more accessible and help users understand the key aspects of the data collection and data quality, we recommend that Medr include a summary of the most relevant points from the Jisc documentation, following the approach used for the Welsh-language in higher education bulletin and quality report.

Lifelong Learning Wales Record (LLWR)

The Medr statistics team and LLWR data team work together closely and have constructive relationships with further education, apprenticeship and work-based and community learning providers. For example, the statistics team held one-to-one meetings with learning providers shortly after Medr was established to refresh their understanding of the analytical requirements and Medr’s internal analytical processes. The local authorities we spoke to were positive about the level of engagement from Medr. They told us that the statistics team is supportive, responds quickly to data queries and has helped them improve their understanding of data quality. It is good that the LLWR data team is exploring restarting annual workshops with data providers (which were discontinued) to further strengthen its engagement.

The QA arrangements for LLWR data are robust, proportionate and well established. Data are continuously validated by the LLWR data team, and the statistics team carries out a monthly and annual data reconciliation process, depending on the statistics. For apprenticeships learning programmes started and adult community learning data, Medr takes a monthly extract and produces reports for learning providers. Providers check that the data match their records, and Medr follows up any errors or anomalies with providers.

All LLWR data used in the calculation of performance indicators also go through an annual reconciliation process in December, before the final ‘freeze’ of the data. This involves weekly reviews and close engagement with data providers. The data providers that we spoke to told us the reconciliation process works well and that Medr is clear about the checks it carries out. In addition, the LLWR data team undertakes ad hoc data quality improvement work, for example, when new data fields are introduced that require more-thorough validation.

The LLWR data team encourages providers to run their own data reconciliation throughout the year to identify and resolve errors and mismatches. Some providers, including one of the colleges we spoke to, do this, but not all do. Medr recognises that this is an area for improvement and is currently engaging with providers individually to help them set up a reconciliation process. We support Medr’s efforts to improve the consistency of data providers’ QA arrangements.

There have been issues with the quality of the local authority and adult community learning statistics. One local authority made Medr aware of substantial errors in its 2023/24 data; it had been submitting records to LLWR that should not have been included. Due to the potential scale of the issue, Medr removed the adult community learning component from the statistics on further education, work-based learning and community learning, and learner outcome measures for apprenticeships and adult community learning. This was a sensible decision, given the uncertainties in the data. Medr investigated whether other local authorities were affected and found that the issue was restricted to this one authority.

Medr later published updated versions of the two bulletins that included the local authority and adult community learning component. Medr worked closely with the affected local authority to identify the correct data for 2023/24. The bulletins are transparent about the nature, scale and impact of this quality issue. The further education, work-based learning and community learning bulletin highlights the break in the time series and advises users not to compare the 2023/24 local authority community learning figures with previous years. The adult community learning statistics (in the learner outcome measures for apprenticeships and adult community learning bulletin) remain comparable over time as the issue had a lower impact.

It is good that Medr is using the 2024/25 data to further assess the quality of community learning data. From April 2026, Medr will be using the LLWR to report and monitor against the delivery of the Community Learning Grant (the source of funding for community learning). This change is expected to lead to improvements in data quality; introducing a funding incentive linked to LLWR data submissions for community and work-based learning providers will help ensure that providers submit accurate and timely data.

Users told us they are satisfied with the quality of the LLWR data and that the statistics are produced to a level of quality that meets their needs. They said the information about the quality of the statistics, including strengths and limitations, is clear and helpful. All bulletins include prominent caveats that help users interpret the statistics, and more-detailed summaries of data sources and data quality are published in the accompanying quality reports.

However, we identified some gaps in the published information about quality and methods. The quality reports for statistics on learner outcome measures for apprenticeships and adult community learning; further education, work-based learning and community learning; and consistent performance measures for post-16 learning give a brief overview of Medr’s QA arrangements for the LLWR data, but the level of detail varies, and the information is inconsistent in places. The quality reports contain no information about learning providers’ own QA arrangements, or how Medr works with all learning providers to investigate and resolve data quality issues.

In addition, the quality report for the statistics on consistent performance measures for post-16 learning does not give a full account of the methods used to produce the statistics. A couple of users told us that they would like Medr to be more transparent about its methodology and any changes to the methodology. However, the quality report does explain how transfers of learners between programmes are handled.

There are also some gaps in the methods information for the apprenticeships learning programmes started statistics. The methods used to calculate the ‘more rigorous’ measure of apprenticeship programmes started, and the methodological differences between apprenticeships data sourced from the LLWR and from Jisc (for degree apprenticeships), are not fully explained.

Requirement 4: To help users understand all aspects of the quality of statistics based on Lifelong Learning Wales Record (LLWR), Medr should expand its quality and methods information to cover the end-to-end quality assurance process and the methods for producing the statistics on consistent performance measures for post-16 learning and apprenticeships learning programmes started by March 2027.

We recommend that Medr uses our Quality Assurance of Administrative Data (QAAD) framework to review and document QA arrangements. This will help the team identify and summarise the main risks to data quality. Given that the LLWR is used to produce several of the statistics covered by this assessment, Medr may want to consider producing one quality report about LLWR to avoid repeating the same information across multiple reports.

Post-16 data collection and other data sources

Together with LLWR, data from sixth forms (the post-16 data collection) are used to produce the statistics on consistent performance measures for post-16 learning. The Medr statistics team has a constructive relationship with the Welsh Government’s post-16 data collection team. We heard that the two teams are in regular communication throughout the collection and that technical queries are discussed as and when they arise.

There are some differences between the LLWR and post-16 data collections:

  • While there is a submission deadline for the post-16 data collection, the time when the collection is finalised varies from year to year due to some sixth forms submitting data after the end of the calendar year. In contrast, no LLWR data can be submitted after the final ‘freeze’ date in December. Medr told us that late submissions have affected the timeliness of the statistics on consistent performance measures for post-16 learning in recent years. For example, the post-16 data collection for 2023/24 was only finalised in March 2025, which meant that the consistent performance measures statistics for 2023/24 were published later than planned.
  • QA is mostly done through automated data validation built into the DEWi system used by schools and local authorities. Medr told us it works closely with the Welsh Government post-16 team after each collection to review and improve the automated data validation rules for the following year, and that this step is crucial for ensuring the quality of the data. The Welsh Government discusses requirements and specifications with software suppliers.
  • The data supply chain is more complex for the post-16 data collection. Local authorities submit data to DEWi on behalf of schools and are responsible for undertaking data validation and resolving errors with schools. The Welsh Government team primarily works with the local authority data officers, but we heard that there have been occasions when it has spoken directly to schools, for example, if there is a problem submitting data. The level of QA carried out by schools varies.

As with the LLWR, there is a data reconciliation process for the post-16 data collection. Local authorities submit data in November, and for the following eight weeks, data are sent back to schools for final sign-off. Any errors and anomalies are highlighted, and schools are asked to address these before local authorities submit finalised data by the cut-off date in January. As highlighted above, this cut-off date occasionally slips.

Overall, we have confidence in the quality of the post-16 data collection, but we are concerned about the timeliness of the data collection and the resulting impact on the timeliness of Medr’s statistics on consistent performance measures for post-16 learning. We think it would be helpful if Medr explained to users that the post-16 collection is one of the limiting factors for producing more-timely consistent performance measures statistics. We recommend that Medr works with the Welsh Government to improve the timeliness of the post-16 data collection.

The Welsh Government publishes some information about the DEWi system and data validation for the post-16 data collection through its technical completion notes and guidance. However, these documents do not explain local authorities’ role in data supply and QA. Moreover, the primary audience of these documents is the data providers (schools and local authorities). There is no general quality information available for users of the statistics; neither the consistent performance measures for post-16 learning quality report nor the bulletin includes a summary of quality. This is a significant gap. It is Medr’s responsibility to publish this information, now that it produces the consistent performance measures statistics. As for LLWR data, we encourage Medr to use our QAAD framework to review the end-to-end data process.

Requirement 5: To assure users about the quality of the post-16 data collection, Medr should publish a quality summary that covers all aspects of quality, including quality assurance arrangements, by March 2027.

Medr uses several other data sources to produce subsets of the statistics on consistent performance measures for post-16 learning and learner outcome measures for apprenticeships and adult community learning. The Pupil Level Annual School Census (PLASC) and the Matched Education Dataset (MED) are used to calculate learners’ history of free school meal eligibility, as a proxy for deprivation. For the consistent performance measures statistics, qualification attainment data for learners in sixth forms are obtained by matching them to the Welsh Examinations Database (WED). Medr does not explain the nature of these data sources, nor does it give an indication of the quality of the data. If this information is published elsewhere, Medr should signpost users to it.

Coherence of data sources

Several users expressed concerns about differences in the way the LLWR and post-16 collections record data on protected characteristics, including sex and ethnicity. One user told us that this lack of alignment is the ‘biggest issue with data in the tertiary sector in Wales’. In our view, the lack of coherence of sex data is the most problematic. The LLWR records the gender identity of learners, whereas the post-16 data collection records the legal sex of learners. The LLWR introduced an ‘other’ category around ten years ago to categorise learners who do not identify as male or female.

The consistent performance measures for post-16 learning bulletin presents achievement measures by a combination of sex and gender identity of learners. Data from LLWR and the post-16 data collection are combined ‘to provide a full picture of… outcomes between females and males’, and users are told to ‘note the differences in the variable collected when interpreting the analysis’. The statistics team took advice from the Welsh Government’s equalities units before deciding on this approach (the decision was made while the statistics were produced by the Welsh Government).

The bulletin explains how data are collected for each source, the changes to the data collection (for the post-16 data collection), and that the data are not collected consistently. However, neither the bulletin nor the quality report explains how the data are combined – specifically, how learners recorded as ‘other’ in the LLWR are treated. Without this information, it is unclear how users should note the differences when interpreting the statistics, as advised in the bulletin.

Our guidance on collecting and reporting data about sex and gender identity states that producers should be transparent about their methods and provide key information to support users when interpreting the statistics. It says that coherence is especially important for datasets that are linked together, and that producers must be clear on the terminology and definitions being used in their data collections.

Requirement 6: To enhance transparency about methods and support user interpretation of differences in sex and gender identity data in the statistics on consistent performance measures for post-16 learning, Medr should publish information about how it combines gender identity data from the Lifelong Learning Wales Record (LLWR) and sex data from the post-16 data collection and provide clear guidance for users by June 2026.

The Government Statistical Service (GSS)’s Harmonisation Team is currently developing a harmonised standard for sex and a new harmonised standard for gender identity. As set out in the Harmonisation Team work plan, it expects to complete this work by the end of 2026. Medr should continue to monitor these developments, and once the new standards have been published, it should look at what data collection approach is best to meet user needs.

The ethnicity of learners is also collected on a different basis, although this is not clear from the description in the consistent performance measures for post-16 learning bulletin. The LLWR uses the GSS harmonised ethnicity standard, whereas the post-16 data collection has different category breakdowns. As explained in the consistent performance measures quality report, Medr aggregates these into broad ethnic groups to enable comparison. Medr should outline any additional work it has carried out to align the two datasets.

Quality culture

We found that Medr has a strong quality culture. Statisticians told us they feel confident raising quality concerns with managers. When resolving issues, managers focus on learning lessons from issues and implementing measures to minimise future risks to quality. New team members are trained appropriately to ensure that they understand the required quality standards for the statistics.

While Medr does not yet have a formal process for sharing lessons learned, issues are routinely shared informally across statistics teams. Teams also maintain a breach and issues log that captures both process- and quality-related problems so that previous incidents and lessons are recorded. Desk instructions are updated regularly whenever issues arise, ensuring that improvements are embedded into working practices.


User engagement and development plans

We found that Medr maintains regular, open and collaborative engagement with users of the statistics inside and outside of Medr. The users we spoke to were consistently positive about their relationship with the Medr statistics teams. They praised Medr for responding promptly to their requests for specific breakdowns of the data. This user feedback highlights how Medr is putting users at the centre of decision-making and maintaining a dialogue with users, which ensures that the statistics continue to be relevant.

Governance differences with the Welsh Government mean that Medr’s internal user base is more data focused. Medr analysts are accountable to a board with a strong interest in data, creating greater demand for timely statistics and analysis. To meet these expectations, Medr has strengthened its internal user engagement. For example, it now provides a six-week ‘forward look’ of upcoming outputs to support planning and prioritisation. Statistics teams engage more proactively with internal users than was the case in the Welsh Government, while balancing this increased demand with business-as-usual work and external user needs. For example, Medr engages regularly with its Quality Committee and internal teams to understand their analysis needs, producing bespoke dashboards, summary papers of key statistical publications and findings, and a wide range of tailored analysis beyond routine outputs.

The Medr statistics teams also run an internal ‘Know Your Stats’ session after each statistics publication, where the key findings are presented and Medr staff can ask questions about the statistics. These sessions were very well received by the users that we spoke to. They exemplify clear communication with users, helping them understand and interpret statistics appropriately.

Medr’s external user engagement focuses on providing bespoke datasets and analysis for a wide range of stakeholders. Users told us that this supports high‑profile submissions to Senedd committees and helps maintain strong relationships with the Welsh Government, local authorities and other data providers. Routine communication with external users, via email updates after each publication and responding to mailbox enquiries, helps ensure broad awareness of the statistics.

For the Welsh language in higher education statistics, engagement with Coleg Cymraeg Cenedlaethol (the Coleg) – the national body responsible for developing and promoting Welsh-medium and bilingual study and training opportunities across the tertiary sector – is especially important. The Coleg provides statutory advice to Medr regarding the Welsh language. There is an annual data‑sharing agreement in place that underpins ongoing collaborative work, including the development and implementation of a national plan for the Welsh language across the tertiary education sector. The higher education statistics team told us it plans to expand its engagement with external users, including by developing closer links with Universities Wales, the membership body representing Wales’ universities.

As part of its Operational Plan, Medr intends to consult stakeholders on performance indicators for the tertiary education sector, which are expected to be published as official statistics. By actively involving users in the development of the statistics, Medr is ensuring their relevance to changing policy contexts.

There are several policies required by the Code of Practice that Medr has not yet published (see the section on required policies and statements). These include an annual statistical work programme. Several users told us they would like to see a statistical development plan with clear timelines to help them monitor progress, manage expectations about when developments will be delivered, and facilitate planning. Another specific requirement of the Standards for Official Statistics is to develop and publish a public involvement and engagement strategy. The strategy is a good opportunity for Medr not only to demonstrate its commitment to engaging with users, but also to highlight the breadth of its engagement activities and the wide range of users it works with.

One user suggested that Medr establish a data user group which brings together colleagues from across the sectors that use Medr data. They said such a group existed in the past and was a useful forum for communicating information and discussing ideas. We encourage Medr to consider setting up a cross-sector data user group, in addition to existing user engagement activities, to help it gather diverse perspectives and ensure the statistics meet a broad range of user needs.

Clarity and comprehensiveness

Many users told us that the presentation of the statistics is clear and informative, and that the statistics meet their needs. The statistics are presented impartially and objectively. In general, the statistical bulletins tell a coherent story and include good data visualisations and helpful advice on using the statistics, which aids user interpretation.

We, and users, identified some specific strengths of individual bulletins that demonstrate the value of the statistics:

a) The interactive dashboard for the apprenticeships learning programmes started statistics was frequently cited by users as a valuable tool, enabling them to quickly locate relevant information. Users appreciate the new analysis that has been added, including apprenticeship starts by deprivation, which offers insights into socio-economic disparities. The dashboard and bulletin allow users to assess progress toward the Welsh Government’s target of creating 100,000 all-age apprenticeships, and introduce a specific measure to monitor apprenticeship starts aligned with this objective.

b) The consistent performance measures for post-16 learning bulletin gives a holistic account of how post-16 education (in both sixth forms and colleges) is performing, both in terms of volume of learners and quality of outcomes. Users were positive about the data visualisations included in the bulletin, particularly the flow diagrams illustrating subsets of students with certain grades.

c) Users like the Welsh language in higher education bulletin’s transparency about comparability issues and distinctions between fluent Welsh speakers and all students.

d) The learner outcome measures for apprenticeships and adult community learning bulletin provides a clear and insightful overview of key outcomes, including success rates and differences by level, sector, deprivation, ethnicity and Welsh-language medium. The statistics help users measure performance against targets and ensure accountability for public funding.

e) The further education, work-based learning and community learning bulletin presents nuanced analysis of quality, participation and trends for diverse user groups, and provides a comprehensive picture of the tertiary education and training landscape in Wales.

The users that we spoke to also identified some potential improvements to the presentation of certain statistics. Several users said that the consistent performance measures for post-16 learning bulletin would benefit from additional explanation of trends, contextual factors and key definitions to support interpretation. One user would like Medr to provide clearer explanations of the differences between apprenticeship measures in the apprenticeships learning programmes started statistics. Making these changes would enhance clarity and reduce the risk of misinterpretation.

We consider that there is scope to develop the students in higher education bulletin to make it more engaging and insightful for users. Several users suggested adding charts, downloadable data tables and other visualisations to make trends easier to interpret. However, the team works to a tight publication schedule, releasing the bulletin as soon as possible after Jisc publishes its core students in higher education statistics, which limits the time available to incorporate additional content.

Medr is continuing to review the format and content of all bulletins, including the balance between data and commentary. This shows Medr’s commitment to continuous improvement and will help ensure outputs are tailored to user needs. Medr should consider the suggestions we heard from users as it develops all statistical bulletins.

Users highlighted some data gaps in the statistics on consistent performance measures for post-16 learning, apprenticeship learning programmes started, and Welsh language in higher education. For the consistent performance measures statistics, users requested better data on the progression and destinations of learners. Our 2020 review of the public value of statistics about post-16 education and skills made a similar recommendation: to continue to develop the destinations statistics to provide more detailed analysis for users. It is good that Medr published the latest statistics on consistent performance measures learner destinations in December 2025, which fills this data gap to some extent. The destinations statistics are not in scope for this assessment.

Similarly, users expressed a desire for more-detailed data on destinations of those who have completed apprenticeships. Again, this echoes the findings of our 2020 review, which recommended that the Welsh Government (which published the statistics at the time) carry out user research to consider what further information they can provide on destinations. We encourage Medr to engage with users to address this data gap.

Users also highlighted gaps in contextual and trend data for the statistics on Welsh language in higher education. They requested measures such as the proportion of students studying through Welsh, relative to the Welsh-speaking population, inclusion of language ability in demographics, and year-on-year changes to track progress – particularly as some providers are expanding provision from a low base. Users also called for clearer differentiation between students studying Welsh as a subject and those with Welsh-medium components. We recommend that Medr fill these data gaps to enhance the relevance of the statistics.


Accessibility

In general, the users that we spoke to were positive about the accessibility of the statistics. They said the statistics are prominent on the Medr website and it is straightforward to access them.

However, the functionality of the Medr website is limited. For example, only the most recent release can currently be accessed via the publication schedule; it is not possible to access previous releases. Medr told us it is planning to further develop the publication schedule and introduce statistics landing pages, when resource allows. We agree that this work should be taken forward to enhance the accessibility of the statistics and help users navigate release times and publications. More broadly, Medr should ensure the website complies with government accessibility guidelines.

Requirement 7: To ensure users can access previous releases of the statistics, Medr should prioritise developing statistics landing pages by March 2027.

Medr’s data and analysis homepage brings together information about the different data collections that Medr uses to produce statistics, including the LLWR and post-16 data collections. The page lacks structure and is therefore difficult to navigate. To help users find relevant information, we recommend creating a separate page for each data collection or type of analysis.

All bulletins are published as an accessible PDF only. To enhance the accessibility of the statistics for all audiences, we recommend that Medr works towards introducing an HTML version of all bulletins as soon as practicable. We appreciate that Medr’s resource for web development is limited and that this may take some time.

One academic user highlighted delays in the availability of LLWR record-level data through the SAIL Databank, noting that it took around five years for the dataset to be updated. However, they recognised that the data have now been successfully deposited and can be accessed for research purposes. Medr told us that it is committed to updating the SAIL dataset annually, which will support ongoing research and help users maximise the value of the data.

Some further education providers noted the benefit of Medr sharing code, which enabled their IT teams to replicate Medr’s reports internally. Others said it would be helpful for Medr to share its code for producing the figures. We encourage Medr to share code with education providers consistently to support their internal reporting and analytical capability.

Medr publishes many breakdowns of the statistics. For example, for the apprenticeships learning programmes started statistics, Medr provides breakdowns by gender, age, ethnicity, disability, sector and region. Various users said that they would like Medr to publish further breakdowns, particularly for the statistics on consistent performance measures for post-16 learning and apprenticeships learning programmes started. For the consistent performance measures statistics, users would like breakdowns by provider level. For the apprenticeships learning programmes started statistics, users would like breakdowns of degree apprenticeships by subject or course. To meet user needs for more-disaggregated data, Medr should explore the feasibility of publishing further breakdowns of the statistics on consistent performance measures and apprenticeships learning programmes started.
