Annex A – Analysis of user questionnaire
Introduction
The aim of the KS4 user survey was to collect information from a wide range of users to supplement the information that we had gathered from our user interviews.
The KS4 user survey went live on the OSR website on 2 February 2022 and was open until 31 March 2022. The survey consisted of eight questions about the KS4 statistics and was designed to be answered by a range of users. The survey used a non-random sample; we aimed to gather responses from as wide a range of users as possible. The questionnaire was publicised on the OSR Twitter account, and the link was included in articles by the Times Educational Supplement (TES) and Schools Week. We also contacted around 170 potential users to inform them of the survey.
By the time the survey closed we had received 108 responses. During data cleaning, eight responses were removed because they were test entries or duplicates, leaving 100 responses for analysis. This annex presents the number of responses we received for certain questions but does not include potentially identifiable responses.
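As an illustrative aside, the cleaning step described above can be expressed as a short script. The sketch below is a minimal example in Python with pandas, assuming a hypothetical file name and hypothetical respondent_id and is_test_entry columns; it is not the actual processing used for the survey.

```python
import pandas as pd

# Hypothetical file and column names, for illustration only.
responses = pd.read_csv("ks4_survey_responses.csv")

# Drop duplicate submissions, keeping the first occurrence per respondent.
responses = responses.drop_duplicates(subset="respondent_id", keep="first")

# Remove entries flagged as tests (assumes a boolean is_test_entry column).
responses = responses[~responses["is_test_entry"]]

print(f"{len(responses)} responses retained for analysis")
```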
Question analysis
Question 1 – How do you usually access or view the DfE KS4 performance statistics? (tick all that apply)
The majority of respondents accessed the statistics through the GOV.UK website; a large proportion also read the release and created their own tables through Explore Education Statistics (EES).
How do you usually access or view the DfE KS4 performance statistics? | Responses |
---|---|
Gov.uk website | 71 |
DfE website – Explore Education Statistics (read the release) | 59 |
DfE website – Explore Education Statistics (create your own tables) | 50 |
The media | 13 |
Someone else provides them to me | 8 |
Social media | 4 |
Other (please specify) | 12 |
Question 2 – Which of the statistics or datasets do you use?
The majority of respondents used Attainment 8, the headline facts and figures and Progress 8. A large number of respondents also used achievement of grade 5 and above in English and Maths, attainment broken down by personal characteristics, and the disadvantage gap.
Which of the statistics or datasets do you use? | Responses |
---|---|
Attainment 8 | 86 |
Headline facts and figures | 85 |
Progress 8 | 83 |
Achievement of grade 5 and above in English and Maths | 77 |
Attainment broken down by personal characteristics | 74 |
Disadvantage gap | 73 |
Local Authority tables | 58 |
EBacc entry and subjects entered | 56 |
Institution level data available through Compare School and College Performance (CSCP) service | 41 |
KS2 to 4 Transition Matrices | 36 |
Underlying data [please specify] | 34 |
Multi-academy trust data | 23 |
Other [please specify] | 8 |
The majority of answers provided for ‘Other’ related to users downloading the underlying data to conduct their own analysis using variables such as Special Educational Needs (SEN), Ethnicity, and Free School Meals.
Question 3a – What do you use the data for?
The majority of respondents used the data to explore differences in attainment between groups of pupils, to understand trends in educational attainment and to compare schools in a single local authority. A large proportion also used the data to compare their own school(s) to others, to explore progress made by pupils between KS2 and KS4 and to communicate the statistics to others.
What do you use the data for? | Responses |
---|---|
Exploring differences in attainment between groups of pupils | 78 |
Understanding trends in educational attainment | 69 |
Comparing schools in local authority | 62 |
Comparing own school(s) to others | 59 |
Exploring progress made by pupils between KS2 and KS4 | 56 |
Communicating the statistics to others | 53 |
Comparing attainment between local authorities | 42 |
Further research | 32 |
Making choices about schools | 12 |
Holding your/ your child’s school to account | 10 |
Other [please specify] | 8 |
Question 3b – Please provide further information on your reasons for this use
Users reported a wide range of reasons for using the KS4 data, common reasons included:
- Used by central and local government (including LAs) to inform decision making and monitor progress
- Used by schools to monitor their own performance
- Used to compare schools against other schools and at LA and national level
- Used by companies who provide advice or information to schools
- Used by researchers to conduct their own analysis, for example into inequalities.
Question 4a – How satisfied are you with the following aspects of the statistics?
Respondents gave the highest ratings on average for the level of quality, how clearly the statistics are presented, and impartiality. Respondents gave the lowest ratings on average for the ability to build your own tables and charts, advice about strengths and limitations, and ease of access to raw data for reuse.
How satisfied are you with the following aspects of the statistics? | Average score /10 |
---|---|
Level of Quality | 6.5 |
How clearly the statistics are presented | 6.5 |
Impartiality | 6.4 |
Use of pre-populated tables and charts | 6.2 |
Ease of access to the statistics | 6.2 |
Description of trends/ patterns (known as commentary) | 5.8 |
Information on the methodology | 5.5 |
Ability to build your own tables and charts | 5.3 |
Advice about strengths and limitations | 5.2 |
Ease of access to raw data for reuse | 5.1 |
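For readers reproducing this kind of summary, the average scores in the table above appear to be the mean of each rating out of 10 across respondents. The sketch below is a minimal illustration in Python with pandas, using made-up example ratings and hypothetical column names rather than the real survey responses.

```python
import pandas as pd

# Made-up example ratings: one 0-10 satisfaction score per respondent per aspect.
ratings = pd.DataFrame({
    "Level of quality": [7, 6, 7, 6],
    "Ease of access to raw data for reuse": [5, 4, 6, 5],
})

# Mean score per aspect, rounded to one decimal place as in the table above.
average_scores = ratings.mean().round(1).sort_values(ascending=False)
print(average_scores)
```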
Question 4b – Please provide any comments that can help us understand your rating of these aspects of the statistics
Users reported a wide range of reasons for their ratings. Common reasons included:
- Concerns over the publication of 2022 data and whether the limitations will be clear
- The build your own table functionality in EES – some users were in favour of this and felt it gave them more flexibility, others found the platform difficult to use and reported that not all of the data/variables they needed were available on there
- Lack of contextual information for the headline accountability measures including information on other factors that influence academic performance
- Issues with the data
- Understanding of differences between the statistics produced by DfE and those produced by JCQ and Ofqual
- Lack of consistency in the data structure and variable names year on year in datasets
- Lack of clarity and availability of trust level and Alternative Provider school data
- Several users said that both the presentation of the statistics and the commentary around them had improved over the last few years
- Users also raised wider issues around the context of the statistics, including:
  - The focus on EBacc subjects, which can result in students being pressured to take these subjects, as well as issues around the weighting and grouping of the subjects
  - The use of the statistics to compare schools
  - The pressure they add to schools and teachers
Question 5 – What improvements to statistics on this topic would help you and why?
Respondents had a wide range of suggestions for improvements to the statistics, including:
- Improved metadata for the data files, and being able to easily see which variables are in the files and their descriptions before downloading
- More contextual information on socioeconomic factors
- More consistency year-on-year of the variables made available to users
- More information on the methods used including data linkage and Progress 8
- Additional breakdowns of the data such as ethnicity, the ability to combine several vulnerability variables and information on each specific subject
- Easier access to LA-level data, including being able to select all the LAs at once to download
Question 6 – Is there anything else that you would like to tell us about these statistics?
The responses to this question were mixed, with some respondents highlighting positive aspects of the statistics such as the improvements made over the last few years, the clarity of the main points and summary tables, and the impartiality of the presentation. Other respondents were concerned about the decision to publish Progress 8 in 2022.
Questions 7a-c – Contact with DfE
Users were asked a number of questions about their contact with DfE. Almost half of users (43) had raised queries with DfE about the statistics.
Question 7d – Satisfaction with DfE contact
In general, users were not highly satisfied with how DfE responded to them, with an average score of 4.9/10.
 | Average score /10 |
---|---|
7d. How satisfied were you with how DfE responded to you? | 4.9 |
Question 7e – Please provide any comments that can help us understand your rating
Some users reported that DfE had been quick to respond to their queries and that the information provided had been useful. Other respondents raised several issues with their interactions with DfE, including:
- Long delays between responses or no response at all
- Difficulty in getting data corrected after it has been raised with DfE
- Queries being passed to multiple different teams or people
- No ability to speak to DfE other than through email
- Some users reported having to raise an FOI request to obtain the data they had asked for
Question 8 – Work sector
Users indicated the work sector that best describes the work they do (note – some users selected more than one). Most respondents classified themselves as a teacher/ head teacher/ governor/ trust leader or as part of local government.
Work sector | Count |
---|---|
Teacher/ head teacher/ governor/ trust leader | 35 |
Local government | 25 |
Parent | 11 |
Academic/ researcher | 11 |
Central government | 8 |
Charity/ think tank | 8 |
Other [please specify] | 15 |