Key Stage 4 performance tables 2017: Closing the gap just got harder

Today’s Key Stage 4 statistical release from the Department for Education [PDF] shows the gap between disadvantaged pupils and others narrowing slightly. This is based on the DfE’s gap index, which is calculated by ranking all pupils according to their English and mathematics grades and comparing the average rank of the two groups.

This would appear to be consistent with data comparing threshold measures – attainment of grade C or higher in 2016 compared with grade 4 or higher in 2017. Here, too, the gap has dropped slightly, from 27.5% to 26.9%.

Whilst this is good news, things look a bit different if we look at Attainment 8 and Progress 8. Calculations done previously by the DfE and by FFT predicted that overall Attainment 8 scores would drop because of the change to the 9-1 grading system – mainly caused by the change in points allocated to legacy (A*-G) GCSE subjects in 2017.

Perhaps most important though, given that it is the headline accountability measure, is Progress 8.

Delving into the underlying data (downloadable from the DfE website) we can find scores for overall Progress 8 and for each of the four elements within Attainment 8 – English, mathematics, English Baccalaureate (EBacc) subjects and other (open) subjects.

The chart below shows the gap between disadvantaged pupils and others in Progress 8 scores for 2016 and for 2017:

In this chart a gap of 0.5 indicates that, on average and taking account of Key Stage 2 attainment, disadvantaged pupils attained around half a grade lower than other pupils. This is roughly equivalent to four grades across 8 subjects.
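The arithmetic behind that equivalence can be sketched in a couple of lines (a minimal illustration, not part of the DfE's methodology):

```python
# A Progress 8 gap of 0.5 means disadvantaged pupils score, on average,
# half a grade lower per qualification than other pupils with similar
# Key Stage 2 results. Across the 8 subjects counted, that compounds
# to roughly four whole grades.
gap_per_subject = 0.5
subjects_counted = 8
total_grade_gap = gap_per_subject * subjects_counted
print(total_grade_gap)  # 4.0
```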

However, the two series are not directly comparable since they use different scales. In 2016, the familiar 8-1 scale for GCSEs was used, with one point separating each grade. In 2017, the 9-1 scale was used for English and maths, whilst legacy GCSEs were scored using a revised set of points scores. We wrote about it here.
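To see why the scales are not comparable, consider the points allocated to legacy grades in each year. The sketch below uses the 2017 allocations as published in DfE guidance (treat the exact values here as assumptions to be checked against that guidance):

```python
# 2016: the familiar scale, one point separating each grade.
points_2016 = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1}

# 2017: revised points for legacy (A*-G) GCSEs, aligned to the 9-1 scale.
points_2017_legacy = {"A*": 8.5, "A": 7, "B": 5.5, "C": 4, "D": 3, "E": 2, "F": 1.5, "G": 1}

# The same set of grades scores fewer points under the 2017 allocation,
# which is why overall Attainment 8 scores were expected to drop.
grades = ["B", "C", "C", "D"]
print(sum(points_2016[g] for g in grades))         # 20
print(sum(points_2017_legacy[g] for g in grades))  # 16.5
```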

Taking the data in the chart at face value, the gap overall has widened but only very slightly. It’s a different picture, however, when we look at individual elements:

  • in English, the gap has widened substantially;
  • in mathematics, the gap has widened;
  • in EBacc subjects, the gap has narrowed;
  • in other subjects, the gap has narrowed slightly.

This seems to contradict the narrowing of the gap shown in the DfE gap index and also in threshold measures. It also contradicts the modelling we did previously which suggested that the gap in EBacc and open elements would increase slightly because of the different values given to A*-G grades.

So, what is causing these apparent contradictions? In previous years the gap in these elements has narrowed, but this has been almost entirely due to a reduction in the entry gap – disadvantaged pupils taking more subjects that count, particularly in the EBacc element.

DfE data shows that the overall entry for EBacc has decreased slightly but that this is made up of an increase for disadvantaged pupils and a decrease for other pupils. It looks very likely, therefore, that a further closing of the entry gap has masked a widening of the gap caused by changes in points allocated to A*-G grades.

However, what about English and mathematics? As the DfE points out, the impact of the change to 9-1 grading on the gap index is likely to be small, because the index is based only on the ranking of grades. Threshold measures show only the percentage of pupils at or above a given grade, so they are not very sensitive to the overall grade distribution.
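The insensitivity of a rank-based index is easy to demonstrate. The sketch below is illustrative only – it is not the DfE's exact methodology, and the scores are hypothetical – but it shows that any re-scoring which preserves the order of grades leaves average-rank comparisons unchanged:

```python
# A toy gap index built from average ranks within the combined cohort.
def mean_rank(group, combined):
    ordered = sorted(combined)
    return sum(ordered.index(x) for x in group) / len(group)

disadvantaged = [3, 4, 5]
others = [5, 6, 7]
combined = disadvantaged + others

gap_before = mean_rank(others, combined) - mean_rank(disadvantaged, combined)

# Re-score every grade with a monotone transformation, roughly as the
# move from A*-G points to 9-1 points does: the ordering is preserved.
rescaled = lambda x: 1.5 * x - 1
gap_after = (mean_rank([rescaled(x) for x in others], [rescaled(x) for x in combined])
             - mean_rank([rescaled(x) for x in disadvantaged], [rescaled(x) for x in combined]))

print(gap_before == gap_after)  # True
```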

A detailed analysis of grade distributions will be needed to understand this better. But some plausible explanations are:

  • relative to the A*-G scale, the new 9-1 scale gives greater value to higher grades;
  • the changes in the nature of the examinations and syllabuses have a greater impact on disadvantaged pupils;
  • in English, the larger difference might be explained by the removal of the combined language and literature examination.

Does all of this matter? Should schools be concerned? If the changes seen in English and mathematics in 2017 are also seen when the 9-1 scale is used in EBacc subjects (2018 onwards) and other subjects (2019 onwards) then we might expect to see the Progress 8 gap widen further. This change might, of course, be masked if the entry gap continues to close.

For individual schools, trying to compare gaps in Progress 8 scores from one year to the next will be a minefield. Using something like the DfE gap index might give a better indication, but the methodology for that calculation – whilst appropriate for calculations at national level – can be misleading at school level.

So, it looks like much work will need to be done to help inspectors, governors and others understand that any comparison of changes in Progress 8 gaps over time will need to be done with considerable caution!


January 25th, 2018 | Exams and assessment, School accountability, School improvement

About the Author:

Mike Treadaway is an associate research fellow at Education Datalab, and was previously FFT’s director of research. A former teacher, lecturer and LA adviser, Mike has over 20 years of experience in working with education data in the context of school improvement. Since founding the FFT Data Analysis Project he has developed models for analysing pupil progress, has led the processing, matching and data analysis for the National Pupil Database, and has provided advice to the Department for Education to support the development of new school accountability indicators and analysis of national trends in school performance.

4 Comments

  1. Matthew Scott February 23, 2018 at 2:25 pm - Reply

Has there been any work done looking at the subjects that contribute to the open element? We would like to see changes in entries – what courses are contributing to the open element. The Attainment 8 score for the open element is higher than that of the open – wondering why – what subjects are increasing the attainment score?

    • Stephen Down March 2, 2018 at 9:57 am - Reply

      One of the reasons that I think the Att.8/Prog.8 score is higher for EBacc than Open (I assume that’s what you meant!) is because it has first pick of the results. So if a child has taken more than 3 EBacc subjects, the best three will go in the EBacc bucket, and any left over may go into the Other bucket (unless the child already has three better non-EBacc subjects in there) – that automatically gives EBacc an advantage over Other.
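The slot-filling logic described in this comment can be sketched as follows (hypothetical numeric grades; the double-weighted English and maths slots are omitted for brevity):

```python
# EBacc gets first pick of a pupil's EBacc-eligible grades; whatever is
# left over then competes with non-EBacc grades for the Open slots.
def fill_buckets(ebacc_grades, other_grades, n_slots=3):
    ebacc_sorted = sorted(ebacc_grades, reverse=True)
    ebacc_bucket = ebacc_sorted[:n_slots]
    # Remaining EBacc grades are pooled with non-EBacc grades for Open.
    leftover = ebacc_sorted[n_slots:] + sorted(other_grades, reverse=True)
    open_bucket = sorted(leftover, reverse=True)[:n_slots]
    return ebacc_bucket, open_bucket

# A pupil with four EBacc entries and three others: the fourth-best
# EBacc grade (5) still displaces a weaker grade in the Open bucket.
ebacc, open_ = fill_buckets([7, 6, 6, 5], [5, 4, 3])
print(ebacc)  # [7, 6, 6]
print(open_)  # [5, 5, 4]
```

Because the EBacc bucket is always filled from the best available grades first, the EBacc element will tend to score at least as well as Open whenever other factors are equal.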

    • Stephen Down March 2, 2018 at 10:54 am - Reply

      Just doing a bit of analysis on our GCSE results, and the subjects that are counting towards the EBacc slots, as a % of entries, are:
      Double science – over 95%
      Separate sciences, and geography – about 90%
      History – about 85%
      Computing – about 75%
      Languages – 65 to 75%
      (I have not included minority subjects like geology or languages typically taken by native speakers only)
      (Where a subject *might* be counting, eg a pupil with BCC in the EBacc slots but has 3 Cs, I have considered them all as counting)

      With the exception of double science, between 20% and 40% of entries in EBacc subjects could also be counting towards Open slots. Again, there’s a caveat around pupils with multiple grades the same, and this doesn’t consider whether they *need* those subjects or if they can fill the Open bucket without putting any EBacc subjects in and still get the same score, so there will be an element of over-estimating there, but it shows how the system is skewed in favour of EBacc scoring higher, if all other factors are equal.

      • Dave Thomson March 2, 2018 at 12:15 pm - Reply

        Hi Stephen. I did a very similar piece of analysis on the national data last year for an ASCL conference. Yes, assuming equal scoring between qualifications/ subjects, then A8 is skewed towards EBacc subjects by design. However, we know that grading in MFL tends to be more severe than other subjects. Around a quarter of entries in French and Spanish were too low to be counted in A8 (I did the same as you and counted tied grades). On average, 10% of all entries were not counted. This figure was much lower for core science, additional science, triple sciences, geography and history. And ECDL. Almost a quarter of GCSE computing grades not counted, which suggests severe grading. Will probably blog on this a bit more at some point.
