Measuring two-year retention post-16: what does it show?


Each year, the government of the day publishes performance indicators about schools and colleges. While ostensibly they provide the public (and particularly parents) with information, and so inform choice, they are also a lever to encourage the system to function in the way the government wants.

Indicators come and go as governments change. Plans for the year ahead, including details of any new indicators, are published in the annual performance tables statement of intent [PDF].

Of interest to us this year is a new indicator that will be published in the post-16 tables: a measure of the percentage of students “returned and retained”.

Briefly, this is the percentage of students starting Level 3 (A-Level and equivalent) study in Year 12, who go on to complete it in Year 13. Further detail is available in the 16-18 accountability measures technical guide [PDF].

We can understand why this measure has been introduced. Numerous correspondents have discussed with us the practice of some schools asking students to leave after Year 12, alleging that the schools are more concerned with league table standings than the interests of their students. In other words, there is a perverse incentive to encourage students to leave, in the race to improve indicators of attainment.

This is a difficult circle to square. Some schools are more selective in their Year 12 entrance criteria, with others more likely to give students on the margins of A-Level readiness a chance. Publishing a “returned and retained” measure may discourage the latter.

And, while the system could not cope with large numbers of students repeatedly changing their minds about which post-16 courses to study, some flexibility is required.

No-one would want students to persist with courses against their will, particularly if both the student and the school agree that a change would be in everyone's best interests. So a returned and retained rate of 100% is undesirable.

But how high is high enough, and how low is too low?

How can retention over two years be measured?

The technical guide contains a fair bit of detail on how the measure will work, but to keep this manageable I’ve created my own version.

Firstly, I look at students on roll in Year 12 in the October school census of 2014 who are taking at least one Level 3 (A-Level or equivalent) qualification. These students are said to be “returned and retained” if:

  • they enter at least one Level 3 qualification in summer 2016 at the same school; or
  • they are still on roll at the same school in the May school census of 2016.

The analysis is restricted to the (approximately) 2,000 schools with sixth forms. A small number closed entirely between October 2014 and summer 2016, so I exclude these from the analysis.
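The flag itself can be sketched in a few lines of pandas. This is a minimal illustration of the logic described above, not the actual National Pupil Database processing: the DataFrames and column names (`pupil_id`, `school_id` and so on) are all hypothetical.

```python
# Sketch of the "returned and retained" flag. Assumed inputs (all
# hypothetical): y12_2014 = Year 12 Level 3 starters from the October
# 2014 census; entries_2016 = summer 2016 Level 3 exam entries;
# census_2016 = pupils on roll in the May 2016 census.
import pandas as pd

y12_2014 = pd.DataFrame({
    "pupil_id": [1, 2, 3, 4],
    "school_id": ["A", "A", "B", "B"],
})
entries_2016 = pd.DataFrame({   # entered at least one Level 3 qualification
    "pupil_id": [1, 3],
    "school_id": ["A", "B"],
})
census_2016 = pd.DataFrame({    # still on roll in May 2016
    "pupil_id": [2],
    "school_id": ["A"],
})

def appears_in(starters, other):
    # True where the pupil appears at the *same* school in `other`.
    # (A real pipeline would de-duplicate `other` first.)
    merged = starters.merge(other, on=["pupil_id", "school_id"],
                            how="left", indicator=True)
    return merged["_merge"].eq("both")

y12_2014["retained"] = (appears_in(y12_2014, entries_2016)
                        | appears_in(y12_2014, census_2016))
print(y12_2014)
```

The key point the code makes explicit is the `school_id` in the join: a student who completes Level 3 at a *different* school still counts against their original school, which is exactly the behaviour that creates the incentive questions discussed above.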

Some 236,000 students began Level 3 courses in 2014 in state-funded mainstream school sixth forms out of a national cohort of 619,000[1].

Overall, 80% of students who embark on Level 3 study are “returned and retained”. However, this varies according to the Key Stage 4 attainment of students.

A look at retention by prior attainment

In the chart below, I have divided our Year 12 starters into 20 evenly sized bands, or vigintiles, according to a measure of Key Stage 4 attainment, Attainment 8. (Note that pupils without a 2014 A8 score are not included.)

The horizontal axis shows the maximum A8 value of pupils in each band. For example, the lowest 5% of pupils had A8 scores between 0 and 37.

It’s clear that there is a strong relationship between KS4 attainment and the probability of being retained for two years on a Level 3 course.

And, as we might expect, disadvantaged students (who tend to have lower KS4 attainment, even among those who progress to Level 3 study) are less likely to be “returned and retained” (70% compared to 82% of non-disadvantaged students).

A look at school-level retention

Now let’s look at schools. The funnel plot below shows the returned and retained rate for all schools with at least 20 Year 12 starters. The vertical axis has been centred on the national average of 80%, so the maximum possible value is 20 percentage points (above the average). The solid pink lines denote 95% control limits and the dashed lines 99.9% limits.
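For readers unfamiliar with funnel plots: the control limits for a school with n starters are typically computed from the binomial standard error around the national rate p, i.e. p ± z·√(p(1−p)/n), with z ≈ 1.96 for 95% and z ≈ 3.29 for 99.9% limits. This is a standard construction, sketched here; the post does not specify the exact formula used.

```python
# Funnel-plot control limits around the national rate, centred on
# zero as in the chart. Standard binomial approximation; z-values
# are the usual two-sided normal quantiles.
import math

P_NATIONAL = 0.80  # national returned-and-retained rate

def limits(n, z):
    half_width = z * math.sqrt(P_NATIONAL * (1 - P_NATIONAL) / n)
    return -half_width, half_width  # centred on the national average

for n in (20, 100, 500):
    _, hi95 = limits(n, 1.96)
    _, hi999 = limits(n, 3.29)
    print(f"n={n:>3}: 95% limit ±{hi95:.3f}, 99.9% limit ±{hi999:.3f}")
```

Because the half-width shrinks with √n, the limits form the characteristic funnel: a school with only 20 starters can sit 17+ percentage points from the average by chance alone, while a school with 500 starters should sit within a few points of it.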

As we can see by eye, far more than 0.1% of schools (it’s more like 20%) are plotted outside the dashed lines, with particularly high or low rates – that is, more than we would expect from chance alone. There are some schools (particularly small ones) where the rate is more than 40 percentage points below the national average.

However, schools vary in their intakes – something which can be adjusted for, factoring in students’ Key Stage 4 attainment and characteristics (gender, disadvantage etc.). The following chart shows the outcome of this.


Doing so results in more schools being plotted within the funnel, indicating the importance of taking intake into account when interpreting retention rates. But there remain more than a handful with particularly high or low rates that may be worthy of deeper investigation.
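One common way of making such an adjustment is indirect standardisation: each pupil's expected probability of being retained is taken from the national rate for pupils like them (by prior-attainment band, gender, disadvantage and so on), and a school's adjusted rate is its observed rate minus its expected rate, re-centred on the national average. The sketch below uses this approach with made-up band rates; the post does not state the exact adjustment method used.

```python
# Indirect standardisation sketch (hypothetical national rates by
# prior-attainment band; real adjustment would also use gender,
# disadvantage etc.).
import pandas as pd

national_rate = {"low": 0.60, "mid": 0.80, "high": 0.92}
NATIONAL = 0.80  # overall national returned-and-retained rate

pupils = pd.DataFrame({
    "school":   ["X"] * 4 + ["Y"] * 4,
    "band":     ["low", "low", "mid", "mid", "mid", "high", "high", "high"],
    "retained": [1, 0, 1, 1, 1, 1, 1, 0],
})
pupils["expected"] = pupils["band"].map(national_rate)

by_school = pupils.groupby("school").agg(observed=("retained", "mean"),
                                         expected=("expected", "mean"))
by_school["adjusted"] = by_school["observed"] - by_school["expected"] + NATIONAL
print(by_school)
```

In this toy example both schools retain 75% of pupils, but school X (a lower-attaining intake) comes out above average once intake is accounted for, while school Y comes out below it – which is precisely why more schools fall inside the funnel after adjustment.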

Concluding thoughts

So, what conclusions can we draw from this exercise?

If this measure is designed to identify schools which have poor rates of retention over two years, the main conclusion is that two of the common warnings attached to school performance indicators will apply here.

Firstly, schools with lower attaining intakes will tend to have lower performance.

And secondly, schools with the lowest rates will tend to be small.


[1] The majority of the remainder would have been attending other types of establishment (e.g. sixth form or FE colleges, special schools, independent schools). In addition, a relatively small number of students would have been taking Level 2 (GCSE and equivalent) courses in state-funded mainstream schools while another small group would not have been in education, even though participation was mandatory.
August 9th, 2017 | Post-16 provision, School accountability

About the Author:

Dave Thomson is Chief Statistician at FFT with over fifteen years’ experience working with educational attainment data to raise attainment in local government, higher education and the commercial sector. His current research interests include linking education and workplace datasets to improve estimates of adult attainment and study the impact of education on employment and benefits outcomes.
