Multi-academy trust league tables: What can we learn from the data?

This morning the government published multi-academy trust (MAT) league tables, building on an approach it trialled last year.

At a headline level, two thirds of MATs had Progress 8 scores that were below average across the secondary schools which they run [PDF].

But what does the underlying data tell us?

In this analysis we’re looking at the largest secondary MATs – those with five or more schools included in the league tables, of which there are 23. (In the league tables, schools are only counted if held by a trust for three or more years.)
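The two filtering rules above (schools held for three or more years, then trusts with five or more such schools) can be sketched as follows. This is a minimal illustration using made-up data and assumed column names, not the DfE's actual file layout:

```python
import pandas as pd

# Hypothetical sample of school-level MAT data; the column names and
# values are assumptions for illustration, not the DfE's actual fields.
schools = pd.DataFrame({
    "trust": ["Trust A"] * 6 + ["Trust B"] * 3,
    "years_with_trust": [4, 5, 7, 3, 2, 6, 8, 3, 1],
    "progress8": [0.1, -0.2, 0.3, 0.0, 0.5, -0.1, 0.2, 0.4, -0.3],
})

# League-table rule: only count schools held by the trust for 3+ years
eligible = schools[schools["years_with_trust"] >= 3]

# Our selection: keep trusts with five or more eligible schools
counts = eligible.groupby("trust")["progress8"].count()
large_mats = eligible[eligible["trust"].isin(counts[counts >= 5].index)]

print(sorted(large_mats["trust"].unique()))  # → ['Trust A']
```

Here Trust B drops out: it runs three schools, but only two have been with the trust long enough to count.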

Firstly, there’s considerable within-MAT variation. Looking at United Learning, for example, while it has an overall Progress 8 score of +0.1, individual schools’ scores range from -0.56 to +0.93 – equivalent to pupils being more than half a grade lower than expected in all subjects at one school, and pupils at another school being close to a whole grade higher than expected in all subjects.
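The grade arithmetic here follows from how Progress 8 is defined: a score of X means pupils average X GCSE grades per subject above (or below) expectations. A short sketch of that reading, using the United Learning figures quoted above:

```python
# A Progress 8 score reads as the average number of GCSE grades per
# subject by which pupils exceed (positive) or fall short of (negative)
# what would be expected given their prior attainment.
def describe_p8(score: float) -> str:
    direction = "higher" if score >= 0 else "lower"
    return f"about {abs(score):.2f} of a grade {direction} than expected per subject"

# The extremes of United Learning's range, from the league tables:
print(describe_p8(-0.56))  # more than half a grade lower
print(describe_p8(0.93))   # close to a whole grade higher
```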

How does performance vary by type of school?

The real outliers in terms of under-performance are studio schools – a special type of academy, offering a more vocational education. We have previously written about accountability for schools which admit at age 14.

Besides this, as the chart below shows, it’s not the case that converter academies generally account for the best results within trusts. If anything, converter academies seem to have less variable (closer to zero) Progress 8 scores, with sponsored academies accounting for the more extreme values.

How does performance vary by length of time in a MAT?

We might also wonder how results vary by the amount of time with a MAT – we would hope, really, that those schools that had been with a trust for longer would be higher performing, if the trust was doing a good job.

Again, the picture is far from clear. It does seem like schools held for three to five years (those taken on most recently, in the data we’re looking at) are more likely to be the values pulling down the average for a trust – but wrapped up in this are different school types taken on at different times. (Converter academies only came into being in 2010, so all schools that have been part of a trust for seven or more years are sponsored academies.)

Overall, we think MAT league tables are a useful addition to the accountability data published by the DfE – but we’ll be putting out more analysis ourselves in the coming weeks on the variance seen within and between MATs.

January 19th, 2017 | Exams and assessment, School accountability, Structures

About the Author:

Philip Nye is a Researcher with Education Datalab, carrying out analysis and producing data visualisations. His particular research interests include academies and free schools, school finance, and Ofsted.
