The quest to find the ‘London Effect’ – why are some groups of pupils making more progress than they used to?

By Dave Thomson

A lot has been written in the search for a credible explanation for the improvement in attainment in London’s schools since the turn of the century.

It now seems that London’s schools have disproportionately benefited from improvements to the education system as a whole, with similar pupils and schools elsewhere in England improving by roughly similar amounts. But what has driven this improvement?

The Story So Far

The journalist Chris Cook reported in 2012 that disadvantaged pupils were doing well in London’s schools. A quest ensued to find out why.

Some initially attributed it to a range of policy initiatives that took place in the early part of the century: London Challenge, Teach First and the (City) Academies programme among others.

Simon Burgess at Bristol University thought otherwise. He showed that the “London Effect” can be explained away by pupil demographics and the effects of school composition. Put simply, London’s schools are doing just as well as similar schools serving similar pupils elsewhere in England, and no more.

In response, we (and Chris Cook) noted that there is a small “London Effect” when GCSEs alone are considered, even after pupil demographics are accounted for. And schools in London have tended to pursue more ‘traditional’ qualifications, eschewing the ‘equivalent’ non-GCSE qualifications entered in greater numbers elsewhere in England, particularly the north-east.

Then Jo Blanden et al. produced a very thorough study showing that the improvement in London began well before the introduction of the policy initiatives listed above. Blanden and her colleagues also agreed, to an extent, with Burgess that pupil demographics largely explain the level of performance in London. Demographics don’t, however, explain the improvement witnessed in London. So what does?

How rates of progress have changed part 1: pupil groups

Rates of progress from Key Stage 2 to Key Stage 4 have changed over time. These changes have affected some groups of pupils, and some types of schools, more than others. London has proportionately more of the pupil groups and school types that have benefited, and this has driven up performance in the capital at a faster rate than elsewhere.

To look at this, we went back to 2004 Key Stage 4 data, and produced statistical models of the relationships between KS4 outcomes and:

  • prior attainment (Key Stage 2)
  • pupil characteristics (ethnicity, free school meal eligibility, gender, age, etc.)
  • school characteristics (e.g. % pupils eligible for free school meals)

We then calculated predictions for the 2013 Key Stage 4 cohort (the same used by Simon Burgess in his analysis) based upon our 2004 models.

Then for each pupil, we calculated the difference between their actual outcome and their ‘predicted’ outcome. One point to note: the analysis we present here is based on a pupil’s mean grade in GCSEs only – we ignore equivalent, non-GCSE qualifications.
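The approach can be sketched in a few lines of code. This is an illustrative toy only, using simulated data and a single predictor: the actual models included the pupil and school covariates listed above, and the coefficient values here are made up.

```python
import numpy as np

# Illustrative sketch only: fit a model on a simulated 2004 cohort,
# then score a simulated 2013 cohort against it. The real analysis
# used pupil and school covariates beyond KS2 prior attainment.
rng = np.random.default_rng(0)

# Simulated 2004 cohort: KS2 prior attainment and mean GCSE grade
ks2_2004 = rng.normal(27, 4, 1000)
gcse_2004 = 0.2 * ks2_2004 + rng.normal(0, 0.5, 1000)

# Fit the 2004 model (intercept + slope on KS2) by least squares
X_2004 = np.column_stack([np.ones_like(ks2_2004), ks2_2004])
coef, *_ = np.linalg.lstsq(X_2004, gcse_2004, rcond=None)

# Simulated 2013 cohort, constructed to progress ~0.1 of a grade more
ks2_2013 = rng.normal(27, 4, 1000)
gcse_2013 = 0.2 * ks2_2013 + 0.1 + rng.normal(0, 0.5, 1000)

# Actual minus predicted: positive means progress above 2004 expectations
X_2013 = np.column_stack([np.ones_like(ks2_2013), ks2_2013])
difference = gcse_2013 - X_2013 @ coef
```

The mean of `difference` across a group of pupils is the quantity plotted in the figures below: how much more (or less) progress that group made in 2013 than the 2004 model would have predicted for them.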

The mean differences are presented for each ethnic group in Figure 1. Nationally, pupils in 2013 achieved, on average, 10% of a grade more in each GCSE entered than their counterparts in 2004. Differences are slightly larger in London for almost all groups.

But some groups – particularly those from black backgrounds – did much better. And such pupils are disproportionately found in London’s schools.
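The per-group figures are simply the mean actual-minus-predicted difference within each group. A toy illustration, with entirely made-up values:

```python
import numpy as np

# Toy illustration with made-up values: average the actual-minus-predicted
# differences within each ethnic group.
difference = np.array([0.30, 0.10, -0.05, 0.25, 0.12, 0.08])
group = np.array(["Black African", "White British", "Pakistani",
                  "Black African", "White British", "Pakistani"])

group_means = {g: difference[group == g].mean() for g in np.unique(group)}
print(round(group_means["Black African"], 3))  # (0.30 + 0.25) / 2 = 0.275
```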

Figure 1: Differences in progress Key Stage 2 to Key Stage 4 2013 compared to 2004 by ethnicity (mean GCSE grade)

We should highlight here that Pakistani pupils have improved the least. Indian, Chinese and mixed white/Asian groups have also improved by a less-than-average amount, though these groups are less of a concern as they were already high attaining and so were the least likely to improve further.

So why has the progress of some ethnic groups changed more than others?

How rates of progress have changed part 2: disadvantaged schools

We now repeat the same analysis, this time allocating schools to deciles based on the percentage of pupils eligible for free school meals. Figure 2 shows that progress has improved the most in disadvantaged schools. In fact, the rate of progress was lower in the least disadvantaged schools in 2013 than in 2004.
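Allocating schools to deciles amounts to cutting the free school meal (FSM) distribution at its own percentiles. A minimal sketch, using simulated FSM percentages rather than real school data:

```python
import numpy as np

# Illustrative only: assign 300 simulated schools to disadvantage deciles
# by their percentage of pupils eligible for free school meals (FSM).
rng = np.random.default_rng(1)
fsm_pct = rng.uniform(0, 60, 300)

# Cut points at the 10th, 20th, ..., 90th percentiles of the distribution
edges = np.percentile(fsm_pct, np.arange(10, 100, 10))

# Decile 1 = least disadvantaged, decile 10 = most disadvantaged
decile = np.digitize(fsm_pct, edges) + 1
```

By construction each decile contains roughly a tenth of schools; the analysis then compares mean progress differences across deciles.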

We leave to one side the issue of whether we’re observing the effect of genuine improvement, grade inflation or a bit of both.

Figure 2: Differences in progress Key Stage 2 to Key Stage 4 2013 compared to 2004 by school disadvantage decile (mean GCSE grade)


In 2013, 45% of pupils in London attended a school in the top three most disadvantaged deciles, compared to 17% of pupils in other regions.

Pupils from ethnic minority backgrounds were also disproportionately more likely to attend such schools, both in London and elsewhere.

This leaves us with a conundrum. Did the rate of progress of pupils from ethnic minority backgrounds improve because they went to the types of schools that improved the most? Or did the most disadvantaged schools improve because of pupils from ethnic minority backgrounds?

Speculate wildly

To try to narrow down an explanation, we can reprise Figure 1, but this time looking only at the most disadvantaged 30% of schools (Figure 3). This shows that pupils progressed by an extra third of a grade on average in each GCSE, comparing 2004 to 2013. Some groups even progressed by an extra half a grade on average. There is no clear London advantage for most groups.

This suggests that the improvement observed in London schools might have been driven more by improvements made by disadvantaged schools nationally, than by improvements in the performance of the capital’s ethnic minority pupils over and above that of their peers elsewhere in the country.

Figure 3: Differences in progress Key Stage 2 to Key Stage 4 2013 compared to 2004 by ethnicity (mean GCSE grade), most disadvantaged 30% of schools


So what has driven the improvement in disadvantaged schools both in London and elsewhere? In their conclusions, Blanden et al suggest it could be down to the relentless focus on performance and low attainment wrought by Ofsted inspections, the National Strategies and floor standards.

But there may be other explanations. What do you think?

December 17th, 2015 | Pupil demographics

About the Author:

Dave Thomson is Chief Statistician at FFT with over fifteen years’ experience working with educational attainment data to raise attainment in local government, higher education and the commercial sector. His current research interests include linking education and workplace datasets to improve estimates of adult attainment and study the impact of education on employment and benefits outcomes.
