Today Policy Exchange have published a lengthy report, ‘A Rising Tide’, claiming competitive benefits from free schools. The premise is that neighbouring schools have improved their results in response to the threat of losing pupils to a new free school. It is certainly possible that this has happened. Any headteacher at an undersubscribed school will understand the enormous pressure to improve headline exam results in order to recruit children in a competitive market.

But we are unconvinced that Policy Exchange have demonstrated that free schools have successfully raised standards at nearby schools.

The problem they face is that not enough free schools have opened each year to warrant the sub-group analysis that they carry out, analysing each cohort of new openers separately and investigating impacts across many different types of nearby school. (16, 19, 35 and 22 primaries and 5, 20, 28 and 26 secondaries opened in 2011–2014, respectively.)

With such small sample sizes, it is important that analysis considers the changes that might have arisen by chance (i.e. statistical significance). Policy Exchange perform no statistical analysis of the data they present.
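To illustrate how easily a small group of schools can show an apparent effect by pure chance, here is a minimal simulation. All the numbers in it (20 schools, year-on-year noise of up to ±5 percentage points) are hypothetical assumptions for illustration, not figures from the report:

```python
import random

random.seed(1)

# Hypothetical illustration: 20 nearby schools each record a year-on-year
# change in their headline pass rate that is pure noise, drawn uniformly
# between -5 and +5 percentage points. The true effect is zero.
def mean_change(n_schools=20):
    return sum(random.uniform(-5, 5) for _ in range(n_schools)) / n_schools

# Repeat the "study" many times and see how often the group average
# looks like a meaningful effect even though none exists.
trials = [mean_change() for _ in range(10_000)]
share_big = sum(abs(t) >= 1 for t in trials) / len(trials)
print(f"Trials where |average change| >= 1 percentage point: {share_big:.0%}")
```

Even with no real effect at all, a non-trivial share of such small samples will show an average change of a percentage point or more in one direction, which is why a significance test matters.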

The general rule of thumb with small sample sizes is not to make the data work too hard: simply test whether there is an overall average effect across all nearby schools, and accept that there is insufficient data to do anything more sophisticated. Otherwise, it is easy for analysis to become a data mining exercise, endlessly cutting and re-cutting an already tiny sample until the desired result is found.

Policy Exchange find no overall effect of free schools opening on the standards of nearby schools.

Having found no overall effect, they claim that underperforming/deprived/undersubscribed nearby schools (in data terms these are all almost the same thing) have improved in response to free schools opening. But all schools like this have improved at a somewhat faster rate than other schools over the course of this parliament. There are various reasons for this:

  • Threshold performance measures mean that already high-performing schools have fewer pupils available to take over the 5+ A*-C at GCSE or Level 4 at Key Stage 2 threshold.
  • Regression to the mean (if a school has outlier results on its first measurement, it will tend to be closer to the average on its second measurement)
  • More deprived schools have received considerable additional money from the pupil premium over this Parliament
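The regression-to-the-mean point above can be demonstrated with a short simulation. It assumes, purely for illustration, that each school has a fixed underlying pass rate plus independent yearly measurement noise (all parameters below are invented):

```python
import random

random.seed(42)

# Each school has a fixed "true" pass rate plus independent yearly noise.
schools = []
for _ in range(1000):
    true_rate = random.gauss(55, 8)        # underlying performance, %
    y1 = true_rate + random.gauss(0, 5)    # year 1 = truth + noise
    y2 = true_rate + random.gauss(0, 5)    # year 2 = truth + fresh noise
    schools.append((y1, y2))

# Select the bottom quartile on year 1 results and re-measure in year 2.
schools.sort(key=lambda s: s[0])
bottom = schools[:250]
avg_y1 = sum(s[0] for s in bottom) / len(bottom)
avg_y2 = sum(s[1] for s in bottom) / len(bottom)
print(f"Bottom-quartile average: year 1 = {avg_y1:.1f}%, year 2 = {avg_y2:.1f}%")
```

The bottom quartile "improves" between the two years with no intervention whatsoever, simply because schools selected for a bad year 1 were partly unlucky, and the luck does not repeat.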

In their report, they do compare these underperforming nearby schools to other schools in the bottom performance quartile nationally. But given the small sample sizes, and a policy that has not been implemented at random, we have no way of knowing whether this is a good comparison.

More seriously, their data consistently shows that mid- to high-performing nearby schools deteriorate relative to similar schools nationally. What plausible reason could there be for this ‘finding’, of which they make nothing in their report because it is rather inconvenient? And, of course, nor should they make anything of it, because all their analysis suffers from small sample sizes, no indication of statistical significance and regression to the mean.

I wish they had pooled their data to look at impact after one year and after two years, regardless of year of opening. This is what we did in some work looking at the social composition of free schools. It makes the statistical modelling a little more complex, with a corresponding loss of easy-to-understand charts, but I think the trade-off is worth it.
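As a sketch of what pooling buys you, the cohort sizes quoted earlier can be combined by years-since-opening rather than analysed by opening year. The assumption that outcome data runs to 2015 is mine, for illustration:

```python
from collections import defaultdict

# Cohort sizes from the figures quoted above:
# (opening year, primaries opened, secondaries opened)
cohorts = [(2011, 16, 5), (2012, 19, 20), (2013, 35, 28), (2014, 22, 26)]

# Count how many free schools contribute an observation at each number of
# years since opening, assuming (hypothetically) outcome data up to 2015.
pooled = defaultdict(int)
for year, primaries, secondaries in cohorts:
    for outcome_year in range(year, 2016):
        pooled[outcome_year - year] += primaries + secondaries

for years_since in sorted(pooled):
    print(f"{years_since} year(s) after opening: {pooled[years_since]} schools")
```

Pooling turns four separate cohorts of 21–63 schools each into well over a hundred observations at one and two years after opening, at the cost of a slightly more complex model.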

Our quick look at whether there is a ‘free school effect’ at GCSE in the areas they serve

Dave Thomson has opened up the National Pupil Database to see whether he can spot exam results improving for pupils in areas served by free schools. Our sample and method differ from Policy Exchange's:

  • We look at 9 free schools opened in 2011/12, 40 opened in 2012/13 and 61 opened in 2013/14
  • For each national end of Key Stage 4 cohort from 2010 to 2014, we identify the 500 pupils living closest to the site of each of the 110 free schools
  • We compare actual vs predicted results for the most proximate 500 pupils for each free school across a range of indicators
  • We derive predicted results from pupils' KS2 prior attainment and pupil and school characteristics (e.g. ethnicity, free school meals, mobility), to compare how these pupils performed relative to other ‘like’ pupils nationally

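The actual-versus-predicted comparison described in the bullets above can be sketched as follows. This is a deliberately simplified, hypothetical version: it predicts the GCSE outcome from KS2 attainment alone, using a least-squares line fitted on simulated ‘national’ data, whereas the real analysis conditions on many more pupil and school characteristics:

```python
import random

random.seed(0)

def fit_linear(xs, ys):
    """Least-squares intercept and slope for y ~ a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Simulated "national" data: KS2 points loosely predict a GCSE points score.
ks2 = [random.gauss(27, 4) for _ in range(5000)]
gcse = [10 * k + random.gauss(0, 40) for k in ks2]
a, b = fit_linear(ks2, gcse)

# "Local" pupils near a hypothetical free school, generated with the same
# underlying relationship, so the gap should be close to zero.
local_ks2 = [random.gauss(26, 4) for _ in range(500)]
local_gcse = [10 * k + random.gauss(0, 40) for k in local_ks2]
gaps = [y - (a + b * x) for x, y in zip(local_ks2, local_gcse)]
mean_gap = sum(gaps) / len(gaps)
print(f"Mean actual - predicted: {mean_gap:+.1f} points")
```

A positive mean gap would suggest local pupils outperforming similar pupils nationally; in this simulated data the gap is near zero by construction.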
Dave’s data is in the table below. It is hard to see any sort of pattern in the data. These areas do seem to have a general trend of improvement in standards (note many are in London), particularly for the cohort of 2014 openers. However, the trend is slight and appears to have started some time ago – long before the free schools were proposed and opened.


                       2012 Openers               2013 Openers               2014 Openers
Indicator   Year   Actual  Predicted  Diff.   Actual  Predicted  Diff.   Actual  Predicted  Diff.
AC5EM       2010      50%        50%     0%      51%        50%     1%      54%        55%    -1%
AC5EM       2011      53%        52%     0%      52%        51%     0%      57%        58%    -1%
AC5EM       2012      57%        56%     1%      54%        53%     0%      59%        58%     0%
AC5EM       2013      57%        57%     0%      56%        55%     1%      60%        59%     1%
AC5EM       2014      56%        56%     0%      55%        54%     1%      59%        59%     0%
MEANGCSE    2010     4.48       4.53  -0.05     4.46       4.47  -0.01     4.67       4.71  -0.04
MEANGCSE    2011     4.46       4.48  -0.02     4.45       4.46  -0.01     4.69       4.73  -0.03
MEANGCSE    2012     4.65       4.60   0.05     4.53       4.53   0.00     4.73       4.75  -0.02
MEANGCSE    2013     4.56       4.57  -0.01     4.52       4.51   0.01     4.70       4.70  -0.01
MEANGCSE    2014     4.63       4.62   0.01     4.56       4.54   0.02     4.77       4.77   0.00
BEST8       2010    324.9      330.0   -5.1    325.1      325.3   -0.2    330.6      334.5   -3.9
BEST8       2011    337.0      335.3    1.2    331.5      332.4   -1.0    339.8      343.8   -4.1
BEST8       2012    343.5      341.8    1.8    339.5      339.2    0.1    346.0      348.0   -2.1
BEST8       2013    341.7      340.3    1.3    339.8      338.1    1.6    343.6      345.3   -1.8
BEST8       2014    331.8      332.5   -0.7    329.2      328.3    0.9    337.6      338.6   -1.1