The correlation between Key Stage 1 and Key Stage 2 value added


How well do schools’ value added (VA) scores at Key Stage 1 correlate with their value added scores at Key Stage 2?

Do schools with a high score on one tend to achieve a high score on the other?

Or do schools with low VA scores at KS1 tend to achieve high VA scores at KS2?

Or are the two things completely unrelated?

We cracked open the data to find out.

Back to the days of Levels

In layman’s terms, if progress were consistent throughout school, we would expect to see a positive correlation between KS1 and KS2 value added scores.

Meanwhile, if schools with low VA scores at KS1 tend to achieve high VA scores at KS2 – if, for instance, some schools downplayed KS1 – we would see a negative correlation.

Or if KS1 and KS2 progress were completely unrelated we would see no correlation.
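The three scenarios above correspond to the sign of a correlation coefficient. As a rough illustration (not the code behind the analysis in this post), Pearson's r can be computed like this:

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation: +1 means the two sets of scores move
    perfectly in step, -1 means perfectly in opposition, and 0
    means no linear relationship."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    sx, sy = statistics.pstdev(xs), statistics.pstdev(ys)
    n = len(xs)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    return cov / (sx * sy)

# Toy VA scores for five hypothetical schools (fine-grade units):
ks1 = [0.3, -0.1, 0.5, -0.4, 0.2]
ks2 = [0.1, -0.2, 0.4, -0.3, 0.0]
print(round(pearson_r(ks1, ks2), 2))
```

(Recent versions of Python also ship this as `statistics.correlation`.)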

New KS1 and KS2 assessments were introduced in 2016, and all such new assessments tend to take a few years to settle down. So, to begin with, we’ll go back to data from the days of Levels, initially looking at overall value added (reading, writing and maths) from both Foundation Stage to KS1, and from KS1 to KS2.

These indicators come from FFT Aspire and take account of gender and month of birth as well as prior attainment. For now we will leave to one side issues of reliability in both teacher assessments and tests.

The chart below shows KS1 and KS2 value added scores for 2012, for schools with at least 10 pupils at both KS1 and KS2 (i.e. excluding infant and junior schools). The scales are measured in fine grades, so 0.5, for example, denotes half of one Level.

The chart shows a very limited correlation between the two sets of scores. The slight, positive correlation (r=0.15) is just about discernible by eye, but it certainly does not follow that schools with high KS1 VA scores have high KS2 VA scores as well. There are plenty of schools with high KS1 VA scores and low KS2 VA scores (and vice versa).

The correlation of 0.15 is low – but we should not expect a particularly high correlation.

Value added indicators, particularly for small primary schools, are relatively volatile from one year to the next. The correlation between schools’ 2012 and 2013 KS2 VA was 0.6, for example.

This is largely because differences between the majority of schools’ VA scores are just noise. Or, put another way, the VA scores of the majority of schools are pretty much the same, to all intents and purposes.

Rolling forward to 2016

Now let’s roll forward to 2016 and try to repeat the analysis.

In 2016 there were new assessments at KS2, with tests measured in scaled scores in reading; grammar, punctuation and spelling (GPS); and maths. For the purposes of this post, we’ll use a KS2 VA measure based on the average of these three scores, with maths double-weighted.
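As a sketch of that weighting (our reading of the description above, not the official DfE calculation):

```python
def ks2_composite(reading, gps, maths):
    """Average of the three KS2 scaled scores, with maths
    counted twice (double-weighted)."""
    return (reading + gps + 2 * maths) / 4

# A hypothetical pupil scoring 104 in reading, 100 in GPS, 98 in maths:
print(ks2_composite(104, 100, 98))  # → 100.0 (maths pulls it down twice as hard)
```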

At KS1, pupils were assessed in reading, writing and maths at one of three standards, supplemented by a pre-Key Stage standard for those working below the expected standard.

A fairly crude measure of overall attainment at KS1 can be fashioned by using factor analysis. This attempts to construct an underlying scale which correlates with the three KS1 teacher assessments. This measure, which has a mean of zero and a standard deviation of one, can be used to construct an FS-to-KS1 VA measure.
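A rough sketch of the idea in pure Python, using the first principal component of the three (standardised) teacher assessments as a stand-in for a one-factor model. The numeric coding of the KS1 standards and the simulated pupils below are illustrative assumptions, not the actual data or method:

```python
import random
import statistics

def standardise(xs):
    """Rescale to mean 0, standard deviation 1."""
    mu, sd = statistics.mean(xs), statistics.pstdev(xs)
    return [(x - mu) / sd for x in xs]

def first_component(rows):
    """First principal component of the columns of `rows`
    (reading, writing, maths), found by power iteration on the
    correlation matrix; returned with mean 0 and sd 1."""
    cols = [standardise(col) for col in zip(*rows)]
    n = len(rows)
    R = [[sum(a * b for a, b in zip(ci, cj)) / n for cj in cols] for ci in cols]
    w = [1.0] * len(cols)
    for _ in range(200):  # power iteration towards the leading eigenvector
        w = [sum(R[i][j] * w[j] for j in range(len(w))) for i in range(len(w))]
        norm = sum(x * x for x in w) ** 0.5
        w = [x / norm for x in w]
    scores = [sum(wi * c[k] for wi, c in zip(w, cols)) for k in range(n)]
    return standardise(scores)

# Simulated pupils: three subjects coded 1-3, driven by a common ability.
random.seed(1)
pupils = []
for _ in range(500):
    ability = random.gauss(0, 1)
    pupils.append([round(min(3, max(1, 2 + ability + random.gauss(0, 0.5))))
                   for _ in range(3)])

scores = first_component(pupils)
print(round(statistics.mean(scores), 6), round(statistics.pstdev(scores), 6))
```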

Following this approach, the correlation between KS1 and KS2 VA in 2016 was slightly higher than that in 2012, at r=0.21.

Thinking about one cohort

So far we’ve compared VA scores for KS1 and KS2 at two fixed points in time – 2012 and 2016.

The pupils who were assessed at KS2 in 2016 would have been, for the most part, assessed at KS1 in 2012. So we might wonder what the correlation is between KS1 VA in 2012 and KS2 VA in 2016.

This time we see a small but negative correlation (r=-0.16), indicating that schools that had high KS1 VA scores in 2012 tended to have lower KS2 VA scores in 2016.

We can only speculate here about the reasons for this.

Firstly, the problems of using KS1 as both an output measure and as an input measure in VA calculations are well documented.

Secondly, we may well be seeing regression to the mean here. Pupils with high value added at KS1 will tend to have below average value added at KS2.

Finally, a lot can happen in schools in the four years between 2012 and 2016.

An alternative to using KS1 as a baseline for KS2 VA would be to use Foundation Stage (FS) data, as we describe here.

Doing this gives a correlation between 2012 FS-KS1 VA and 2016 FS-KS2 VA that almost vanishes (r=0.05). In other words, value added from FS to KS1 is not a guide to value added from FS to KS2.

So what does all this mean?

As we’ve written before, pupil progress is idiosyncratic.

Consequently, it would be highly unusual to observe all year groups in a primary school achieving consistently high value added scores, even if it could be reliably measured for every year group.

But given that KS1 VA appears unrelated to KS2 VA, it would probably make sense not to read too much into KS1 VA.

Want to stay up-to-date with the latest research from Education Datalab? Sign up to our mailing list to get notifications about new blogposts, or to receive our half-termly newsletter.

By Dave Thomson | August 2nd, 2017 | School accountability, School improvement

About the Author:

Dave Thomson is Chief Statistician at FFT with over fifteen years’ experience working with educational attainment data to raise attainment in local government, higher education and the commercial sector. His current research interests include linking education and workplace datasets to improve estimates of adult attainment and study the impact of education on employment and benefits outcomes.
