Last year the world of educational leadership research was rocked by a study, summarised in two Harvard Business Review articles (here and here), that introduced the idea of ‘Surgeon’ and ‘Architect’ headteachers, among other types.
The findings, if more generally true, would radically reshape the advice we give governing bodies about how to recruit school leaders, because they showed that the subject background of a school leader is strongly related to their leadership style.
“We decided to see if there was a pattern between leadership types and their subject. And it’s not just a pattern, it’s completely clear. Maybe the subject you chose to study is a measure of other factors – how you grew up and what you believe. With teachers there’s usually no external influence from other sectors. So you fall back on the subject. If you’re a PE teacher, for example, it’s about winners and losers.”
Co-author Alex Hill quoted in Schools Week
Given the scale of the recommendations arising from the study, it is important that they are interrogated by other researchers. This isn’t straightforward, because the articles themselves have no published methodology, as pointed out here.
This presents certain difficulties for us in trying to replicate the study’s findings. For example, the most recent HBR article is framed around the turnaround of ‘failing’ schools, although it is not clear whether this relates specifically to those in special measures. Yet the article then goes on to advocate for the appointment of more ‘Architect’ heads across the entire schools system, whether at schools in need of turnaround or not.
As such, our analysis doesn’t seek to replicate the HBR work – but instead asks whether these broader lessons are ones we should really be taking. It does, though, also repeat the analysis with schools rated good or outstanding stripped out, to see whether we’d come to different conclusions if we looked only at schools considered to be underperforming.
We can use the School Workforce Census (SWC) to see whether we can find a general association between school performance and subject background of leadership. We can do this for all secondary schools or for any sub-sample we choose. We identify the school leaders in the 2010 SWC and split the school’s subsequent performance into during and post-tenure periods, calculating the annualised change in GCSE results for each[1].
It is a ‘quick and dirty’ analysis. If we were really interested in answering these research questions we would spend more time carefully crafting the datasets and thinking through each analytical decision we make. We don’t do this. The corresponding advantage is that we are not data mining – trying out every possible specification until we find one we like!
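To make the tenure-split calculation concrete, here is a minimal sketch in Python. The data, column names and function are hypothetical illustrations for a single school, not the actual SWC extract or our code:

```python
import pandas as pd

# Invented results series for one school; "pct_5ac" stands in for the
# percentage of pupils achieving five A*-C GCSEs or equivalents.
results = pd.DataFrame({
    "year":    [2011, 2012, 2013, 2014, 2015],
    "pct_5ac": [48.0, 51.0, 53.0, 55.0, 56.0],
})

def annualised_change(df, start_year, end_year):
    """Annualised percentage-point change in results between two years."""
    start = df.loc[df["year"] == start_year, "pct_5ac"].iloc[0]
    end = df.loc[df["year"] == end_year, "pct_5ac"].iloc[0]
    return (end - start) / (end_year - start_year)

# For a leader in post in 2010 who left after 2013, we would calculate
# one figure for the during-tenure period and one for the post-tenure period.
during = annualised_change(results, 2011, 2013)  # during tenure
post = annualised_change(results, 2013, 2015)    # post tenure
print(during, post)  # → 2.5 1.5
```

Leaders still in post in 2014/15 would simply have their during-tenure window run to 2014/15, with no post-tenure figure.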
On school performance during tenure:
They say: ‘Surgeons’ (mostly PE, RS and biology subject background) improve examination results dramatically in the one or two years they are at a school.
We say: subject background of senior leaders is not an important factor in explaining variation in school performance.
We take the annualised change in GCSE results for each senior leader between 2010/11 and the year when they leave the school, or 2014/15 if they do not leave their post.
We perform an analysis of variance (ANOVA) test to show whether the subject background explains any of the variation in GCSE improvements at the school.
The table below shows the average annualised percentage point change in the proportion of children achieving five A*-C GCSEs or equivalents, both for heads and for all members of senior leadership teams (SLT), with results also given for all schools and just for schools not rated good or outstanding by Ofsted (i.e. excluding Ofsted grades 1 and 2).
There is certainly no evidence that leaders with a PE, RS or biology background produce particularly good GCSE improvements.
In fact, none of the relationships with subject background is statistically significant at the 95% level.
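The ANOVA step described above can be sketched as follows. The subject groups and values here are invented for illustration – they are not our figures – and scipy’s `f_oneway` stands in for whatever statistical package is actually used:

```python
from scipy import stats

# Invented annualised GCSE changes (percentage points), grouped by the
# subject background of the school's leader.
pe      = [1.2, -0.5, 0.8, 2.1, 0.3]
history = [0.9,  1.5, -0.2, 0.6, 1.1]
english = [0.4,  0.0, 1.3, -0.7, 0.9]

# One-way ANOVA: does subject background explain any of the variation?
f_stat, p_value = stats.f_oneway(pe, history, english)

# A p-value above 0.05 means the differences between groups are not
# statistically significant at the 95% level.
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```

With real data there would be one group per subject category and many more schools per group, but the test itself takes the same form.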
On school performance post-tenure:
They say: ‘Architects’ (mostly history, economics, music, physics subject background) have the most positive long-term impact on exam results – on average, 15 to 23% higher than other leaders. The schools of ‘Philosophers’ (mostly English, languages, geography, with no experience outside education) either coast or decline.
We say: subject background of senior leaders is not an important factor in explaining variation in school performance.
This time we take the annualised change in GCSE results for each senior leader from the time they leave the school until 2014/15 and perform an ANOVA test to show whether the subject background explains any of the variation in GCSE improvements at the school.
The table below shows the average annualised percentage point change in the proportion of children achieving five A*-C GCSEs or equivalents – and it shows that subject background does not explain the variation in school performance after a head leaves.
When all members of senior leadership teams are considered, English shows a statistically significant deviation – but we’d expect to see one or two deviations like this even in random data.
On leadership pay:
They say: ‘Surgeons’ (PE, RS and biology) were paid an average of £150,000 a year. ‘Architects’ (history, economics, music, physics) were paid about £86,000 a year.
We say: There are some differences in pay by subject, particularly among senior leadership teams rather than heads specifically. But the pay levels do not match the findings of the HBR study.
Looking at the mean pay of heads with different subject backgrounds, as the chart below shows, differences are not large.
We also can’t reproduce specific claims about pay. Very few secondary heads earn over £150,000 – in 2015, only around 45.
We do find salary differences by subject background among SLT, but they do not correspond with the findings of the HBR articles. Interestingly, they persist even when we control for teachers’ demographic characteristics – we will write this up shortly in a separate post.
On careers outside education:
They say: Some 86% of ‘Architects’ (history, economics, music, physics) had experiences outside education before teaching (normally working there 5+ years).
We say: Some subject groups have spent more of their adult life outside teaching before qualifying, but not the group mentioned above and not in similar magnitudes.
In the SWC we can use each teacher’s qualification date and age to calculate the number of adult years prior to qualification[2].
Most teachers who make it to SLT were qualified by their mid-20s (three-quarters had achieved QTS by their early or mid-twenties). The heads with an economics background did qualify slightly later than others on average, but only by about a year. The same is true of heads with a languages background.
And in any case, there is no evidence that spending time outside education before teaching improves leadership quality. The correlation between years before qualifying and GCSE improvement (either during or after tenure) is essentially zero.
1. A couple of notes on our methodology here. From the published HBR pieces it isn’t clear over what timeframe results have been considered. We work with copies of the School Workforce Census from 2010 to 2015 – the years for which data is available. This is therefore unlikely to overlap perfectly with the period used in the HBR work, but that shouldn’t lead to strikingly different results.
From the School Workforce Census it is possible to identify a subject background for around two-thirds of headteachers. Full details of how this has been done can be found here.
Full details of the sample considered can also be found here.
2. We make the assumption here that teachers do not, in large numbers, gain QTS and then work outside of teaching for several years before returning to the profession.
This is all very perplexing. I have kids at school, so I follow this with interest. I marvelled at the BBC Newsnight coverage and have read the articles in The Times, The Guardian, SchoolsWeek, Forbes, Huffington Post, HBR and elsewhere. To see that certain leader styles impact student grades, and that something can be done to lift a school’s result, has very positive implications for my kids as part of Britain’s rising generation, how they’ll apply themselves in the workplace, and even our economic prosperity in the post-Brexit generation. To my mind, offering a roadmap for how school leaders can lift their game is the true value of what the research revealed.

For TES and EducationDatalab to kick against all this seems very close-minded, and protective of the status quo the research challenged. I wonder who paid for this ‘new study’? Was it run with an agenda in mind? We may never know. I wonder how they replicated the original 7 years of research in just a few weeks to claim they have something anywhere near the same calibre? I wonder why articles like this claim the researchers never published their ‘secret methodology’ when even my 13 year old was able to find the original Oxford academic paper with a simple Google search. It’s here by the way: http://eureka.sbs.ox.ac.uk/6147/1/2016-13%20(2).pdf –and the detail is significant. No secrets at all. It’s lazy journalism, or an obvious bias, to claim otherwise. If EducationDatalab, TES and others are going to throw stones, best make sure it’s not in glass houses.

Meanwhile, what is being done to drive the original research forward, get on the same page and put its findings into use? Wouldn’t we all like to see schools improve? Those with our students’ best interests at heart will do so. Those with another goal won’t, as this article shows. What a shabby disappointment.
Thank you for taking the time to leave a comment. I am sorry to hear that our little piece of administrative data analysis angers you so much.
The research you link to in your comment is an eight-school case study. The research we are interested in is their study of 411 school leaders. Can you provide me with a link to the methodology for that part of their research, please?
Who paid for the new study? Nobody. Education Datalab is part of the non-profit FFT Education Ltd, which has been supporting schools in using data for over 15 years. When we have some spare time, we use it to do research like this. Nobody asked us to do it.
Was it run with an agenda in mind? Certainly. We aim to help policy makers and teachers improve schools.
I wonder how they replicated the original 7 years of research in just a few weeks to claim they have something anywhere near the same calibre? Simple. We used data collected by other people (i.e. the government) rather than collecting the data from schools ourselves. That makes it very simple and quick to see whether the correlations found in the original study can be found more generally in other datasets. Is our work as robust as the original research? Who knows – nobody can judge the calibre of the original work, because it has not been published in a peer-reviewed journal.
I’ve read the PDF on how you identify and group subjects. Please would you tell me which table in the SWC data you used to identify the specialisms of headteachers for 2014 & 2015? I’ve looked at the tables provided on their website and can only find teacher specialisms.
Hi Becky (and team) – thanks for this really helpful analysis.
I was looking in what I take to be the underlying school workforce census data (accessed here https://www.gov.uk/government/statistics/school-workforce-in-england-november-2010-provisional – the zip files claim to have the underlying data) and I couldn’t see where in the SWC you would have found leaders’ subject backgrounds, or indeed anything on the characteristics of individual school leaders, in order to do this work. Am I missing something (quite probably!), or does FFT have access to additional non-published SWC data that includes this level of detail?
Thanks again, Ian