Friday 5 September 2014

The astonishing lack of an evidence base to support education policy

Maybe I am late to the party on this, or maybe it is because I have been out of education for a while that I can look at the situation with fresh eyes. In my last post I commented on the lack of evidence presented to support the government's line on setting by ability. However it turns out that this is far from the only area in which the government seems utterly uninterested in evidence when it comes to education.

To illustrate the point: it is not at all clear where one should even go to find the evidence the government might be using to inform its policy-making decisions. Ofsted apparently conducts no research other than its 'surveys' of groups of schools, and its dataset seems to consist largely of its own inspection reports. This is useful in its way, but hardly a rigorous or objective means of cross-checking the assumptions that no doubt underlie said inspection reports. There is in fact an enormous wealth of objective data available: Ofsted has access to a huge amount of achievement, attainment and contextual data for every school it inspects, yet there is frustratingly little real analysis of any of it in Ofsted's reports or surveys.

I did eventually track down the "Research and Statistics" page on the DfE website. There are 143 publications about schools by the DfE, which sounds promising. However, the very large majority are evaluations of specific trials, or papers about research priorities for the future. In fact I could find fewer than half a dozen that bore in any way on the massive, sweeping changes the government has been carrying out. "The evolving education in England" seems a promising title, but the paper makes clear from the start that it contains no analysis of performance and is simply a "temperature check". Not the sort of evidence I was looking for.

Aha! But there is also "Attainment in Academies at Key Stage 4". Finally, some data to support the government's policy of near-universal academisation. Except that it doesn't, of course. The analysis in fact shows that outcomes at academies are broadly the same as those in "similar schools", despite the significant boost to funding that academies received. There are ups and downs: academies, for instance, do less well by students in receipt of Free School Meals, but are improving faster.

OK, so what about "Do Academies make use of their autonomy"? Even more inadequate as evidence to support policy, I'm afraid. It is full of emphatic statements about how marvellous it is for a school to become an academy, but remarkably devoid of evidence to support those claims. Here is an example of a key finding:
"This [autonomy as regards curriculum] is helping them raise standards for their pupils"
and the evidence provided?
"- Two thirds believe these changes have improved attainment"
Note that it says "believe", not "can demonstrate" or even "have shown".

So what about Free Schools, that other enormous experiment? Well, a search on the DfE's site returns 850 documents, plus a further 5 from Ofsted. I have to confess that I gave up after the first three pages of results, but I could see nothing that contained any sort of research or data analysis of their effectiveness. Nothing at all.

So what about PISA, then? This is the one piece of objective data to which Gove appears to have paid any attention, repeatedly using its findings to rubbish the entire UK education system he inherited. I have to confess that until today I simply had not looked at the PISA test process or data. In fact the OECD's 2012 report makes for interesting, and on the face of it puzzling, reading. The focus in that report is on maths, and whilst "Pupils in England showed greater motivation to learn mathematics than the OECD average and reported a high sense of belonging and satisfaction with school", on the other hand "In mathematics, 19 countries significantly outperformed England".

Why is that, then? Why are children who enjoy and are confident in maths apparently not learning it as well as those in other countries? Perhaps looking at the tests themselves might give a clue. Well, yes. Here are some sample questions: http://www.oecd.org/pisa/test/form/ . They are like the worst sort of maths textbooks from when I started teaching. Textually dense and full of utterly irrelevant details and imagery, they wrap up simple mathematical questions in nonsensical "real world problems", so that the student first has to decode the "real world problem" into an actual mathematical problem before answering it. That decoding is a skill of no relevance outside maths lessons, and the UK education system has quite rightly moved on from the approach, so students here are less practised at it than they might be in countries with more old-fashioned methods.

Which is perhaps why students from countries with much more formal education systems do better in PISA tests. It doesn't really matter if you do not have as high a "sense of belonging and satisfaction", or even if you "enjoy the study of mathematics". If you have spent your entire school life endlessly practising pointless and artificial maths "problems" like those in the PISA tests then eventually you will get quite good at it.

But is that seriously the only evidence Gove had for tearing down decades of good practice, action research and pedagogical innovation in his time as Secretary of State? It was perhaps naive of me to think that the current regime retained some sort of commitment to evidence-based policy making. Clearly it has not.

