Building the Evidence

Evidence-based policy has become something of a catchphrase recently. My own MP, the scientist and my former departmental colleague Julian Huppert, used the phrase in his maiden speech, pledging to support it, though other MPs may be less persuaded of its importance. The phrase also sits at the heart of Mark Henderson’s book The Geek Manifesto, so it is perhaps not surprising to find Mark, in his role as Head of Communications at the Wellcome Trust, being instrumental in setting up a dinner-table discussion on ‘Evidence in Education’ at an event hosted by Mark Walport this week. (As it happens, under the Geek Manifesto Pledge, in which 325 of us pledged to send a copy to an MP of our choice, I chose to send mine to Michael Gove. I did not get a response.)

I actually feel that the chapter on education in The Geek Manifesto is rather light in comparison with some of the other chapters. Perhaps at the time of writing Mark hadn’t spent as much time thinking about the issues around schooling as he had about some of the other topics; or perhaps, because of my own involvement with the education agenda, I simply know more about these issues than about those covered in some of the other chapters. I don’t usually blog about education matters since, as Chair of the Royal Society’s Education Committee, I don’t believe it is appropriate for me to comment on many of the current hot topics surrounding the curriculum and the proposed changes to both GCSEs and A levels emanating from the Department for Education. But Mark encouraged those of us at the dinner to write about the event, albeit under the Chatham House Rule (so I won’t be telling you who said what), and I’m following up on that encouragement. However, it will hardly be surprising if I tell you that there was a consensus that there is a crying need for good evidence and well-posed research in education, yet relatively little is being carried out.

It being a Wellcome event, many comparisons were drawn with the situation in (clinical) medicine. There, evidence is typically garnered through Randomised Controlled Trials (RCTs). These may have at their heart the reality that medicine is far from an exact science, but they are a crucial way in which our knowledge increases. Could the same approach be used routinely to uncover what actually ‘works’ when it comes to teaching? One example of such a trial, which Wellcome has recently initiated in conjunction with the National Science Learning Centre, explores how effective continuing professional development can be for primary school teachers. This project is specifically looking at the effectiveness of a new 24-day programme aimed at helping teachers without science degrees gain confidence in teaching science in primary schools. This is an important matter, given the shortage of primary teachers with any sort of formal scientific training beyond GCSE.

One theme that came up time and time again over the dinner table was the very different nature of the teaching and medical professions. The latter has a well-defined infrastructure involving professional bodies such as the BMA, professional journals such as the BMJ, and funders beyond government, including the Wellcome Trust; the equivalents for the teaching profession don’t really exist, and the career structures are different too. Is this infrastructure necessary to facilitate a research culture? What would it take to create a culture within the teaching community that rewarded practitioners for moving from full-time classroom teaching into research and back again, and would that solve the current lack of research evidence? The phrase ‘researcher-practitioner’ was bandied about. However, we also need to ask whether our teachers, by and large, have the confidence and ability in research methods and statistical analysis to make this viable.

More politically, one can ask whether, as a nation, we could move away from diktats from government which, correctly or not, are seen as ideological and not obviously based on evidence. Again the contrast with medicine is very clear: no government official would try to lay down what constitutes ‘best practice’ in the surgery or the operating theatre, yet by implication such guidance is often offered by the DfE when practices such as the literacy hour or the teaching of phonics are introduced. The question was raised as to whether the government is simply stepping into a void which, in medicine, is filled by the BMA.

The discussion came back repeatedly to this theme of the (lack of) research infrastructure in education, but the specifics of which questions need to be answered were less readily teased out. From my perspective I found the conversation a little frustrating, as it was rarely STEM-specific, more often concentrating on topics such as struggling readers (as in the examples above), where there is already ample evidence of what works and what doesn’t. I would have welcomed a more focussed look at the problems relevant to our rather poor standards in numeracy across the board, or a consideration of what science teaching is effective in maintaining a child’s innate curiosity about the world around them as they move from primary to secondary school. Here we do have some data on trends and it tends to show this is a critical stage at which children lose interest in science despite enthusiasm at younger ages.

Wellcome will be continuing to explore these themes. They are clearly close to Mark Walport’s heart, and I would expect his interest in securing sufficient numbers in the scientific pipeline to continue as he takes up his new role as the Government’s Chief Scientific Adviser in the next few weeks. If, in that role, he can facilitate better evidence-gathering and a clearer appreciation among politicians across the political spectrum of why this matters in our classrooms as much as in our hospitals, then maybe in due course we will see the development of a researcher community and a higher value placed on research involving the large numbers necessary to give confidence in the outcomes (i.e. not merely single schools, which will always have their own particular features and cultures that may make findings impossible to replicate elsewhere). That would be an encouraging step forward.


10 Responses to Building the Evidence

  1. If I recall correctly, Mark pointed out the lack of teachers trained to degree level in science and engineering, especially those with postgraduate qualifications, despite significant financial incentives over the past 10 or so years for STEM graduates to enter teaching. I don’t know how much this has been explored in terms of educational research.

    Not sure how the teaching profession, which already perceives itself to be weighed down by paperwork, would take to the idea of becoming “research-practitioners.” I believe that in medicine, junior doctors have to do audits of clinical practice as part of their training, so perhaps a similar exercise for teachers could be included during the PGCE/NQT period.

  2. Wasn’t Ben Goldacre at the dinner? The Cabinet Office paper by him and others set the agenda very well. Recently he and I visited UCL’s head of medical education and, as a result, some of their proposals will now be tested in proper RCTs rather than being imposed on staff at the whim of sociologists.

    Goldacre has also pointed out, very cogently I think, that RCTs are avoided by many education ‘experts’ not only because they take more work, but also because quite often they show that ideas which seemed good on paper just don’t work. And that’s exactly why they are so essential.

  3. Thanks for the very interesting blog post. As chair of governors of a primary school and a STEM academic, I find all of this of central interest. Do you have a reference for your comment:

    Here we do have some data on trends and it tends to show this is a critical stage at which children lose interest in science despite enthusiasm at younger ages.

    I would be fascinated to read more about these data.

  4. Pingback: The Evidence for What Works in education | Wellcome Trust Blog

  5. Laurence Cox says:

    Also see this article from The Independent
    http://www.independent.co.uk/news/uk/politics/government-to-try-crowd-sourcing-key-policies-to-see-what-works-8518515.html

    “It is, one might think, what a good Government would be doing anyway. But after years of policy disasters and U-turns for ministers of all parties, the Coalition is to establish the first network of bodies to examine which government initiatives actually work – and which do not. Based on the model of the National Institute for Health and Clinical Excellence (Nice), which provides evidence-based guidelines on all aspect of healthcare to the NHS, the new institutions will extend the concept to four other areas of policy.”

    This will presumably be extended to education, although it’s not one of the first four areas; the nearest is early intervention in childhood.

  6. I was recently at a conference on education where teachers from St Mark’s Academy http://www.stmarksacademy.com/ spoke very positively about being research practitioners. This was clearly helped by a dynamic and supportive head. Leadership is important here.

  7. Margaret O’Hara says:

    Do teachers, by and large, have the necessary confidence and ability in research methods and statistics? I think the short answer is no. This is from my experience of ten years as a secondary physics teacher, then retraining and now working as a research physicist in medical physics. If you had asked me when I was teaching, I would have had the confidence, and I would have said, sure, I know about statistics. In fact, I would have had the ability to do the research, but no way the stats: I didn’t know then how much more I would have to learn to do my current research. The stats I did in my physics degree were geared towards error analysis rather than hypothesis testing. Being a physicist, I didn’t mind learning more statistics, and, with training, most science and maths teachers would be fine. By and large, though, everyone else will run a mile. If teachers are going to do research, they will need LOTS of statistical support. And research methods teaching in PGCE courses will need serious beefing up; mine was pretty paltry.

  8. Dr Hilary Leevers, Head of Education and Learning at the Wellcome Trust, has written up some of the background to this event and what is happening in the fast-changing landscape of evidence in education as the Government announces a new What Works Network.

  9. Athene – thanks for this helpful blog, particularly good for those who weren’t there. Dr Hilary Leevers’ background briefing is also a useful ‘scene-setter’.

    But it’s worth linking up with a number of other activities: the Coalition for Evidence-Based Education (http://www.cebenetwork.org/) and the piloted Education Media Centre (http://www.cebenetwork.org/projects/education-media-centre-%E2%80%93-enhancing-use-evidence-media). I think the Centre could actually be a great knowledge mobiliser for policymakers, teachers, governors and the public – not just journalists. They are key players in this debate. It’s vital that STEM and social science link up in this area, as there is much we can learn from each other. Teachers should also learn from practitioners in other fields, such as policing and social care, not just medicine: http://www.alliance4usefulevidence.org/event/fdsa/. Many of the issues cut across a variety of professional domains.

    The DfE have recently invited Ben Goldacre and others to review their own approach to educational research and evaluation, to be announced next week. It will be fascinating to see how civil servants in Whitehall – not just schools and local decision-makers – grapple with good evidence.