An Education in Education (and Policy)

For the last three and a half years I have been chairing the Royal Society’s Education Committee. Under Secretary of State Michael Gove, education in England has been going through a tumultuous time (other parts of the UK have been going through their own periods of turmoil). Consultation after consultation has poured out of the Department for Education (DfE), and we have tried to respond to each one. Changes to the curriculum and to assessment have been rushed through, and the consequences of all this will take years to ascertain. However, I think it would be true to say that the education community collectively is far from happy that its views have been adequately listened to, or that the responses to the myriad consultations have made any significant impact on the policy as finally implemented.

The Royal Society is one member of the group collectively known as SCORE, the Science Community Representing Education. If you look at the SCORE web pages you will be able to find the responses that have been jointly submitted to DfE on everything from practical work to curriculum content. Most of them express grave reservations about one part or another of the new proposals. Relevant to a recent post of mine on experimental work, one of the more recent concerns was tied to the removal of practical assessment from the overall grade in science A levels. SCORE believes this risks schools spending less time than is needed on teaching practical skills, which are crucial both to excite students and to enable them to acquire the techniques and know-how they will need if they intend to pursue science further.

One may hope that teachers and schools will be heading for a period of relative tranquillity as all the changes Gove has rushed through bed down, but first they have to acclimatise themselves to a new landscape of curricula and assessment changing simultaneously right through the school system. One thing that seems to be universally deplored (at least outside DfE) is the conflation of assessing the child with assessing the school. As long as league tables are essentially based on exam results, there will be a tendency, one might almost say a necessity for a school’s survival, to work at getting those children near any particular critical assessment boundary ‘up’, at the expense of the weaker and stronger pupils who sit outside this critical zone. As long as it is only absolute grades (whether denoted by letters or numbers) that count, rather than a value-added contribution or the production of well-educated, well-rounded students, we will see teachers having little choice but to teach to the test.

The Royal Society will be launching its major new report ‘Vision for Science and Mathematics Education’ on June 26th, representing several years of intense work, evidence-gathering and consultations. I have been heavily involved in the committee which has been preparing the report – under the chairmanship of Martin Taylor – and it has been a fascinating experience. I sincerely hope it will lead to a more interesting dialogue with policy-makers about what this country needs if it is to produce both well-educated and scientifically-confident citizens as well as the scientists of tomorrow. But I will leave detailed discussion of the report until after its launch.

Both through the preparation of this report and more generally as Education Committee Chair I have had opportunities for interactions with policy-makers that had not come my way before. I wish they had. It has certainly been enlightening, though the experiences have not always been comfortable and occasionally positively surreal. What motivates ministers and civil servants can be very different from what a scientist would see as the ‘obvious’ driver and there is little point in pretending it is or could be otherwise. But unless you see this in action it is hard to appreciate how or why that should be.

To give one specific example, let me cite the recently launched DfE initiative led, with the best of intentions, by the education minister Elizabeth Truss. This is the initiative that goes by the name of YourLife, launched last month in an attempt to get more girls to stick with maths and physics at A level and to consider careers in non-traditional fields, notably engineering. How could that not be close to my heart? Its target age group is 14-16 year olds. When a group of us challenged why this age group was selected, given that career choices are often made, consciously or not, at the younger age of 11-12 (something the research for the Vision report threw up, and so fresh in my mind), we were told that this was because they could then derive some easy metrics as to whether the campaign was having any effect. So, for the policy-maker, quantifiable metrics and targets trump optimum outcomes in a very explicit way.

At one level, the purist scientist level, this is ridiculous, but it makes perfect sense for the politician. That is why I believe we should be exposing many more of our talented young researchers to the world of policy through internships (such as at POST and GO-Science) and other more fleeting opportunities. It is why dialogues such as those that took place at the recent Circling the Square conference are important, and why I’m excited to be involved with Cambridge University’s student-run Science Policy Exchange. It is too easy for scientists only to look at the ‘facts’ as they see them, and not at the broader implications for those who have to implement them and sell them to their political masters or constituents.

I am quite sure my three and a half years leading on the Royal Society’s education brief have improved my own education in ways I absolutely did not anticipate when initially invited to take on the role. I have, of course, learned much about schools, assessments, curricula and inspections. Those are the hard-core facts I needed to be familiar with. But I have also learned, at least I hope I have, how to be more persuasive with those whose motivations are different, how to interpret nuanced civil-servant-speak, and when to be blunt and insistent. I am quite sure on many occasions I have failed to hide impatience at what may seem unnecessary circumlocutions or evasiveness, but in other ways I have learned that going in feet first is often a stupid tactic, however attractive it may seem at first sight. I have a long way to go in being a political master and, up to a point, I don’t want to be one, but the role has been challenging in a new way and immensely interesting.

You haven’t heard much about it on this blog because, as long as I could be thought to be the voice of the Royal Society on this front, it would have been inappropriate to disclose much. You may hear more about it in the future now that I have stepped down. But what I can say is that it has absolutely reinforced my belief that seizing opportunities is rarely a foolish thing to do, even if what is being offered may not feel an immediately natural fit with what one has done before. I came to this role not as an education expert but (I believe) regarded as a safe pair of hands to take the lead, and as someone who wouldn’t, as it was put to me, ‘go native’: someone who could safely look after a major committee. I hope I have delivered against that. I am intensely grateful to those who had confidence in me and gave me, accidentally as it were, a new set of skills to take on to my next role.

This entry was posted in Education, Science Policy.

2 Responses to An Education in Education (and Policy)

  1. Jo Rhodes says:

    I agree with your thoughts on exposing schoolchildren to policy makers at POST and the like; it’s a fantastic place and would really show a different avenue of science. I also agree that the initiative ‘YourLife’ should be aimed at 11-12 year olds. Surely aiming at 14-16 year olds is too late, when those students would have already made their GCSE choices?

  2. Mark Field says:

    Regarding teaching to the test versus well-rounded students, it was obvious during my own undergraduate degree that students at some Cambridge colleges did much better in exams than those at others, a difference that was not seen in practical assessments or when comparing answers to question sheets. My own view, bolstered by later experience as a tutor, was that part of this was that the tutorials in the more successful colleges taught to the test.

    I’m not sure I agree entirely with your implied assertion that ‘purist’ science does not need metrics. I get your point that progress in pure science cannot be planned (‘we’ve scheduled a breakthrough for next Wednesday …’), but the sort of metrics the politicians are after are mainly about process, protocol and a plan of action, which can and should be planned out. You can always change the plan as new data or understanding arrives, but there needs to be a plan to begin with. Just saying we are very bright people who will work it out is a recipe for wasting time, the one thing that is really in short supply. I always advise scientists who ask that they should be able to answer the Heilmeier questions* before launching into any project. This is really important in grant writing.

    * For the uninitiated, George Heilmeier was a US research scientist who did pioneering work in liquid crystal displays. During the 1970s he was director of ARPA (the predecessor of the current DARPA) and later of Bellcore. He developed a list of questions that every research proposal should be able to answer, known as the Heilmeier catechism. They are certainly aimed at applied research, but are very relevant to pure science; I find them useful for pointing out where I am applying wishful thinking rather than a considered plan of action.
    For those interested the questions are here:
    http://www.design.caltech.edu/erik/Misc/Heilmeier_Questions.html