When discussing the skills students pick up – and need to pick up – during their undergraduate courses in a subject like physics, I always highlight that they learn how to be critical, notably about the assumptions underpinning an analysis. Which assumptions are the right ones, and why? When might they not apply? And if the underlying suppositions break down, what else can be done to solve the problem? To keep with the physics theme, this might be when the numbers aren’t large enough for a (simple) statistical approach to be valid, though the point has far wider application. It is of great importance to psychology, for instance, as Dorothy Bishop often points out, and it has been brought right home to me by the book by Michael Lewis I’m currently reading, The Undoing Project (a book I am finding very enjoyable and thought-provoking). Or perhaps, to return to physics, it is when a surface should no longer be treated as smooth or flat, or when a problem can’t be solved by appealing to a convenient symmetry. Students, one hopes, get to grips with these sorts of assumptions during their course. I know it was a skill I personally found particularly tricky when I started my undergraduate course: why was one assumption more plausible than another when deciding which terms could be neglected? This struck the inexperienced me as very hard to grasp initially, but slowly I absorbed the appropriate logic and tools.
At graduate level this need to be critical becomes even more crucial. I remember the PhD student who came to me absolutely aghast because they thought they had found a mistake in a published paper. They simply hadn’t realised that just because something is in print does not mean it is infallibly true. To some, it is a shocking revelation when it becomes clear that some papers are just plain wrong. We must ensure our students have the confidence to identify the flaws in a piece of work, no matter who wrote it or how long ago. It is all too easy not to spot an error. Sometimes it is no more than trivial and can be ignored. In this trivial category I recall a group of electron micrographs I published in a paper: images of the same area taken under different imaging conditions. One of the group of three was rotated by 180° relative to the other two. It was pretty obvious when it was pointed out to me, since the features were clearly still present, merely upside-down, but it was mortifying to realise that I hadn’t spotted it myself.
Others will be much more subtle. Errors such as the one I was challenged to identify in a draft manuscript by my new professor, Ed Kramer, when I first stumbled into the field of polymers (I did find it, to my great pleasure). In that case it was precisely a matter of an assumption he had made which was, on reflection, false, invalidating everything that came thereafter. These are the sorts of errors I would like to think we train our students to find: to look at an argument and try to poke holes in it. If they fail to make a hole, then perhaps the theory is OK. But if they don’t look, then they are not testing the ground before they build some massive edifice on top. Their employers, in academia or outside, will not be happy if this is how they set about the tasks they are given. The example of the bridge-building engineer who makes mistakes in their analysis is often cited: an engineer who constructs something which falls down – or resonates horribly. Think of the first version of the Millennium Bridge over the Thames, whose design could not prevent the lateral sway exacerbated by pedestrians apparently unconsciously walking in step with it.
However, what has prompted my recent anxiety on the score of what we are teaching our students is the annual mountain of exam scripts I have just been marking. Our students are as bright and capable as ever. They learn the facts we present them with and can (mostly) regurgitate them in order. There will always be the odd – and worrying – student who manages to write pages and pages, often fairly illegibly but certainly totally irrelevantly, and gains not a single mark, but these are the exceptions. Unlike the challenges Mary Beard (now Dame Mary, of course) faces when she marks essays, as described in a recent blogpost of hers, in physics there are by and large clear facts that need to be set down, plus diagrams to illustrate the behaviour of functions and parameters. Answers tend to be right or wrong, with not much greyness in between. This makes marking physics papers an awful lot easier than in many other subjects. We do double mark, but not as slavishly as seems to be required in subjects where subjectivity is more liable to creep in. One important role of the second marker is to check the addition of marks – you’d be surprised how often 2+2 can make an unexpected number by the end of a long stint of decoding handwriting. Second markers also need to check that marks are correctly entered in the markbooks and that an isolated page has not been overlooked by the first marker (again, a common problem when students choose to add something late in the day in some random place in their set of answers). All of these are crucial parts of our second-marking system.
My concern relates to situations in which we ask students to do something that requires different aspects of a course to be synthesised, to give insight a lecturer hasn’t explicitly spelled out, or to extend their ideas into unfamiliar territory. These are the skills I’d like to think we had taught them, but they seem to vanish – at least under the stress of exam conditions. It is not enough that physics students can manipulate equations with confidence, important though that is, or crank through the solution to a complicated partial differential equation with equally complicated boundary conditions. We need them to be able to think about assumptions and extend their knowledge into less familiar territory. I worry that at the moment this isn’t always the case.
Perhaps it is for these reasons that the CBI so often says graduates don’t leave universities with the skills employers want. We certainly don’t test oral communication skills in our written exams! Physicists get little opportunity for team working in our curriculum (unlike engineers), and we certainly don’t test that either. There are those who say exams are not a good way of identifying future leaders and stars in their professions, whatever those may be. What exams test is perhaps only part of what we need to test. I do my bit of marking; exams may well always be with us. But I am anxious that the gap between what we teach, what we assess and what students and employers require may be widening. This is absolutely not a criticism of the students, and perhaps the problem is physics-specific (though I doubt it). I have no easy answers. But I do hope someone has.
Dear Athene,
Re your last paragraph. In the University of Manchester School of Chemistry we introduced a team-working third-year research project and assessed it. I found it an excellent and enjoyable addition, and one that was much needed, as you emphasise – not least by industry. The student seminars on a well-known topic in our second year, and the short talks students give on their final-year projects, also go a good way towards developing undergraduates’ oral skills.
Just our two pennies’ worth,
Best wishes,
John
In my second-year physics exams (at Cambridge) I spent a large portion of the electromagnetism paper weeping and chewing my own finger. Yes, chewing. I did enough damage to the nerves that I lost sensation in the finger for many months. I wrote more or less everything I could remember, relevant or not, and still had spare time at the end. Suffice it to say, my marks were less than stellar.
In my final year, I chose every practical and non-exam-based option I could. I achieved brilliant marks in all of them and appalling marks in my exams, ending up with a 2:1, plus a prize for my practical work.
I’ve spent the past twenty years working as a physicist in industry, designing and building scientific instrumentation, and I’m now a Fellow of the Institute of Physics. I have never faced the peculiar circumstances of a written exam again in my career, yet I still have nightmares about those university exams.
I don’t have an answer to the problem of assessing students, but I can categorically state that exams are not a certain way of determining who will or won’t thrive in the wider world of work!
As a further aside, when interviewing to find a suitable candidate to join me in my endeavours, I have not found that high grades from reputable universities necessarily correlate with independence of thought or the ability to apply “book learning” to unfamiliar problems.