Having committed to sit on an EPSRC panel for the first time in many years, I had my sights set on a post about the Shared Services Centre of the Research Councils. Clearly this is a popular topic for academics to target, since I was pipped to the post: just a few days ago Telescoper vented his bile on them for their inefficiencies over his P60 forms at the end of the tax year. I have a different gripe to make. As long-term readers will know, for some time I chaired what was uninspiringly known as Committee C of the BBSRC. I stepped down last summer after a reorganisation of the committee’s remit took it into fields about which I knew nothing – on the whole I believe a little relevant expertise is desirable in a chair, although not everyone may agree. Of late this has meant a diminution of posts on this blog about committee work and impact.
I thought that BBSRC experience meant I had a fair idea of what to expect from the SSC, having seen BBSRC grants move from the Council’s own internal processes to this central facility. The transition had not been without hiccoughs, and the ‘paperwork’ we received for the first meeting after the switch was in a much less satisfactory state than what we had been sent for previous meetings, but we did manage to find our way around it. The ‘paperwork’ – both before and after the transition – was in fact on a carefully indexed USB stick, so the weight of material we had to take to the meeting was correspondingly light. The later problems came from the rather circuitous way things like referees’ reports were woven into the structure, making the whole thing harder, but not impossible, to navigate.
I cannot help comparing this with what was sent to me by the SSC for the EPSRC panel. In this case the paperwork came with no inverted commas around it; it was literally a mound of papers, loosely tied together with those awful green ties reminiscent of exam scripts. Furthermore, the information about procedures, scoring guidelines, details of where I was staying, the codes for referees and so on was enclosed in a loose bundle which, by the time I’d dropped the pack on the floor a few times, was hopelessly muddled up. Why? Why do I have to strain my back lugging this stuff around? The wodge of papers was so thick that it destroyed the spring on the carrier at the back of my bicycle (should I add the cost of a new one to my expenses claim?). Why are they determined to destroy my new year’s resolution of struggling towards a paperless life? I am baffled that the SSC – let me remind you it stands for Shared Services Centre – actually operates completely different systems for different research councils.
I cannot believe this leads to ‘efficiency savings’, as no doubt ministers were promised in some delivery plan. As has been pointed out, like many other initiatives designed to save money, it has in fact done anything but. When I pressed the Secretariat, it transpired I could have asked for the material to be sent on a CD, but as I don’t have an internal CD drive on my laptop I’m not sure that would really have helped. I certainly wasn’t asked in advance which format I preferred, which would perhaps have been the ideal solution. Perhaps the majority of EPSRC-relevant people really do still want ‘paperwork’ in its genuinely paper incarnation, but I find that slightly surprising. I did manage to get the forms on which to write my comments sent electronically, so that I could type up my responses – but again that required a special request (quickly acceded to, I should add).
There are other things that have been revealed to me by sitting on an EPSRC panel again after a long gap. It shows once again how tribal we are, how different (sub-)disciplines hold variable views about ‘normal’ behaviour, and how much this may colour our funding landscape. The EPSRC wish to ‘shape capabilities’ but are also keen to stress (as stated in a letter from their CEO David Delpy and Chairman John Armitt to Nature) that
‘Research excellence remains pre-eminent and the Council will continue to support applications that are deemed excellent by peer review.’
Hence, those disciplines which are more collegial – or perhaps less critical – will continue to thrive over those where internecine warfare between camps or simply a more judgemental attitude to filling in referee forms and scores holds sway.
This is a worry. Without wishing to single out any particular area (apart from anything else the sample size is too small to do that safely, whatever my suspicions may be), I note that a number of years ago, when I sat on the now defunct EPSRC Physics SAT (Strategic Advisory Team), sufficient unease was manifest about scoring differentials between communities that the EPSRC carried out its own internal audit. This confirmed concerns that referees from one particular sub-discipline were inclined to give the top score willy-nilly, whereas other fields were more severe in their scoring and inclined to use the full range available. That particular sub-field was rapped over the knuckles and, I presume, cleaned up its act at the time. I can’t help wondering whether similar differences in behaviour are still prevalent. In times of cash shortages and sculpting of the portfolio, this means that without some norm-referencing of scores, coupled with scrutiny of the tone of comments, those areas where referees are tough – in word or number – will be at an even more severe disadvantage.
The trouble is that unless this matter is regularly revisited to check for discrepancies, it has all the potential to become a self-reinforcing scenario. Remember that EPSRC rules (unlike BBSRC’s) mean that the panel can only evaluate grants based on the referees’ comments and scores. Panel members are not allowed to put in their own views where these differ from what may be written on the referee forms. Again, this is different from BBSRC practice, where the panel may choose to feed in its own specialist knowledge (although an EPSRC panel can choose essentially to reject a referee’s submission if there seems some clear reason why it should – for instance, evidence that the referee hasn’t read the proposal carefully enough, or a conflict of interest which hadn’t been initially spotted). What worries me is the tendency for referees to award the highest score, not always backed up by the tone of their comments. I wish the community would recognize how unhelpful this is.
The weird thing is that, to revert to my previous experience with BBSRC, I never saw this behaviour there. Comments across the board tended to be more critical, and more differentiated in tone. (My gripe with BBSRC refereeing would be different: that community seems to be in thrall to the need for a hypothesis or two to act as the focus for a proposal. No hypothesis generally means no funding, although it is a hopelessly crude discriminator in my view.) I am not sure whether the differences are endemic to physicists, or reflect a subconscious feeling among EPSRC referees that they should ‘judge not, that ye be not judged’. Either way, I think this is a potentially damaging situation that should be carefully monitored. Maybe when refereeing I too should join the club of simply awarding the top score, along with bland comments that provide the panel with little assistance.
The EPSRC obviously tries to overcome disciplinary differences by having a group of ‘rovers’ who visit all the panels taking place on a given day, with the aim of tensioning between disciplines so that rank-ordered lists can be appropriately interwoven. But I don’t believe that removes the problem of some areas having a tradition of making detailed critical comments, while others tend to say – even if at some length – something that amounts to ‘the team is good, the science is interesting, this should be funded’.
I’d be interested to know if readers who have served on panels of different sorts and in different parts of the Research Council family share similar anxieties.