The post below was written as a comment on Lizzie Gadd’s recent post explaining in some detail Loughborough University’s decision to base their approach to research assessment more on the Leiden Manifesto than DORA, the Declaration on Research Assessment. So you should read that first! (The comment is currently ‘in moderation’ because, like myself, Lizzie is on holiday. I suspect she is more disciplined than I am at not looking at her email whilst on holiday. I’ll update this post once the comment is approved.)
Update (18-07-18, 15:30) – the comment has now been approved. I suggest any further discussion takes place beneath Lizzie’s original post.
Even though I am currently chair of the DORA steering committee, I don’t want to get into ‘theological’ arguments about the differences between DORA and the Leiden Manifesto because they are both forces for good! Moreover, I am sure that Lizzie agrees with me that ultimately it is the development of good research assessment practices that matters, and I again applaud the work that she has been doing on that front at Loughborough.
Nevertheless, I want to argue for a more expansive interpretation of what DORA means (as a declaration and an organisation) than is presented here.
It is perfectly true that DORA was born in 2012 but it would not be correct to suppose that the declaration is any more fixed in time than the Leiden Manifesto. Although the first and most prominent recommendation of the declaration is stated negatively (“Do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions.”), the remaining 17 recommendations are almost invariably positive, encouraging adherents to think about and create good practice. It does not limit how they should do that. The declaration is not very long so I would encourage everyone to read it in full.
Nor should it be supposed that DORA’s relevance is confined to the sciences; it has always aimed to be “a worldwide initiative covering all scholarly disciplines”. Admittedly, the work to extend that coverage has been lacking, but as the recently published roadmap makes clear, we have now placed a particular emphasis on extending DORA’s disciplinary and geographical reach. We are in the process of assembling an international advisory board from all corners of the globe. We would be glad to hear from arts, humanities and social science scholars about how DORA can help them to promote responsible research assessment in their fields.
Lizzie draws a careful distinction between not using journal metrics to assess the quality of research outputs and using them to assess ‘visibility’. She is right to do so because there is a risk that using journal metrics, even for visibility alone, might send a subliminal message to researchers. It would be interesting to hear from Loughborough’s researchers how they interpret the guidance on these points.
This distinction is the basis of Lizzie’s argument that, because Loughborough wishes to incentivise its researchers to make their outputs more visible, they could not in good conscience sign DORA. I can see how that is an honest interpretation of the constraints of the declaration, but my own view is that it is too narrow. The preamble to the declaration lays out the argument for the need to improve the assessment of research ‘on its own merits’. This, and the thrust imparted by the particular recommendations of the declaration, shows that it is the misuse of journal metrics in assessing research content – not its visibility – that is the heart of the matter. It seems to me that Loughborough’s responsible metrics policy is therefore not in contravention of either the letter or the spirit of DORA.
In the end, as Lizzie rightly states, it is Loughborough’s call and, again, I am sure that Lizzie and I have in common a strong desire to promote good research assessment practices. I stand by what I wrote back in 2016, in a piece bemoaning the fact that so few UK universities had signed DORA:
“I would be happy for universities not to sign, as long as they are prepared to state their reasons publicly. They could explain, for instance, how their protocols for research assessment and for valuing their staff are superior to the minimal requirements of DORA. It’s the least we should expect of institutions that are ambitious to demonstrate real leadership.”