ORCIDs set to bloom in 2016

Have you got an ORCID identifier yet? You should. They’re on the rise – and for good reason.

An ORCID iD is a number (mine is 0000-0002-0552-8870) that unambiguously and persistently identifies you in the digital world of academia. It ensures that your research activities, outputs, and affiliations can be easily and correctly connected to you. ORCID iDs are currently used by over 200 research and workflow platforms – at universities and research institutions, at funding agencies, and at publishers – to identify researchers and connect them with their grants and papers.
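Incidentally, the last character of an iD is not arbitrary: it is a check character computed from the preceding fifteen digits using the ISO 7064 MOD 11-2 algorithm, so most typos are caught. Here is a minimal Python sketch of that published recipe (the function name is my own):

```python
def orcid_check_character(base_digits: str) -> str:
    """Compute the ORCID check character (ISO 7064 MOD 11-2)
    from the first 15 digits of an iD, hyphens removed."""
    total = 0
    for ch in base_digits:
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

# My own iD, 0000-0002-0552-8870, ends in '0' - and indeed:
base = "0000-0002-0552-8870".replace("-", "")[:15]
assert orcid_check_character(base) == "0"
```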

[Image: the author’s ORCID record]

Around the world over 1.8 million researchers have registered for an iD, many in the hope that it will enhance their digital discoverability and reduce their reporting paperwork. Several funders have started to require ORCID iDs as part of the grant proposal process. In the UK, the Wellcome Trust and the NIHR both do, and ORCID iDs are being integrated with Researchfish, the system used by the Research Councils to track grant outputs.

Universities are getting in on the act too. In 2014 my own institution, Imperial College, created ORCID iDs for every member of staff who didn’t already have one, unless they opted out. Very few did so.

The number of publishers using ORCIDs is also on the rise. Today a group of eight publishers have announced that, beginning in 2016, they will require authors to use an ORCID identifier (iD) during the publication process. These are AAAS (publishers of the Science stable of journals), American Geophysical Union (AGU), eLife, EMBO, Hindawi, the Institute of Electrical & Electronics Engineers (IEEE), the Public Library of Science (PLOS) and the Royal Society. The Royal Society has been quickest off the mark, making ORCID iDs a requirement for authors as of New Year’s Day. The rest will follow suit at various dates throughout 2016.

With luck, this move will soon spur other publishers to join in.

In a digital world ORCID iDs make a great deal of sense. Their adoption by institutions and publishers fulfils two of the recommendations made in The Metric Tide, last year’s report of the Independent Review of the Role of Metrics in Research Assessment and Management (of which I was a co-author). If we are going to track outputs, we might as well do it systematically and efficiently. I look forward to the day when interacting with Researchfish will be trivial rather than tedious, as at present. In theory, ORCID iDs might even reduce some of the burden of the ever-unpopular Research Excellence Framework (REF) – though of course we shall have to await the outcome of the government’s re-jigging of HEFCE and the Research Councils before the contours of the next REF become clear. I wouldn’t hold my breath.

Of course, by automating the digital tracking of inputs and outputs, ORCID iDs raise the risks associated with unthinking uses of metrics – something that The Metric Tide was keen to warn against. On that front we need to remain vigilant. But on a personal level, most of us want to be recognised for our work and have an interest in making sure that our published outputs are recorded accurately. The ORCID system also provides a handy way of keeping track of your published work. Thanks to the good offices of Europe PubMed Central, you can use your ORCID iD to follow the open citations to your work. Here, in the interests of transparency, are mine: https://europepmc.org/authors/0000-0002-0552-8870. The profile is perhaps not as complete as that provided through Google Scholar but it is at least open.
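If you prefer raw data to a web profile, the same information can be pulled programmatically. Below is a hedged sketch in Python against Europe PMC’s public REST API; the endpoint and the AUTHORID and citedByCount field names are my reading of the service’s documentation (https://europepmc.org/RestfulWebService), so do check there before building anything on top of this:

```python
import requests

# List the papers linked to an ORCID iD in Europe PMC, with their
# open citation counts. A sketch, assuming the documented REST API:
# search endpoint, AUTHORID query field, citedByCount result field.
ORCID = "0000-0002-0552-8870"

resp = requests.get(
    "https://www.ebi.ac.uk/europepmc/webservices/rest/search",
    params={
        "query": f'AUTHORID:"{ORCID}"',
        "resultType": "core",  # 'core' results carry citation counts
        "format": "json",
        "pageSize": 100,
    },
)
resp.raise_for_status()
for paper in resp.json()["resultList"]["result"]:
    print(paper.get("citedByCount", 0), paper.get("title", ""))
```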

If you want an ORCID iD of your own, simply sign up and use the tools to identify yourself and your work (papers, conference proceedings, patents – anything). You can also add your grants, and your education & employment history. For all of the information entered, it’s up to you how much to make publicly visible.

Update (2016-01-07, 14:02): This post was modified to mention the fact that the AAAS will also be requiring ORCID iDs in 2016.


Henry Gee’s top ten reads of 2015

At the tail end of 2015 I reviewed the 23 books that had entertained and enlightened me over the course of the year. My friend Henry Gee, formerly of this parish, managed nearly twice that number. In a guest post full of his characteristic wit, he lists his top ten. 

On going through the list of 41 books read this year, I was amazed to come across titles I’d completely forgotten I’d read (which single fact shows the benefit of keeping a notebook). Some of these books were terrific, by super authors – but they seem to have gone down without touching the sides. Others were old favourites, though most were new (to me, anyway). Two, including Nick Lane’s ‘The Vital Question’, were sent to me to review by The Literary Review. This is such a fine magazine that I bought my father (a big reader) a subscription. It must be good. After all, they ask me to write things for them.

So here, without further ado, is my shortlist of those titles that in my opinion stand out from the crowd, in that they managed to be enjoyable at the time and are likely to remain long in the memory. All of these were new to me. I have counted them down from ten, though of course such a ranking is pretty rough, especially as the books cover a wealth of subjects and styles.

10. Neil MacGregor – A History Of The World In 100 Objects
The chief panjandrum of the British Museum uses objects from that venerable Institution’s vast collections to tell the history of human culture from the earliest times to the present day. It’s the kind of book you feel you should dip into, but when you start you just can’t stop, and end up reading it like a novel, wondering what kind of object awaits you beyond the next page. The fascinating histories of the individual objects and the wider context allow one to forgive a tendency to preachy political correctness here and there.

9. Nick Lane – The Vital Question
The insides of living cells, whether from bacteria or human beings, look remarkably similar, right down to the level of molecules. It’s something we tend to take for granted, but the reason why is perhaps the single most important unanswered question in biology. Scientist Nick Lane goes into the reasons, and in so doing pulls out enthralling conjectures and perhaps unstoppable hypotheses. That he does so with a brio that threatens to boil over is part of the book’s charm, although it repays very careful reading even for those who have a scientific background. An important book.

8. Steve Silberman – Neurotribes
The fact that I have a daughter diagnosed with Asperger’s (notwithstanding inasmuch as which I score well over 30 on Simon Baron Cohen’s Asperg-o-meter) naturally drew me to this comprehensive history of Asperger’s and autism, from Hans Asperger’s ‘Little Professors’ in pre-war Vienna right up to modern-day SF fandom. I knew quite a lot of this already, but before this book nothing I’d read pulled it all together. It’s not Silberman’s fault that I find his particular way with words somewhat grating (he is a journalist for the US magazine Wired and writes in a hipster, bloggy style) but I soon overcame this. What struck me most about the subject was how much of what we know about mental illness comes from Jewish doctors who fled the Nazis and flourished in the US – and also how such history, as much as medical knowledge, has shaped our appreciation of what constitutes personality traits and mental health.

7. David Adam – The Man Who Couldn’t Stop
A brave and frank account of a crippling affliction by – as it happens – a colleague of mine at Nature. Adam suffers from Obsessive Compulsive Disorder, and in this short book he tells his own story interspersed with what we know about its genesis and the various ways that mental illness is classified. Adam is now doing well after a long battle, but the wider story is not so happy, for it is apparent (and I know this from my own experiences with depression) that the treatment of mental illness has barely advanced beyond the leeches-and-bloodletting stage.

6. Ben Elton – The First Casualty
I first came across Ben Elton back in the 1980s when he was a comedian, and some of his novels (especially ‘Gridlock’) are side-splittingly funny. ‘The First Casualty’ is very much darker. It concerns a somewhat priggish and sanctimonious policeman, who, in the First World War, finds that a strict regard for logic and truth leaves him the wrong side of the law. Jailed as a conscientious objector he ends up released to investigate a case of murder on the Western Front in the hell of Passchendaele in 1917. The novel brings the horror of trench warfare to life like no other book I have ever read, the aim being to raise questions of what constitutes murder, when people are being blown to bits in the cause of war. The novel stays just the right side of preachy, and is an engrossing and terrifying read.

5. Robert Harris – An Officer And A Spy
A retelling of the Dreyfus Affair as the imagined memoir of one of the (real) people who was there at the time. Almost all the characters really existed, and the events happened more or less as stated. Harris is one of those thriller writers who really does their research and yet wears it lightly, turning the story into a terrific and memorable read. His best since ‘Pompeii’.

4. Virgil (trans. W. F. Jackson Knight) – The Aeneid
I’ve been revisiting the classics this year – Gilgamesh, the Iliad, the Odyssey – so one of these really had to make the cut. I loved Homer, but Virgil (in this English prose translation) was a real surprise. What stands out is the fusion of lush, lyrical phrasing with what – when it comes down to it – is unremitting and graphic carnage. Virgil took Homeric ingredients and polished them to an even more lustrous sheen. Those who criticize as inauthentic a movie such as ‘300’ – a highly stylized rendering of the Battle of Thermopylae – fail to understand that it pays much greater homage to the source material than people might imagine.

3. Wilkie Collins – The Woman In White
I read this after reading a modern novel cast as a memoir by Wilkie Collins (Dan Simmons’ ‘Drood’ – see below). When I did, I could only ask myself where this Victorian ‘sensation’ novelist and friend of Charles Dickens had been all my life. True, a long time ago I had started ‘The Moonstone’ but hadn’t got far. Perhaps now I am old and wizened enough to appreciate this mixture of terrific writing, trashy melodrama, highly contrived whodunit and gothic styling. The story is a rather ropey mystery but what stand out are the characters (the creepy Count Fosco is the best). After reading this you realize that Collins was something of a pioneer. ‘The Woman In White’ was a hit when it was first published in 1860. Writers of mystery with a yen for the gothic, from Arthur Conan Doyle to Agatha Christie to Daphne du Maurier, are in Collins’ debt.

2. Charles Dickens – The Pickwick Papers
When people say that they don’t much like Dickens my usual response is that I never get invited to that kind of party. Like most people these days, I had only ever read Dickens because his novels featured on school examination syllabi. I read parts of ‘Great Expectations’ as a schoolboy; ‘Hard Times’ for my own A-levels; and picked up ‘Oliver Twist’ as it was a set book for my sister’s O-levels. Only later did I realize how much I had enjoyed these, so after having read Dan Simmons’ ‘Drood’ and Wilkie Collins’ ‘The Woman In White’ (see elsewhere in this essay) I felt I needed a Dickensian education. ‘The Pickwick Papers’ was Dickens’ first novel, a rollicking portrait of England in the 1820s, before Victoria and most of all before the railways completely changed the face of England, when people got around by stagecoach and stayed at coaching inns. The social background was, for me, quite an education (helped immeasurably by editorial notes from Mark Wormald: mine was the Penguin Classics edition). The story – it’s more a soap opera – is basically one damn thing after another as Mr Pickwick and his friends get themselves in and out of various comedic scrapes. It starts a bit chaotically and only gets into its stride when we meet Sam Weller, Mr Pickwick’s valet, confidant, source of homespun wisdom and everyday superhero, a cross between Falstaff and Sam Gamgee. The various meetings between Sam and his father are pure comedy gold, like Peter Cook and Dudley Moore, Ronnie Barker and Ronnie Corbett – or Johnny Depp’s Jack Sparrow and Keith Richards in the ‘Pirates of the Caribbean’ movies. Amazing to think that Dickens wrote it when he was twenty-four. With little of the tub-thumping social crusading of his later works, Pickwick is pure enjoyment.

1. Dan Simmons – Drood
Another tale – like ‘An Officer And A Spy’ – done as an imagined memoir of real characters and events. This time the narrator is the novelist, hypochondriac and opium addict Wilkie Collins, and the tale is of the last years of his friend Charles Dickens. Brilliantly researched, it manages to be playful and fantastical within the confines of history. Simmons uses Collins’ opium addiction to give the fabric of reality a thorough work-out: reminiscent of Peter Shaffer’s treatment of Mozart and Salieri in his play ‘Amadeus’. Only more gothic. Lots more gothic. (I do LOVE gothic.) This novel got me back into the classics – I read ‘The Woman In White’ immediately afterwards, and ‘The Pickwick Papers’ soon after (though I have yet to tackle Dickens’ last, unfinished novel, ‘The Mystery of Edwin Drood’). I award ‘Drood’ the enviable accolade of my Read of the Year.

 


ICYMI No.1: Preprints for biologists

Since I have developed a habit of writing elsewhere, which necessarily takes time and words away from the blog here at Reciprocal Space, I thought I would try to make amends by also getting into the habit of linking to the pieces that appear in other corners of the internet.

To kick off, therefore, permit me to alert you to a short article published this week in The Biologist, the house magazine of the Royal Society of Biology. The piece – The power of preprint – follows from an earlier article on this topic in the Guardian and reprises the call, by myself and others, for more of us in the life sciences to adopt the practice of publishing our research quickly in preprint form. It’s worked in many branches of physics, maths and computer science for many a long year and I see no reason why we should deny ourselves the benefits of preprints in other research disciplines. And nor does Ron Vale, as I mentioned here a few months back.


Jolly good fellows: Royal Society publishes journal citation distributions

Full marks and a side order of brownie points for the Royal Society: they have started publishing the citation distributions for all their journals.

This might seem like an unusual and rather technical move to celebrate but it matters. It will help to lift the heavy stone of the journal impact factor that has been squeezing the life out of the body scientific. The Royal Society has now joined the EMBO Journal in committing to be more transparent about the origins of this dubious and troubling metric.

I don’t wish to rehearse the details in full since I have previously described the pernicious effects of scientists’ and publishers’ obsession with journal impact factors and the value of making citation distributions available. But in brief: I hope that the ready availability of these distributions – to show the skew and variation of the citation performance of the papers in any journal – will enable researchers to develop a more sophisticated approach to the evaluation of the work of our peers.

The image below shows the citation distribution for the Royal Society’s Proceedings B journal – you can find the original by clicking on the ‘Citation Metrics’ link in the ‘About Us’ tab. As is the case for every academic journal, it shows that the impact factor, an arithmetic mean, is an indicator that over-estimates typical performance and conceals a huge range in citation counts.

[Figure: citation distribution for Proceedings B]

As I wrote back in June:

…the IF is a poor discriminator between journals and a dreadful one for papers. Publishing citation distributions therefore directs the attention of anyone who cares about doing evaluation properly back where it belongs: to the work itself.

So three cheers for the Royal Society for having the courage to be so open with these data!

Now: who’s next?

I have been discussing the idea of making citation distributions available with a number of other publishers and have heard some encouraging noises. It seems likely that there will be further significant moves in this direction in the new year, which I will be happy to report.

I dare to be optimistic that before too long the practice may become widespread, and that we may have at our disposal a tool that helps us to do a better job of assessing research and researchers. This is by no means a revolution and we all know that old habits die hard. Even so, this is a step in the right direction and I will take what I can get.

Update (05/12/15, 00:55): Better late than never, but I really should have thanked the Royal Society’s publishing director, Dr Stuart Taylor, for taking this initiative forward.

Update (05/12/15, 01:05): Well I didn’t have to wait long to find out which would be the next journal to join the club. Nature Chemistry’s editor Dr Stuart Cantrill crunched the numbers, posted the distribution and has written up his analysis on the Sceptical Chymist blog. He tells me there’ll be a link to his post from the journal homepage when it’s next updated.

Update (11/12/15, 11:26): Stuart Cantrill has clearly caught the citation distribution bug. He’s now also done a very nice comparative analysis of a selection of chemistry journals. It shows, as expected, that the distributions are all approximately the same shape and helps to reinforce the message that journals with low impact factors can still be relied on to have papers that attract large numbers of citations.


Structural Biology: a beginner’s guide?

I got impatient waiting for my latest review article to come out, so here it is. The scheduled publication date has slipped twice now without the publisher getting in touch to explain why. The latest I’ve heard, after querying the editor who commissioned the piece, is that it will be out by the end of the month. But I’ve paid my £500 fee to make the work open access and don’t see any good reason to delay further.

My review, titled ‘Structural Biology: a century-long journey into an unseen world’, is a contribution to an upcoming issue of Interdisciplinary Science Reviews that will commemorate the centenary of the 1915 Nobel prize in physics awarded to William and Lawrence Bragg, the father and son team that first used X-rays to ‘see’ the atomic structure of matter. It traces the developments in structural biology that over the past 100 years – with and beyond X-rays – have revealed to us the fascinating molecular world that lies beneath our senses.

As befits an interdisciplinary journal, I tried to write my review for a general readership, which I hope to broaden further by making it available here. I doubt I have freed myself from all the bonds of the scholarly habit of writing but I hope the article might appeal to the interested amateur. As a taster, here are the opening paragraphs:

When Orville Wright took off in the Flyer on a grey morning in December 1903 and flew for all of 12 seconds across the sands near Kitty Hawk in North Carolina, little could he have suspected that by 1969 powered flight would land Neil Armstrong and Buzz Aldrin on the Moon. Humankind’s first foray onto another world remains for many people one of the greatest technological achievements of the 20th Century. But within the 66 years it took to get from Kitty Hawk to Tranquillity Base another equally remarkable technological – and scientific – journey took place, one that has brought us to a very different destination.

In 1912, in experiments initiated by Max von Laue in Germany and successfully analysed by William and Lawrence Bragg in England, X-rays were first used to peer into the atomic structure of crystalline matter. By the end of the 1950s X-ray crystallography had leapt from physics to chemistry to biology and the atomic architecture of DNA and several proteins had been revealed, giving us the first glimpses of a molecular landscape that was no less surprising and no less strange than the surface of the Moon. It had taken just five decades for structural biology to emerge as a fledgling discipline. In the five that have since elapsed the field has grown vigorously, thanks not only to developments in X-ray crystallography but also to the emergence of complementary techniques that have used other physical phenomena to lift the veil on an unseen world – the atomic and molecular matrix of life.

If you want to know more, you can download the PDF (8.6 MB).

Update (11 Dec, 15:46): This article was finally published by Interdisciplinary Science Reviews at the end of November. It’s open access so you can now download the journal-formatted version for free.

 


Lunacy and sanity

It’s less than 24 hours since the eclipse, so this still counts as a timely post.

I guess I had been primed because I had been thinking about it. But although I hadn’t set my alarm I found myself awake at 02:52 on Monday morning – I can still see the digital display – and so I got up, checked out the window that the moon was visible (it was – and already mostly eclipsed), dressed and hurried downstairs, grabbing my camera and binoculars on the way.

To my disappointment I realised after a quick search that my camera tripod had been left at the office, and resigned myself to hand-held, or at least fence-supported, photography.

No matter. The point was to enjoy the moment. We drove to France to catch the solar eclipse in 1999, and I rose at dawn on a sunny day in June 2012 to witness the last transit of Venus of my lifetime. Having woken, I was determined to experience the eclipse of the supermoon.

What is it about these instances of celestial mechanics that is so appealing? I think it may be that they lay bare the machinery of the cosmos. They establish a connection that is understood, known – but otherwise invisible. They are special moments of conjunction – not the fantastical nonsense of the astrologers, but conjunctions in the sense that reality is (as we too often forget) astonishing and fantastic.

And so, last night, I watched the curve of the Earth slide across the face of the moon. I participated in the geometry of our place in the solar system, an experience made more real, but also more surreal, by the red, angry aspect the moon took on as it descended into full eclipse. For all the world, it might last night have been made of molten iron. In a magical and scientific moment the familiar was made unfamiliar.

But still and all it made perfect sense. The blood moon is no more than the product of Newton’s cosmic clockwork, whatever Einstein might protest. As am I, a cog in the wheel of my life, the machinations of which I discern less clearly than I might wish. But if every now and then I can apprehend the unfathomable mystery of the heavens by observing its predictable machinations, then perhaps there is also something to hold on to, for the sake of my own sanity.


Ch-ch-ch-changes…

There’s a very real chance this could turn out to be an actual blogpost. In the original sense of the word: a web-log of what’s been happening.

Posts have been rather sparse on Reciprocal Space of late. That’s not for want of words. It’s just that they have been expended elsewhere – over at the Guardian, in pieces about photography, preprints, the latest Science is Vital campaign (please join in) or Nicole Kidman’s performance as Rosalind Franklin in Anna Ziegler’s intelligent Photograph 51; or in the Times Higher, writing about my term of office as Director of Undergraduate Studies (DUGS); or the article I wrote on the use and abuse of metrics (PDF) in research evaluation for the Indian journal Current Science (my first foray onto the sub-continent); or in the latest paper to come from my lab, currently in revision but already available on the bioRxiv.

This dispersion of words away from my original blog home, which passed its seventh anniversary earlier this month by the way, is symptomatic of the calls of scientific duty, but also of the churn of events. Unintended outcomes that flow from the simple fact of having set out in a particular direction. I’m not complaining (or apologising – I said I would never apologise), just observing, though I didn’t intend for things to become so fallow at Reciprocal Space.

These are not the only changes in the offing. Yesterday evening I trudged home through a grey wet veil – Autumn’s warning to Summer that it is time to go. The sense of transition was reinforced this past weekend as we delivered our youngest to university. All three of our children have now flown the nest. My wife and I looked at each other. “What are we going to do now?”

Plenty, we hope. For myself I look forward to having more time for science. As I wrote in the Times Higher, at the end of this month I will step out of the heavy harness I have been wearing as DUGS. Truth be told, even as the contours of the new term take shape on the horizon, I already feel the administrative burden slipping from my shoulders. Not only will I not have to deal with the myriad tasks demanded by that role, but I will be entering a sabbatical year that I hope more than anything will give me a chance to think.

That thinking time is long overdue thanks to the familiar but irregular movements known to anyone absorbed by a life in science. Another research grant has just come to an end, so another postdoc is leaving the lab and moving to pastures new. This is a good move for her but my task now is to refill the funding pot, a far from trivial endeavour. Money’s tight and the noises coming from the government are not encouraging. I need to dive once more into the waves of innovation and discovery. For crystallographers like myself, the spectacular rise of new techniques in cryo-electron microscopy is a challenge, but also an opportunity.

These moments of transition come around again and again. The scientific life is one of motion. The plates shift on the hot mantle. Looking up, I can see that the landscape has changed. But that’s OK: it is something to explore.

 


Pre-prints: just do it?

There is momentum building behind the adoption of pre-print servers in the life sciences. Ron Vale, a professor of cellular and molecular pharmacology at UCSF and Lasker Award winner, has just added a further powerful impulse to this movement in the form, appropriately, of a pre-print posted to the bioRxiv just a few days ago.

If you are a researcher and haven’t yet thought seriously about pre-prints, please read Vale’s article. It is thoughtful and accessible (and there is a funny section in which he imagines the response of modern-day reviewers to Watson and Crick’s 1953 paper on the structure of DNA). His reasoning is built on the concern that there has been a perceptible increase over the last thirty years in the amount of experimental data – and therefore work – required for PhD students to get their first major publication. Vale argues that this is a result of the increased competition within the life sciences, which is focused on restricted access to ‘top journals’ and is in turn due to the powerful hold that journal impact factors now have over people’s careers. Regular readers will know that the problems with impact factors are a familiar topic on this blog (and may even be aware that their mis-use was one of the issues highlighted in The Metric Tide, the report of the HEFCE review of the use of metrics in research assessment that was published last week).

Michael Eisen has written a sympathetic critique of Vale’s paper. He takes some issue with the particulars of the arguments about increased data requirements but nevertheless espouses strong support for the drive – which he has long pursued himself – for more rapid forms of publication.

I won’t bother to rehearse Eisen’s critique since I think it is the bigger picture that warrants most attention. This bigger picture – the harmful effect of the chase after impact factors on the vitality and efficiency of the scientific community – emerged as a central theme at the Royal Society meeting convened earlier this year to discuss the Future of Scholarly Scientific Communication. At that gathering I detected a palpable sense among attendees that the wider adoption of pre-print servers would be an effective and feasible way to improve the dissemination of research results (for more background, see proposal 3 towards the bottom of my digest of the meeting).

Vale’s article does an excellent job of articulating the support for pre-prints that bloomed at the Royal Society meeting. I urge you again to read it. But for the tl;dr (“too long; didn’t read”) crowd, here’s the key section on the arguments for pre-prints (with my emphases in boldface)*:

1) Submission to a pre-print repository would allow a paper to be seen and evaluated by colleagues and search/grant committees immediately after its completion. This could enable trainees to apply for postdoctoral positions, grants, or jobs earlier than waiting for the final journal publication. A recent study of several journals found an average delay of ~7 months from acceptance to publication (33), but this average depended upon the journal and the review/revision process can take longer on a case-by-case basis. Furthermore, this time does not take into account rejections and the potential need to “shop” for a journal that will publish the work.

2) A primary objective of a pre-print repository is to transmit scientific results more rapidly to the scientific community, which should appeal to funding agencies whose main objective is to catalyze new discoveries overall. Furthermore, authors receive faster and broader feedback on their work than occurs through peer review, which can help advance their own studies.

3) If widely adopted, a pre-print repository (which acts as an umbrella to collect all scientific work and is not associated** with any specific journal) could have the welcome effect of having colleagues read and evaluate scientific work well before it has been branded with a journal name. Physicists tend to rely less on journal impact factors for evaluation, in part, because they are used to reading and evaluating science posted on arXiv. Indeed, some major breakthroughs posted on arXiv were never published subsequently in a journal. The life science community needs to return to a culture of evaluating scientific merit from reading manuscripts, rather than basing judgment on where papers were published and hence outsourcing the career evaluation process to journals.

4) A pre-print repository may not solve the “amount of data” required for the next step of journal publication. However, it might lower the bar for shorter manuscripts to be posted and reach the community, even if an ensuing submission to a journal takes longer to develop.

5) A pre-print repository is good value in terms of impact and information transferred per dollar spent. Compared to operating a journal, the cost of running arXiv is low (~$800,000 per year), most of which comes from modest subscription payments from 175 institutions and a matching grant from the Simons Foundation. Unlike a journal, submissions to arXiv are free.

6) Future innovations and experiments in peer-to-peer communication and evaluation could be built around an open pre-print server. Indeed, such communications might provide additional information and thus aid journal-based peer review.

7) A pre-print server for biology represents a feasible action item, since the physicists/mathematicians have proof-of-principle that this system works and can co-exist with journals.

The last point is perhaps the most important. Publishing pre-prints is a feasible step. I have started to do it myself in the past year (partly motivated by deals offered by PeerJ) and it is a practice that I intend to continue.

But the key will be to get more and more life scientists to adopt the pre-print habit. Leadership on this from senior figures – academicians, fellows of the Royal Society, prize winners and the like – will help. Institutional support from funders and universities, by which I mean putting in place incentives for rapid communication of results, could also be important. The rest of us have to take this idea seriously and at the very least be willing to debate the pros and cons – I would welcome that discussion.

Or you could see the sense of publishing pre-prints and just do it.

 

*Thanks to Ron Vale for permission to reproduce this section of his paper.

**In Vale’s article this phrase is ’not unassociated’ but I suspect that’s a typo.


Data not shown: time to distribute some common sense about impact factors

It’s that time of year when all clear-thinking people die a little inside: the latest set of journal impact factors has just been released.

Although there was an initial flurry of activity on Twitter last week when the 2015 Journal Citation Reports* were published by Thomson Reuters, it had died down by the weekend. You might be forgiven for thinking that the short-lived burst of interest means that the obsession with this damaging metric is on the wane. But this is just the calm before the storm. Soon enough there will be wave upon wave of adverts and emails from journals trumpeting their brand new impact factors all the way to the ridiculous third decimal place. So now is the time to act – and there is something very simple that we can all do.

For journals, promotion of the impact factor makes a kind of sense since the number – a statistically dubious calculation of the mean number of citations that their papers have accumulated in the previous two years – provides an indicator of the average performance of the journal. It’s just good business: higher impact factors attract authors and readers.
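For anyone who has never seen the calculation written down, it is just a ratio. The 2015 release, for instance, reports for each journal

\[
\mathrm{IF}_{2014} \;=\; \frac{\text{citations received in 2014 by items published in 2012 and 2013}}{\text{number of citable items published in 2012 and 2013}}
\]

Note that the numerator counts citations to everything the journal published, while the denominator counts only ‘citable items’ (chiefly research articles and reviews) – one of several reasons the number deserves scrutiny.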

But the invidious effects of the impact factor on the business of science are well-known and widely acknowledged. Its problems have been recounted in detail on this blog and elsewhere. I can particularly recommend Steve Royle’s recent dissection of the statistical deficiencies of this mis-measure of research.

There is no shortage of critiques but the impact factor has burrowed deep into the soul of science and is proving hard to shift. That was a recurrent theme of the recent Royal Society meeting on the Future of Scholarly Scientific Communication which, over four days, repeatedly circled back to the mis-application of impact factors as the perverse incentive that is at the root of problems with the evaluation of science and scientists, with reproducibility, with scientific fraud, and with the speed and cost of publishing research results. I touched on some of these issues in a recent blogpost about the meeting (you can listen to recordings of the sessions or read a summary).

The Royal Society meeting may have considered the impact factor problem from all angles, but it discovered once again – unfortunately – that there are no revolutionary solutions to be had.

The San Francisco Declaration on Research Assessment (DORA) and the Leiden Manifesto are commendable steps in the right direction. Both are critical of the mis-use of impact factors and foster the adoption of alternative processes for assessment. But they are just steps.

That being said, steps are important. Especially so if the journey seems arduous.

Another important step was made shortly after the Royal Society meeting by the EMBO Journal and is one that gives us all an opportunity to act. Bernd Pulverer, chief editor of EMBO J., announced that the journal will from now on publish its annual citation distributions, which comprise the data on which the impact factor is based. This may appear to be merely a technical development but it marks an important move towards transparency that should help to dethrone the impact factor.

 

[Figure: citation distribution for the EMBO Journal]

 

The citation distribution for EMBO J. is highly skewed. It is dominated by a small number of papers that attract lots of citations and a large number that garner very few. The journal publishes many papers that attract only 0, 1 or 2 citations in a year and a few that have more than 40. This is not unusual – almost all journals will have similarly skewed distributions – but what it makes clear are the huge variations in citations that the papers in any given journal attract. And yet all will be ‘credited’ with the impact factor of the journal – around 10 in the case of EMBO J.
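To see just how misleading the mean of such a distribution is, here is a toy illustration in Python. The citation counts are invented for the purpose – not EMBO J.’s actual data – but they have the typical skewed shape:

```python
import statistics

# Invented citation counts for 20 papers in one journal: many papers
# earning a handful of citations, a couple of heavily cited outliers.
citations = [0, 0, 1, 1, 1, 2, 2, 2, 3, 3,
             4, 4, 5, 6, 7, 9, 12, 25, 48, 65]

print("mean (the impact-factor-style figure):", statistics.mean(citations))
print("median (what a typical paper gets):", statistics.median(citations))
```

Two heavily cited papers are enough to pull the mean up to 10 – the same ballpark, coincidentally, as EMBO J.’s impact factor – while the median paper earns just 3.5 citations.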

By publishing these distributions, the EMBO Journal is being commendably transparent about citations to its papers. It is not just a useful reminder that behind the simplicity of reducing journal performance to a single number is an enormous spread in the citations attracted by individual pieces of work. As Steve Royle’s excellent analysis reveals, the IF is a poor discriminator between journals and a dreadful one for papers. Publishing citation distributions therefore directs the attention of anyone who cares about doing evaluation properly back where it belongs: to the work itself. The practice ties in nicely with articles 4 and 5 of the Leiden Manifesto.

So what can you do? Simple: if in the next few weeks and months you come across an advert or email bragging about this or that journal’s impact factor, please contact them to ask why they are not showing the data on which the impact factor is based. Ask them why they are not following the example set by the EMBO Journal. Ask them why they think it is appropriate to reduce their journal to a single number, when they could be transparent about the full range of citations that their papers attract. Ask them why they are not showing the data that they rightly insist authors provide to back up the scientific claims in the papers they publish. Ask them why they won’t show the broader picture of journal performance. Ask them to help address the problem of perverse incentives in scientific publishing.

*The title is somewhat confusing since the 2015 JCR contains the impact factors calculated for 2014.


Can we amend the laws of scholarly publication?

As part of its celebrations to mark the 350th anniversary of the publication of Philosophical Transactions, the world’s longest-running scientific journal, the Royal Society has organised a conference to examine ‘The Future of Scholarly Scientific Communication’. The first half of the meeting, held over two days last week, sought to identify the key issues in the current landscape of scholarly communication and then focused on ways to improve peer review. In the second half, which will be held on 5-6th May, the participants will examine the problem of reproducibility before turning attention to the future of the journal article and the economics of publishing.

The luminaries who assembled in the Royal Society’s commodious headquarters in Carlton House Terrace included publishers, academics, representatives from funding agencies and learned societies, several journalists, and a smattering of FRSs and Nobel laureates – all well versed in the matter at hand. A report of the deliberations will be published in due course but I wanted to work through a central problem that surfaced repeatedly in the meeting last week.

[Image: portrait of Tim Berners-Lee at the Royal Society, London]

The Royal Society’s portrait of Tim Berners-Lee, without whom we wouldn’t be having this conversation.

 

The central problem

The discussion on the first day – vividly live-blogged by Mike Taylor – was an attempt to define the challenges facing scholarly publishing in the 21st century and covered territory that will be familiar to anyone who has read up on open access. The debate kept circling back to the same basic point: the overweening influence of impact factors and prestige journals, which have academics and publishers locked in an unhealthy relationship that seems increasingly unable to harness the opportunities presented by a digital world. It turns out that pretty much everyone can articulate the present difficulties – the hard bit is finding workable solutions.

As I sat at the back of the room absorbing the proceedings, it struck me that the crux of the problem could be distilled into two laws of publishing. During the meeting I put them out on Twitter in a somewhat light-hearted way but there’s a serious point here.

First law: To get maximum credit for your work, publish in a journal with a high impact factor.

Second law: For maximum speed of publication, choose a well-run mega-journal with ‘objective’ peer review* or, better yet, a preprint repository.

The problem with these laws is that they are incompatible and have given rise to unintended consequences, the central one being the creation of a system in which incentives for researchers are lashed to a slow and unwieldy process of publication. As presently configured, academic publication is a competitive business in which journals vie with one another for the best-quality science, or at least for the papers they think will boost their impact factors, since that is one of the best ways to build a brand. As a result they have become the locus of the competition between researchers, who are now obsessed with the prestige points awarded by journals as the means to win jobs, promotion or funding. The ensuing chase for publication in ‘top journals’ retards publication because researchers are willing to submit to multiple rounds of submission, rejection and re-submission as they work their way down the hierarchy of titles.

There are other adverse effects (which will be considered below) but the solution to the problem has to be to weaken the coupling between the location of publication and the assessment of research and researchers.

That is easier said than done. The coupling cannot be broken in a free market because of the competition between journals, which is of course fuelled by the competition between researchers. The competitiveness inherent in the system is not necessarily a bad thing since it can act as a spur for scientists to do their best work. The problem is that publication in a particular journal is too easily seen as the final judgement on any given paper. If it’s in Nature (to choose a prominent example), it must be world-class – such is the argument that I have heard repeatedly in common rooms, grant panels and hiring committees. This line of reasoning is seductive because there is some truth in it – Nature publishes a lot of great science. Unfortunately for too many people the argument contains enough truth for them to ignore the fine detail and from there it is a short step to take the journal impact factor as a proxy for the quality of a given piece of research. But it is vital to tease out and neutralise all the problematic aspects of our over-reliance on journal prestige. There are big wins to be had for scholarly publication and for researcher evaluation if we succeed.

The simplistic but widespread view that good journals publish good papers is some distance from being the whole truth of the matter. Some papers get into Nature or Science because they catch a trendy wave but then fail the test of time – arsenic life, anyone? That’s not to shout down journal predilections for trendy topics, which seem to be inevitable amid the ebb and flow of scientific tides, but we should be mindful that they occur and calibrate our expectations accordingly. Some papers are published in top journals because they come from a lab at a prestigious university or one that has a good track record of publication and so may benefit from a type of hindsight bias among editors and reviewers**. Some get in because exaggerated claims or errors or fraud are not picked up in review. Again, that’s not necessarily the fault of editors or reviewers – especially in cases of fraud – but we need to acknowledge that a system that determines rewards on the basis of publication in certain journals will foster poor practice and fraudulent behaviour. This explains why retraction rates correlate so strongly with impact factor.

Just as important, many papers are rejected that, with a different set of reviewers or a different editor, might well have made the cut. Clearly the number of reviewers has to be restricted to keep the task of pre-publication review manageable*** but the stochastic nature of the decision-making process that results from relying on a small set of judgements is too often overlooked.

The selectivity problem is exacerbated in top journals, which restrict entry for logistical reasons – the number of printed pages – and for brand protection. Nature, for example, only has room for 8% of the manuscripts it receives. Such practices presently act as an arbitrary and unfair filter on the careers of scientists because the reification of journal prestige or impact factor shapes presumptions and behaviour. Those who fail the cut on a particular day are likely to be disadvantaged in any downstream judgments. The mark of achievement is no longer necessarily ‘a great piece of science’ but ‘a paper in Nature, Science, Cell etc’. I’ll say it again: these can often be the same thing. But not in every case and best practice has got to be to always, always, always evaluate the paper and not the journal it is published in.

No-one designed scholarly publication to have all these in-built flaws but the evident fact of the matter, widely agreed at the meeting, is that it has become a slow and expensive filtering service that distorts researcher evaluation and is no longer doing an optimal job of disseminating scientific results.

As you see, I too can articulate the problem. But what about the solutions?

 

Possible ways forward

For the first morning of the meeting I felt myself slipping into a funk of despair as the discussion orbited the black hole of the impact factor. It has become so familiar in the cosmology of scholarly publication that people are resigned to its incessant pull.

However, that dark mood was slowly dispersed as occasional voices spoke insistently to press for answers. For me the bright spot of the conference was realising that the best way forward is likely to be in small pragmatic steps. Every so often one of these may turn out to be revolutionary. PLOS’s invention of the mega-journal based on ‘objective peer review’ was one such case. So here are three more steps that I think are feasible, if all the key players in scholarly communication can be persuaded to really take on board the wrongness of journal impact factors as a measure of quality.

ONE: Academics and institutions (universities, learned societies, funders and publishers) should sign up to the principles laid down in the San Francisco Declaration on Research Assessment (DORA). These seek to shift attention away from impact factors and to foster schemes of research assessment that are more broadly based and recognise the full spectrum of researcher achievements. Ideas for how to put such schemes in practice are available on the DORA web-site.

It is a real shame – and somewhat telling of the powerful grip of impact factors on scientific culture – that only a handful of UK universities have so far signed up to DORA (Sussex, UCL, Edinburgh and Manchester, at time of writing). I hope more will step up to their responsibilities after reading the warnings about the mis-use of the impact factor in the Leiden Manifesto on research metrics that was published in Nature just last week. I would like to think it will become a landmark document.

TWO: All journals with impact factors should publish the citation distributions that they are based on. The particular (and questionable) method by which the impact factor is calculated – it is an arithmetic mean of a highly skewed distribution, dominated in all journals by the small minority of papers that attract large numbers of citations – is not widely appreciated by academics. In the interests of transparency, and to take some of the shine off the impact factor, it makes sense for journals to show the data on which they are based, and to allow them to be discussed and re-analysed. Nature Materials has shown that this can be done. I invite journals that might object to a move that would improve authors’ understanding and help to promote fairer methods of researcher assessment to explain their reasoning.

The difficulty with this proposal, as pointed out at the meeting, is that the data are amassed by Thomson Reuters, who have so far refused to release them. This practice can now be declared contrary to the 4th principle of the Leiden manifesto, which holds that data collection and analytical processes should be open, transparent and simple. If Thomson Reuters are not interested in being part of the solution to the problem of impact factors, they have to be side-lined. Perhaps there are ways to crowd-source the task of assembling journal citation distributions? I would welcome suggestions from tech-savvy readers as to how this might be achieved in practice.
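To get the ball rolling, here is one hedged possibility in Python. It assumes Crossref’s public REST API behaves as documented at api.crossref.org – in particular, that per-article cited-by tallies are exposed as an is-referenced-by-count field. Crossref’s counts are not Thomson Reuters’ (the citation corpus differs, and the tallies are citations to date rather than citations received in a single census year), but for revealing the shape of a journal’s distribution that may not matter much:

```python
import collections
import requests

# Sketch: build a citation distribution for one journal from
# Crossref's public REST API. The endpoint, filters and the
# 'is-referenced-by-count' field are my reading of the docs.
ISSN = "0962-8452"  # Proceedings B, as an example

distribution = collections.Counter()
cursor = "*"  # Crossref's deep-paging cursor
while True:
    message = requests.get(
        f"https://api.crossref.org/journals/{ISSN}/works",
        params={
            "filter": "from-pub-date:2012-01-01,until-pub-date:2013-12-31",
            "select": "DOI,is-referenced-by-count",
            "rows": 1000,
            "cursor": cursor,
        },
    ).json()["message"]
    if not message["items"]:
        break
    for item in message["items"]:
        distribution[item.get("is-referenced-by-count", 0)] += 1
    cursor = message["next-cursor"]

for n in sorted(distribution):
    print(n, "citations:", distribution[n], "papers")
```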

THREE: The use of open access pre-print servers should be encouraged. These act as repositories for manuscripts that have yet to be submitted to a journal for peer review and have the potential to short-circuit the delays in publication caused by the chase for journal impact factors.

I’m not familiar with the situation in chemistry but know that preprint servers have come rather late to the life sciences, inspired by the arXiv that has been operational in many sub-disciplines in physics and maths since 1991. bioRxiv (pronounced ‘bio-archive’) has 1300 preprints, while the PeerJ Preprints tally has just passed 1000. These are low numbers – the arXiv has over a million depositions – but they are growing and there was such widespread support for preprints at the Royal Society meeting that I began to get a sense that they could be transformative.

Preprints speed up the dissemination of research results (albeit in an early form) – and that is likely to be healthy for the progress of science. Uptake will increase as more researchers discover that fewer and fewer journals disallow submissions that have been posted as pre-prints. Nay-sayers might contend that there are risks in disseminating results that have not been tested or refined in peer-review but such risks can be mitigated by open commentary. As noted in the meeting by bioRxiv’s Richard Sever, preprints tend to attract more comments than published papers because commenters feel that they can make a positive contribution to shaping a paper before it is published in its final form. Such open review practices have been harnessed successfully by the innovative open-access journal Atmospheric Chemistry and Physics.

Preprints may not solve the problem of impact factors, since the expectation is that most or all will eventually be submitted for competitive publication in a journal; however, they could offer some mitigation of their worst effects, especially if authors were to be recognised and rewarded for the speed at which they make their results available. That’s something for funders and institutions to think about. Moreover, because they have digital object identifiers and are citable objects, preprints provide a rapid route for establishing the priority of a discovery and a cost-effective way to publish negative results.

 

Finally…

There is more to say and more dimensions of the arguments made here to tease out but I will leave things there for now. All of the above may be no more than the optimistic after-glow of the meeting but I submit my three proposals for serious consideration. I hasten to add that none of them are original, and I know that I’ve written before about pretty much everything discussed above – but we have to keep these issues alive. To paraphrase Beckett, it’s our only hope of finding ways to fail better.

 

Notes

*’Objective peer review’ was the term adopted at the meeting to describe the type of peer review pioneered by PLOS ONE, which will accept papers that report an original piece of science that has been performed competently. The impact or future significance of the work is not considered. I am not comfortable with the term since this type of review is clearly not ‘objective’ but it will have to do for now.

**Nature has recently introduced a double-blind review to combat this. However, it is optional for authors and I find it hard to imagine a group that has a history of publication in Nature opting for it.

***Richard Smith (formerly editor of the BMJ), who spoke powerfully at the meeting to criticise peer review, and proponents of post-publication peer review will no doubt take issue here. 

 


Open access: a national licence is not the answer

Open Access: Is a national licence the answer?” is a proposal by David Price and Sarah Chaytor of University College London for a mechanism to provide full access to everyone within the UK to all published research. It was published on 31 March 2015 by the Higher Education Policy Institute (HEPI) whose director, Nick Hillman, wrote the foreword. 

The proposal is presented as a HEPI yellow “occasional paper”, so it is designed to be provocative and to stimulate debate rather than being, as Hillman writes, “a fully-formed ready-to-bake policy”. It is certainly provocative but so far there hasn’t been much debate. The paper provoked an angry, heartfelt riposte from Mike Taylor and a satirical one from David Kernohan. Hillman responded by accusing Taylor of wanting “to ‘close’ down debate about the different options without fully engaging”, but part of the problem is that the proposal itself does not fully engage with the complexities of the issue at hand, and this has made it difficult to grapple with. In my own mind the national licence idea has provoked so many thoughts that I have struggled to assemble them coherently but, in the interests of a fuller debate, let me have a go. I hope to amplify some of the key issues but am aware that there are further aspects that should be turned over for consideration. I confess that the issue of open access stirs the heart as well as the head, which can make it tricky to discipline arguments. This post is therefore rather long, so my apologies in advance.

 

The proposal

The principal aim of the national licence is to provide access to the research literature for stakeholders outside academia, since UK academics are already presumed to have excellent access (p3 – page numbers refer to the PDF of the proposal document). The idea is that some overarching body – perhaps JISC – should negotiate the terms of a UK licence on behalf of the major stakeholders, who are listed as “UK higher education institutions, SMEs (small and medium enterprises), UK medical institutions and NHS staff, charitable funders of research, public libraries and representatives of independent researchers.” (p14)

This is an entirely laudable goal, sharing many of the aims that the open access movement has sought to promote.

But the devil is in the detail and the problem here is that there is a troublesome lack of detail. In my view the proposal is built on questionable premises and argues from a selective and sometimes erroneous presentation of the evidence. It promotes the notion that a national licence is likely to be an efficient and cost-saving mechanism for providing access, but provides scant evidence to support that view. To its credit the latter part of the document makes some attempt to identify the challenges and risks of the proposal, but even here the analysis is incomplete. I agree with Hillman that the proposal is some way short of being ‘fully-formed’.

 

Questionable premises

The motivating idea behind the proposal for a national licence is that “the UK is offering global access to its own research via the gold route with no reciprocal offering from most other countries, including key competitors.” This statement is repeated twice in the document with the qualification ‘most’ (p4, p10) and once without qualification (p30). It is given twice without qualification in the HEPI blog post announcing the publication of the proposal.

However, at no point in the 30-page document making the case for a national licence do the authors choose to flesh out how exactly the UK is paying to give its research away for free while receiving next to nothing in return. This is an unfortunate omission because, as a result, their proposal misrepresents UK policy (by not clarifying what it is) and overlooks the evidence for reciprocity.

Throughout the document, Price and Chaytor refer to the ‘UK gold open access policy’ (or words to that effect) but as I am sure they are well aware, UK policy is not purely gold. To be clear, researchers in receipt of grant funding from one of the UK Research Councils are subject to RCUK policy, which has a preference for gold OA publication (immediate access via a journal, often subject to payment of an Article Processing Charge (APC)) but also permits green OA (access via a repository, often following an embargo, currently 6-24 months depending on funder). RCUK guidance on the policy makes it clear that “the choice of route to Open Access remains with the researchers and their research organisations”.

In addition to the RCUK policy, from 2016 all UK researchers in higher education institutions overseen by HEFCE will be required to ensure that their papers are made available via a repository. Therefore, while it is true that the UK is promoting gold OA and has carved out funds from the research budget to pay for it, there is nevertheless a strong green OA flavour to UK policy.

The notion of lack of reciprocity also needs to be challenged. The UK may be in a minority in having a policy that prefers gold OA but it is not alone. Norway has recently announced a similar policy and it is worth pointing out that major international funders and research organisations have also developed gold-favouring OA policies, including the Wellcome Trust, CERN, the World Health Organisation, the Howard Hughes Medical Institute and the Gates Foundation. It should also be borne in mind that most of the research-active nations around the world have, or are developing and strengthening, green OA policies that are effectively increasing the proportion of research that is free to read online. In many developed nations, even where mandates are for green OA at minimum, researchers often choose to make their research available via gold OA. The evidence for this is not hard to find: in a quick analysis of papers from a selection of PLOS journals (via a PubMed search for country of affiliation), I found that authors in Germany publish comparable numbers of papers to their UK counterparts, while US-affiliated authors publish 2-4 times as many. A 2012 study of the worldwide availability of research showed that there is strong growth in OA in all parts of the world, especially in the EU, Asia and North America (see Fig. 3). The notion that there is little effective reciprocity on OA from the international community of researchers doesn’t stand up to scrutiny.
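
For anyone who wants to repeat that kind of quick comparison, searches of this sort can be scripted against NCBI’s public E-utilities interface to PubMed. The short Python sketch below is only illustrative – the journal, countries and year are placeholder assumptions rather than the exact terms behind the figures quoted above:

import json
import urllib.parse
import urllib.request

# NCBI's public E-utilities search endpoint for PubMed.
EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def count_papers(journal, country, year):
    """Count PubMed records for a given journal, affiliation country and year."""
    term = f'"{journal}"[Journal] AND {country}[Affiliation] AND {year}[PDAT]'
    query = urllib.parse.urlencode({
        "db": "pubmed",
        "term": term,
        "rettype": "count",   # ask only for the record count
        "retmode": "json",
    })
    with urllib.request.urlopen(f"{EUTILS}?{query}") as response:
        result = json.load(response)
    return int(result["esearchresult"]["count"])

# Illustrative comparison; the journal and year here are assumptions.
for country in ("United Kingdom", "Germany", "United States"):
    print(country, count_papers("PLoS ONE", country, 2014))

Bear in mind that affiliation searching only captures the author addresses that PubMed indexes, so counts of this kind are indicative rather than exact.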

Even if the case for lack of reciprocity had held, it is simplistic to argue that a gold-favouring OA policy risks the future economic well-being of the UK. Mere access to research is not sufficient for stoking the engines of innovation and economic growth, though of course it will help. As anyone who has looked into the links between research and economic development must realise, the interactions are many, diverse, non-linear and interconnected (e.g. see this post from 2010). The UK has not emerged as a strong developed nation because it had access to research, and nor does its future economic strength depend simplistically on continued access. To develop as a knowledge-based economy, one needs an educated population, strong universities, good contacts between universities and industry, effective funding mechanisms to support the difficult transition mechanism from the lab to the marketplace. The UK already does this pretty well – though a reversal of the decline in the R&D budget wouldn’t go amiss. But it also stands to benefit as other nations develop, which is one of the reasons for the establishment and expansion of the EU, and for the UK’s commitment to overseas aid (now commendably pegged at 0.7% of GDP). The country rightly sees itself as a member of a community of nations. Consistent with that view, should its commitment to OA not be envisaged as part of a worthwhile global project – a rising tide that aims to float all boats?

 

Questionable analyses

The discussion of the problem of reciprocity is not the only place in the proposal where the analysis lacks sufficient depth. I came across several instances of selective or contradictory presentation of the evidence.

In discussing the costs of gold OA, Price and Chaytor write that “Extensive economic modelling – in a report funded by Jisc Collections and published by the Open Access Implementation Group – suggests green, rather than gold, open access is the cheapest option for universities.” The following sentences provide the important clarification that this analysis by Alma Swan and John Houghton refers to costs within a transition period from the present day to a fully gold OA publishing landscape. But it is odd that the authors then omit to mention another, equally important study by Swan and Houghton, which predicted that gold OA would ultimately enable a system of research publishing that was cheaper than present arrangements.

In addressing the problem of ‘double-dipping’ – whereby publishers of subscription journals that also carry OA papers (the so-called hybrid OA option) are effectively remunerated twice for the same publication – Price and Chaytor write that “this issue is being successfully addressed by Jisc Collections through negotiations with publishers for offsets.” This is a confusingly optimistic assertion given that they go on to concede that “not all publishers have yet engaged with this process” (indeed, Elsevier refuses to admit that double-dipping occurs) and therefore cannot predict whether offsets will have any impact.

The proposal rightly decries the limitations on access to research “in an age when 78% of properties are able to receive superfast broadband and some 90% of the population are online”. And yet it goes on to cite the publisher-led Access to Research initiative as an exemplar of “the logic of a national licence concept”. There’s a certain lack of logic here which isn’t explored in the proposal. The Access to Research initiative insists that users leave their homes and offices and travel to local libraries to access research via dedicated computer terminals. The terms and conditions are severe and debilitating: users are not allowed to download or make digital copies of the research that they access and must promise only to use it for private study or non-commercial research. This is hardly a template for leveraging access for a connected nation, nor for enabling its SMEs to access research. To be fair, Price and Chaytor are at least proposing a system of access that would surmount the conceptual failings of Access to Research. However, publishers’ belief that library-based access to research is an effective solution in the 21st century illuminates an instinct for control that still rubs abrasively against the opportunities of the wired world. A national licence, if it were ever to materialise, wouldn’t come cheap.

Arguably, by highlighting the minor issues above I am missing the central thrust of the case for a national licence. Perhaps so, but the presence of these faulty links in the chain of argument is indicative of a lack of rigour in the construction of that case. This is complex territory. If we are going to have a serious debate about policy, we need a careful consideration of all the relevant details. In any case, more serious problems emerge when one considers how the proposal might be made to work.

 

Technicalities of the proposal

The authors grapple with some of the technical challenges engendered by the proposal in the latter part of the document (p19-23). But while some of the problems are outlined, they are not dealt with effectively or completely.

The proposal envisages a national body being tasked with the job of negotiating on behalf of the stakeholders listed above (universities, SMEs etc.) with all the individual publishing companies. Funding for the licence should come from “a combination of existing sources of central government higher education funding (via Research Councils and the higher education Funding Councils), some allocation of funds currently dedicated to facilitate closer co-operation between industry and academia, the National Institute of Health Research (NIHR) or NHS funding and contributions from business and Innovate UK.” (p15)

The authors make it plain that the negotiations to determine an agreed price are likely to be difficult (p17-19). I think that understates the problem. It is not at all clear that negotiations would be feasible, given the number and diversity of organisations on both sides. How many publishers would the UK have to negotiate with in order to achieve full coverage? What happens if some of them decline to participate? Is there an organisation that can represent the interests of SMEs and negotiate on their behalf? What are the likely costs of implementing an effective security system to restrict access to UK residents only? Will it be possible to agree a fair price for all the various publishers and stakeholders? How often would the price have to be re-negotiated? Given that in the last round of big-deal subscription negotiations RLUK (acting on behalf of leading universities) had to threaten wholesale cancellations of subscriptions to get Elsevier and Wiley-Blackwell to play ball, the prospect of a successful conclusion of a much more complex deal seems remote.

The proposal repeatedly claims that a national licence would save money. On page 19 it is stated that “The introduction of a national licence is likely to deliver some efficiency and cost savings.” However, no attempt has been made to estimate the costs of bringing in a licence, or the savings that are predicted to accrue. That is probably sensible, given the difficulty of the task and the great uncertainty as to whether it is even possible, but it hardly bolsters the case to repeat the claim three times in the document without any serious evaluation of its substance. The best the authors can do is ask for more work to establish a robust cost-benefit analysis (p20).

There are other risks too, not discussed in the proposal. A national licence would lock in the advantages currently enjoyed by subscription publishers, who would presumably seek to defend price points that earn profit margins in excess of 30%. It would stifle the burgeoning market in open access journals by locking up funding in the biggest subscription deal ever imagined. It comes as no surprise that the Publishers Association’s CEO, Richard Mollet, is thanked “for doing much of the early development” on the proposal for a national licence.

Price and Chaytor claim that a national licence represents an opportunity for the UK to show leadership on research dissemination (p8) and to transition to full gold open access (p23), but no details are given on how the proposal would achieve these goals. In my view the opposite is likely to be true: a national licence would in fact hinder the development of world-wide open access. If it were successfully implemented, UK researchers with access to the world’s research via a national licence would start to wonder why they should bother to make their own research open access, either through green or gold routes. RCUK and HEFCE would come under pressure to terminate their OA policies so as to save on the unnecessary costs of paying APCs or running repositories. Such a prospect seems to have been envisaged by Price and Chaytor, since they argue that a national licence would save money by “removing what is effectively a subsidy for other countries to access UK research output” (p19). Far from exhibiting leadership, a national licence would see the UK withdrawing from the supra-national community that has developed a global vision for access to research.

And that is the most dispiriting thing about this proposal. It comes across as a rear-guard action that is out of tune with the times. Mike Taylor put it more pithily: “It’s not open access by any existing definition of the term.” Even David Willetts warned publishers that, in the digital age, seeking to defend existing models was the “wrong battle to fight”. The proposal for a national licence serves only to highlight the failure of the subscription model to address current needs for rapid, free access to research. A national licence is an idea that sees the UK hunkering down to protect its own interests at a time when people across the world are working on an international licence to enable research access for everyone.

 

Counter-proposals

The proposal for a UK national licence at least has the merit of refocusing thinking on some of the difficulties with OA policies, which are by no means problem-free. The costs of the present UK policy need to be monitored and brought down wherever possible. Publishers can help by eliminating double-dipping, by formulating plans to use hybrid OA funds to flip subscription journals to OA, by ensuring that papers are made OA when APCs are paid and by working with universities to smooth the implementation of the HEFCE policy. Academics can help too by stepping up to the responsibilities that come with public funding – rapid dissemination of their results at a value-for-money price – and by addressing the deep-seated cultural problems that have arisen through the linkage of assessment with journal impact factors.

None of this is easy, so it is important to subject the whole process of improving access to continual and informed debate – especially since the goal of free access within and beyond the research community is such a desirable one. I hope that might continue in the comments beneath this post.

In his foreword Hillman challenges anyone who disagrees with the idea of a national licence “to propose other ways to ensure the UK continues to punch above its weight in both academic research and academic publishing.” I would like to try to meet that challenge. As a UK-based researcher I am keen to ensure that Britain continues to perform at a world-class level – indeed I have campaigned to make the case for public investment in research as part of Science is Vital and CaSE. I also want to maintain a healthy academic publishing industry, but one that thrives on competition to ensure quality of service and value for money for researchers.

I believe that the best way to achieve this is to promote the world-wide OA project that is already in train. The UK showed bold leadership on OA in the wake of the Finch report by announcing a gold-favouring policy. That may not have triggered many imitators, perhaps not surprisingly in the midst of a global economic crisis, but it has certainly helped to propel discussion on the topic across the world. The sooner we can get to a fully OA world, the better it will be for the UK economy, which is already in a strong position to absorb and make use of research information released from behind paywalls (though no doubt more could be done to bolster innovation policy). We should therefore seek to maintain UK leadership in the OA project. To that end, my specific proposals are as follows:

  • The UK should increase investment in R&D to reverse the decline that has occurred over the last parliament and to maintain the research infrastructure and absorptive capacity needed to develop as a knowledge-based economy.
  • To stimulate the market in OA journals, the UK should follow Norway in preventing funds from being used to pay APCs for hybrid OA (which has been demonstrated to be substantially more expensive than pure OA).
  • The business of publication of research in particular journals needs to be decoupled from the business of research and researcher assessment. Journals have become the de facto locus of competition between researchers for prestige and funding. While there is evidently some value in journal selectivity helping to bring attention to research results, which acts as a stimulus to researchers to do their best work, there are also significant costs associated with the pernicious practice of journal-based assessment. It degrades the assessment process. It promotes fraud. It slows down the dissemination of results as researchers regularly work their way down the ladder of journal prestige, submitting and re-submitting their manuscripts in search of the best venue that will have them. If we can figure out a post-publication mechanism for rewarding research quality that is not based on journal brand or prestige, we could accelerate publication and reduce costs (since highly selective journals charge higher APCs). I don’t for a moment underestimate the cultural and economic challenges that this idea presents to academics and publishers but if we value effective open access (and public confidence in the research enterprise), we need to try. As a first step, research funders should incentivise UK HEIs to sign up to the San Francisco Declaration on Research Assessment – or any equivalent statement of principles.
  • Part of the present difficulty is that researchers have been shielded from publication costs because subscriptions were negotiated by university librarians and sometimes hidden by confidentiality agreements imposed by publishers. As current policy is enacted, measures should therefore be taken to ensure that researchers are exposed to cost-benefit decisions in choosing where to publish publicly-funded research. This will foster healthy competition on price and quality of service within the UK.
  • The UK government should maintain a prominent role in working with the international community to ensure that effective and workable OA mandates are instituted globally. It can do that by continuing to promote OA within the UK since it is best to lead by example.

Update (14 Apr 2015): To keep track of all commentary on this proposal, I will list here the blog posts that have discussed it:

  1. Mike Taylor (1 Apr): Heaven protect us from a “UK national licence”
  2. David Kernohan (2 Apr): A local licence for Henley (a response to @HEPI_news)
  3. Adam Tickell, Michael Jubb (12 Apr): A national licence would set back the Open Access cause

 


Open letter to the Publishers Association: please amend your open access decision tree

Dear Publishers Association

I ask that you amend the open access decision tree you created for incorporation into the guidance notes accompanying the Open Access (OA) policy announced by Research Councils UK (RCUK) in 2013. It may seem odd to ask for a correction so late in the day, but my reasons for doing so are twofold.

First, the Publishers Association (PA) decision tree has been problematic from the outset because it does not properly represent the RCUK OA policy. In particular, it suggests that if authors have access to funds from the RCUK to pay publishers’ article processing charges, they are required to publish by the gold OA route (see diagram below). This contradicts the RCUK policy and guidance (PDF), which states that “the choice of route to Open Access remains with the researchers and their research organisations” (see page 6).

PA OA Decision tree - annotated

I hoped that this message would have become clearly established in the past two years and that the faulty PA decision tree might therefore have fallen into disuse. However, this appears not to be the case since Gemma Hersh, a policy director at Elsevier, referred to it last week on Twitter as ‘the crucial tree underpinning RCUK’s policy’. When I queried the accuracy of this statement in light of the fact that the tree obscures the choice accorded by RCUK policy to authors, she was emphatic in defending the view that “it’s how the policy works in practice”. It is a matter of some concern that some publishers are spreading information about the RCUK OA policy that is not completely correct.

The second motive is the publication last week of the report of the first review of the RCUK’s open access policy (available as a PDF), which was chaired by Professor Sir Bob Burgess and also had the PA’s chief executive, Richard Mollet, as a member. As you will be aware, this review has made two recommendations that are important to the matter in hand.

The first (2.1) is that:

Further attention to communications surrounding the RCUK policy, in dialogue with the research communities, publishers and HEIs would help ease confusion and generate better awareness of the expectations of the policy.

I’m sure you agree this is sensible. Indeed, I am glad to see that the PA has highlighted the review’s point about the need for clarity in communications in its own summary of the review. As everyone who has worked on open access is aware, the policy landscape is complex. It is vital that messages to researchers are free from confusion.

The second recommendation (2.4) is that:

In communication during the transition period, the mixed model approach to open access is promoted to ensure that researchers are aware that they have a choice of how to publish.

The mixed model – that gold and green routes are both open and that the choice of which route to take is down to authors – is a central plank of the RCUK policy, but this is not communicated by the original version of the PA decision tree.

To help clarify matters, I have taken the liberty of creating a modified version of your decision tree that incorporates the requisite element of author choice (see below – a PowerPoint version can be downloaded here). I ask that it be used in place of the original, erroneous diagram and would be grateful if you could share it with your members.

PA OA Decision tree - revised
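
For readers who find flowcharts hard to parse, the logic of the revised tree can also be expressed in a few lines of code. The Python sketch below is purely schematic – the function name, parameters and embargo default are my own illustrative assumptions, not an official encoding of RCUK policy – but it captures the point of the correction: having APC funds opens the gold route without closing the green one.

# Schematic sketch only: names and defaults are illustrative assumptions,
# not an official encoding of RCUK policy.

def compliant_routes(apc_funds_available, journal_offers_gold,
                     embargo_months, max_embargo_months=24):
    """List the OA routes open to an author under the revised tree.

    The correction matters here: available APC funds ADD the gold
    option; they do not remove the author's choice of the green route.
    """
    routes = []
    if journal_offers_gold and apc_funds_available:
        routes.append("gold: immediate OA in the journal, APC paid")
    if embargo_months <= max_embargo_months:
        routes.append("green: deposit in a repository, embargo permitted")
    return routes

# An author with APC funds still has both routes to choose between.
print(compliant_routes(apc_funds_available=True, journal_offers_gold=True,
                       embargo_months=12))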

If you disagree with my interpretation of RCUK OA policy, I would be happy to discuss.

Yours faithfully,

Stephen Curry

 

Update (00:12, 31-3-15): I have made one further adjustment to the tree to reflect the fact that Medical Research Council embargo periods are restricted to 6 months. Thanks to @GeraldineCS for pointing this out.
