Combining preprints and post-publication peer review: a new (big) deal?

Stimulated, I believe, by Ron Vale’s call for preprints last year, various luminaries from the world of science and science publishing will be gathering in Maryland at the headquarters of the Howard Hughes Medical Institute (HHMI) later this month to discuss the way forward.

The meeting, called Accelerating Science and Publication in Biology – ASAPbio for short – aims to focus discussion on:

preprints and the role that they might play in catalysing scientific discovery, facilitating career advancement, and improving the culture of communication within the biology community.

Admirably, the organisers are hoping to get beyond just talking about the well-known problems of scientific publishing:

The meeting will identify actionable next steps that emerge around areas of consensus, and the organizing committee and other interested participants will be involved in subsequent follow-through.

So it’s a serious affair. And the work has already started. In advance of the meeting, Mike Eisen and Leslie Vosshall have uploaded a commentary proposing a mechanism for coupling preprints and post-publication peer review. It’s short and to-the-point and well worth reading. The central feature of their proposed system is that authors would post preprints that could then be peer-reviewed along two different tracks:

Track 1: Organized review in which groups, such as scientific societies or self-assembling sets of researchers, representing fields or areas of interest arrange for the review of papers they believe to be relevant to researchers in their field. They could either directly solicit reviewers or invite members of their group to submit reviews, and would publish the results of these reviews in a standardized format. These groups would be evaluated by a coalition of funding agencies, libraries, universities, and other parties according to a set of commonly agreed upon standards, akin to the screening that is done for traditional journals at PubMed.

Track 2: Individually submitted reviews from anyone who has read the paper. These reviews would use the same format as organized reviews, and would become part of the permanent record of the paper. Ideally, we want everyone who reads a paper carefully to offer their view of its validity, audience, and impact. To ensure that the system is not corrupted, individually submitted reviews would be screened for appropriateness, conflicts of interest, and other problems, and there would be mechanisms to adjudicate complaints about submitted reviews.

Authors would have the ability at any time to respond to reviews and to submit revised versions of their manuscript.

This is an interesting and provocative piece of work but I have some questions that I would like to lob into the discussion.

  1. Why would scientific societies, many of which have healthy income streams from journal publishing, contribute to a system that would lead to their demise if adopted widely? Could one create incentives for them to participate or should they be sacrificed for the greater good of research?
  2. Who forms the “coalition” mentioned in the proposal that has the task of approving review groups? This coalition needs to be authoritative for the system to work, but universities are every bit as invested in the current journal system (and JIFs) as researchers. And funding agencies are reluctant to dictate to researchers the routes through which they may publish.
  3. This new scheme does not guarantee that peer review will occur. Under the present system all competent researchers can get their work reviewed – and be reasonably assured that it will be published. What would tempt them away from journals if the risks of not being reviewed were perceived as tangible?

Journals such as Atmospheric Chemistry and Physics and F1000Research are using more formal types of post-publication peer review – indeed Vitek Tracz and Rebecca Lawrence of F1000 outline its principal features in another ASAPbio commentary (also worth a read). What they offer is a guarantee that review will be conducted and that is something that I think matters to most researchers. I wonder whether these new journal formats might be a more attractive stepping stone away from the present JIF-infected dispensation.

I would be interested to hear others’ responses to both the commentaries and my questions above. It’s great to see meetings like this taking place and I very much hope that a set of actionable points will emerge.

The meeting will be streamed online to enable as many people as possible to follow proceedings, though I hope some of the attendees will take it upon themselves to write pithy summaries.

Related posts:

Pre-prints: Just do it? (Reciprocal Space)

Peer review, pre-prints and the speed of science (The Guardian)

Posted in Academic publishing, Open Access | 2 Comments

Open access and public engagement: I need your help

Dear Reader,

I would appreciate your help.

I am working on a chapter for a book on openness within science (to be published by Manchester University Press). The book is part of the ‘Making Science Public’ programme run by Prof Brigitte Nerlich at Nottingham University and aims to take a critical look at the dilemmas of open science. In my chapter I want to explore how open access publishing has impacted the relationship between the public (in its various forms and groupings) and researchers.

As someone who has supported open access from within the life sciences, I have often expressed the hope that increasing the volume of the research literature accessible to the public might stimulate a public-side demand for more information from researchers, or clearer reports (e.g. lay summaries) – or might in other ways change the dynamic of interaction with the ‘experts’. I’m interested in the question of whether open access has empowered members of the public (broadly defined) in any significant way.

This is likely to be a minority interest among the general public but I imagine that there are special interest groups – patient organisations and campaign groups among them – that have a strong interest in the research literature.

I would very much like to find specific examples. I have made a number of directed enquiries but then thought I would use my blog to widen the net, particularly since the deadline is – what’s the word? – impending.

So, does anyone out there know of cases where open access to the research literature has stimulated contact with the research community? I would be particularly interested in instances where the contact has altered the course of research or clinical practice or public policy.

I would be grateful for all comments and suggestions. Please feel free to comment below or email me via the link on the left hand side of my university web page.

Many thanks,

Stephen

 

Posted in Academic publishing, Science | 10 Comments

Anatomy of a blog post on the anatomy of a scientific discovery

At the risk of getting uber-meta, here is a blog post about writing my latest blog post at the Guardian. This was an account of a scientific discovery, albeit a minor one, that occurred during the process of shepherding the latest paper from my lab to publication.

Why write about writing this post? Because maybe it will help others, and maybe it will help me to think it through.

I should know better by now but I underestimated how hard it would be. To tell the story, my blog post had to dig into the molecular details of our analysis of the mechanism of the initiation of translation of the RNA genome that is delivered to infected cells by the norovirus. But, as I have discovered when tackling molecular topics in the past, you can’t start digging until the ground is prepared, and all along the way you have to keep stopping to explain this or that piece of molecular jargon. There is a constant battle between narrative momentum and the desire to keep the reader in the picture, without insulting their intelligence.

Writing about DNA is a piece of cake compared to writing about proteins. It is safe to assume that most readers have an image in their mind’s eye of the double-helix and a grasp of the idea that it contains coded instructions written in a sequence of bases. It doesn’t really matter if they don’t know what bases are. Most, I believe, are aware that there are four of them: A, C, G and T. Proteins are more complex and sprout jargon from every feature – the peptide bonds that string their constituent amino acids into a polypeptide chain, which starts with an N terminus and ends in a C terminus and folds up into a three-dimensional shape stabilised by non-covalent interactions of various types. And pretty much no-one has ever heard of any of this.

In the present piece I had to dish up all the detail on protein composition and structure before getting into the nuts and bolts of the interaction between norovirus and the protein synthesis machinery of the cell that was central to my tale. This was no picnic, especially since the main actors all had awkward and forgettable names. Ladies and gentlemen, please welcome to the stage VPg, eIF4G, NS6 and – everyone’s favourite – the HEAT-1 domain*.

The first step was the first draft. I have learned just to power through this, whatever the quality. Get the story down on the page and work from there. So that’s what I did. Because the paper I was describing was still fresh in my mind and because of my desire – or is it the instinct or bad habit of the scientist? – to immerse the reader in the flavours and smells of the laboratory, I suspected I might have overdone things. It felt good to have knocked out a draft but I had my doubts and took to Twitter to express them, which provoked a telling reply from physicist Helen Czerski.

Helen is a scientist who is not a structural biologist, so I thought I’d exploit the contact by asking her to read my draft. She was kind enough to agree.

And honest enough to tell me where I was going wrong: too much detail and too many acronyms that were getting in the way and likely to induce the general reader to bail out well before the end.

I took her advice and hacked at the piece, clearing out as much extraneous detail and jargon as I could. Or so I thought.

Not wanting to trouble Helen again I sent version 2 to Jenny Rohn, my co-conspirator here at Occam’s Typewriter and the Guardian. She’s a good editor, with an eye for telling detail and deviations from the rule of “show, don’t tell”. Her annotated version was full of helpful cuts, insertions and comments.

Following Jenny’s suggestions I re-wrote the start and end of the piece and clawed out some more unnecessary detail. I also added figures since I could not find a way to paint pictures with words alone. A failing perhaps, but when operating at the molecular level with engineered proteins that have no counterparts in everyday life there seems to be little alternative. Figures would hopefully provide support for the reader. At the very least, they would break up the text to make the piece seem less formidable. To counter the risk that they would give it the look of a textbook I labelled the images using a font that resembled handwriting.

By this stage I was at version 4 and asked my wife, a non-scientist, to see what she made of it. Most of the problems had by then been ironed out but she picked up one or two issues with sequencing (especially in the paragraph describing protein synthesis from RNA by the ribosome). She questioned the use of “complex” (to describe a cluster of proteins) and was unsure about “precursor”. In the end I got rid of “complex” but felt that an interested reader could make an educated guess about “precursor”.

There was a final polish – I forced myself late in the day to read the post out loud to myself – and then I published.

Am I pleased with the final product? I’m not displeased, and some readers have left approving comments, but I still think I could have done better. I’m no Horace Judson, even if I might aspire in that direction. By the end I was bored of the piece. Fatigued. It would probably have been a good idea to leave it for a few days and then return afresh. The editorial assistance was a huge help but molecular material requires a level of devotion to make it come alive that I did not have time for on this occasion. But I have tried before and this stuff gnaws at me. It is a world worth exploring in words, so no doubt I will try again.

 

*Parenthetically, HEAT is officially the worst acronym ever. It stands for “Huntingtin, elongation factor 3 (EF3), protein phosphatase 2A (PP2A), and the yeast kinase TOR1”. I kid you not.

Posted in Blogging, Protein Crystallography, Science | Comments Off on Anatomy of a blog post on the anatomy of a scientific discovery

ORCIDs set to bloom in 2016

Have you got an ORCID identifier yet? You should. They’re on the rise – and for good reason.

An ORCID iD is a number (mine is 0000-0002-0552-8870) that unambiguously and persistently identifies you in the digital world of academia. It ensures that your research activities, outputs, and affiliations can be easily and correctly connected to you. They are currently used by over 200 research and workflow platforms to identify and connect researchers with their grants and papers, at universities and research institutions, at funding agencies, and at publishers.
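For the more technically minded, the simplest way to see what an iD makes possible is to pull a public ORCID record programmatically. Here is a minimal Python sketch – and only a sketch: it assumes that the public API at pub.orcid.org (version 3.0) still accepts unauthenticated JSON requests and that the field names are as I remember them, so do check the current ORCID documentation before leaning on it.

```python
# Minimal sketch: fetch the public ORCID record for a given iD.
# Assumes the v3.0 public API at pub.orcid.org accepts unauthenticated JSON
# requests and that the JSON field names below are current - verify first.
import requests

ORCID_ID = "0000-0002-0552-8870"  # the iD mentioned above

resp = requests.get(
    f"https://pub.orcid.org/v3.0/{ORCID_ID}/record",
    headers={"Accept": "application/json"},
    timeout=30,
)
resp.raise_for_status()
record = resp.json()

# Print the public name and the number of works attached to the record.
name = record["person"]["name"]
print(name["given-names"]["value"], name["family-name"]["value"])
works = record.get("activities-summary", {}).get("works", {}).get("group", [])
print("Works listed:", len(works))
```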

ORCID - SCurry

Around the world over 1.8 million researchers have registered for an iD, many in the hope that it will enhance their digital discoverability and reduce their reporting paperwork. Several funders have started to require ORCID iDs as part of the grant proposal process. In the UK, the Wellcome Trust and the NIHR both do, and ORCID iDs are being integrated with Researchfish, the system used by the Research Councils to track grant outputs.

Universities are getting in on the act too. In 2014 my own institution, Imperial College, created ORCID iDs for every member of staff who didn’t already have one, unless they opted out. Very few did so.

The number of publishers using ORCIDs is also on the rise. Today a group of eight publishers have announced that, beginning in 2016, they will require authors to use an ORCID identifier (iD) during the publication process. These are AAAS (publishers of the Science stable of journals), American Geophysical Union (AGU), eLife, EMBO, Hindawi, the Institute of Electrical & Electronics Engineers (IEEE), the Public Library of Science (PLOS) and the Royal Society. The Royal Society has been quickest off the mark, making ORCID iDs a requirement for authors as of new year’s day. The rest will follow suit at various dates throughout 2016.

With luck, this move will soon spur other publishers to join in.

In a digital world ORCID iDs make a great deal of sense. Their adoption by institutions and publishers fulfils two of the recommendations made in The Metric Tide, last year’s report of the Independent Review of the Role of Metrics in Research Assessment and Management (of which I was a co-author). If we are going to track outputs, we might as well do it systematically and efficiently. I look forward to the day when interacting with Researchfish will be trivial rather than tedious, as at present. In theory, ORCID iDs might even reduce some of the burden of the ever-unpopular Research Excellence Framework (REF) – though of course we shall have to await the outcome of the government’s re-jigging of HEFCE and the Research Councils before the contours of the next REF become clear. I wouldn’t hold my breath.

Of course, by automating the digital tracking of inputs and outputs, ORCID iDs raise the risks associated with unthinking uses of metrics – something that The Metric Tide was keen to warn against. On that front we need to remain vigilant. But on a personal level, most of us want to be recognised for our work and have an interest in making sure that our published outputs are recorded accurately. The ORCID system also provides a handy way of keeping track of your published work. Thanks to the good offices of Europe PubMed Central, you can use your ORCID iD to follow the open citations to your work. Here, in the interests of transparency, are mine: https://europepmc.org/authors/0000-0002-0552-8870. The profile is perhaps not as complete as that provided through Google Scholar but it is at least open.
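If you would rather script that lookup than browse it, the Europe PMC REST service can be searched by ORCID iD. The sketch below is hedged in the same way: the search endpoint, the AUTHORID query field and the JSON keys (title, pubYear, citedByCount) are as I recall them from the Europe PMC documentation, so treat it as a starting point rather than gospel.

```python
# Hedged sketch: list the papers (and open citation counts) that Europe PMC
# associates with an ORCID iD via its REST search service. The endpoint,
# query syntax and JSON keys are from memory - check the current docs.
import requests

ORCID_ID = "0000-0002-0552-8870"

resp = requests.get(
    "https://www.ebi.ac.uk/europepmc/webservices/rest/search",
    params={
        "query": f'AUTHORID:"{ORCID_ID}"',  # search by ORCID iD
        "format": "json",
        "pageSize": 100,
    },
    timeout=30,
)
resp.raise_for_status()

# One line per paper: year, times cited (open citations), title.
for paper in resp.json()["resultList"]["result"]:
    print(paper.get("pubYear"), paper.get("citedByCount"), paper.get("title"))
```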

If you want an ORCID iD of your own, simply sign up and use the tools to identify yourself and your work (papers, conference proceedings, patents – anything). You can also add your grants, and your education & employment history. For all of the information entered, it’s up to you how much to make publicly visible.

Update (2015-01-07, 14:02): This post was modified to mention the fact that the AAAS will also be requiring ORCID iDs in 2016.

Posted in Academic publishing | 11 Comments

Henry Gee’s top ten reads of 2015

At the tail end of 2015 I reviewed the 23 books that had entertained and enlightened me over the course of the year. My friend Henry Gee, formerly of this parish, managed nearly twice that number. In a guest post full of his characteristic wit, he lists his top ten. 

On going through the list of 41 books read this year, I was amazed to come across titles I’d completely forgotten I’d read (which single fact shows the benefit of keeping a notebook.) Some of these books were terrific, by super authors – but they seem to have gone down without touching the sides. Others were old favourites, though most were new (to me, anyway.) Two, including Nick Lane’s ‘The Vital Question’, were sent to me to review by The Literary Review. This is such a fine magazine that I bought my father (a big reader) a subscription. It must be good. After all, they ask me to write things for them.

So here, without further ado, is my shortlist of those titles that in my opinion stand out from the crowd, in that they managed to be enjoyable at the time and are likely to remain long in the memory. All of these were new to me. I have counted them down from ten, though of course such a ranking is pretty rough, especially as the books cover a wealth of subjects and styles.

10. Neil MacGregor – A History Of The World In 100 Objects
The chief panjandrum of the British Museum uses objects from that venerable Institution’s vast collections to tell the history of human culture from the earliest times to the present day. It’s the kind of book you feel you should dip into, but when you start you just can’t stop, and end up reading it like a novel, wondering what kind of object awaits you beyond the next page. The fascinating histories of the individual objects and the wider context allow one to forgive a tendency to preachy political correctness here and there.

9. Nick Lane – The Vital Question
The insides of living cells, whether from bacteria or human beings, look remarkably similar, right down to the level of molecules. It’s something we tend to take for granted, but the reason why is perhaps the single most important unanswered question in biology. Scientist Nick Lane goes into the reasons, and in so doing pulls out enthralling conjectures and perhaps unstoppable hypotheses. That he does so with a brio that threatens to boil over is part of the book’s charm, although it repays very careful reading even for those who have a scientific background. An important book.

8. Steve Silberman – Neurotribes
The fact that I have a daughter diagnosed with Asperger’s (notwithstanding inasmuch as which I score well over 30 on Simon Baron-Cohen’s Asperg-o-meter) naturally drew me to this comprehensive history of Asperger’s and Autism, from Hans Asperger’s ‘Little Professors’ in pre-war Vienna right up to modern-day SF fandom. I knew quite a lot of this already, but before this book nothing I’d read pulled it all together. It’s not Silberman’s fault that I find his particular way with words somewhat grating (he is a journalist for the US magazine Wired and writes hipster bloggy-style) but I soon overcame this. What struck me most about the subject was how much of what we know about mental illness comes from Jewish doctors who fled the Nazis and flourished in the US – and also how such history, as much as medical knowledge, has shaped our appreciation of what constitutes personality traits and mental health.

7. David Adam – The Man Who Couldn’t Stop
A brave and frank account of a crippling affliction by – as it happens – a colleague of mine at Nature. Adam suffers from Obsessive Compulsive Disorder, and in this short book he tells his own story interspersed with what we know about its genesis and the various ways that mental illness is classified. Adam is now doing well after a long battle, but the wider story is not so happy, for it is apparent (and I know this from my own experiences with depression) that the treatment of mental illness has barely advanced beyond the leeches-and-bloodletting stage.

6. Ben Elton – The First Casualty
I first came across Ben Elton back in the 1980s when he was a comedian, and some of his novels (especially ‘Gridlock’) are side-splittingly funny. ‘The First Casualty’ is very much darker. It concerns a somewhat priggish and sanctimonious policeman, who, in the First World War, finds that a strict regard for logic and truth leaves him the wrong side of the law. Jailed as a conscientious objector, he ends up released to investigate a case of murder on the Western Front in the hell of Passchendaele in 1917. The novel brings the horror of trench warfare to life like no other book I have ever read, the aim being to raise questions of what constitutes murder, when people are being blown to bits in the cause of war. The novel stays just the right side of preachy, and is an engrossing and terrifying read.

5. Robert Harris – An Officer And A Spy
A retelling of the Dreyfus Affair as the imagined memoir of one of the (real) people who was there at the time. Almost all the characters really existed, and the events happened more or less as stated. Harris is one of those thriller writers who really does their research and yet wears it lightly, turning the story into a terrific and memorable read. His best since ‘Pompeii’.

4. Virgil (trans. W. F. Jackson Knight) – The Aeneid
I’ve been revisiting the classics this year – Gilgamesh, the Iliad, the Odyssey – so one of these really had to make the cut. I loved Homer, but Virgil (in this English prose translation) was a real surprise. What stands out is the fusion of lush, lyrical phrasing with what – when it comes down to it – is unremitting and graphic carnage. Virgil took Homeric ingredients and polished them to an even more lustrous sheen. Those who criticize as inauthentic a movie such as ‘300’ – a highly stylized rendering of the Battle of Thermopylae – fail to understand that it pays much greater homage to the source material than people might imagine.

3. Wilkie Collins – The Woman In White
I read this after reading a modern novel cast as a memoir by Wilkie Collins (Dan Simmons’ ‘Drood’ – see below.) When I did, I could only ask myself where this Victorian ‘sensation’ novelist and friend of Charles Dickens had been all my life. True, a long time ago I had started ‘The Moonstone’ but hadn’t got far. Perhaps now I am old and wizened enough to appreciate this mixture of terrific writing, trashy melodrama, highly contrived whodunit and gothic styling. The story is a rather ropey mystery but what stand out are the characters (the creepy Count Fosco is the best.) After reading this you realize that Collins was something of a pioneer. ‘The Woman In White’ was a hit when it was first published in 1860. Writers of mystery with a yen for the gothic – from Arthur Conan Doyle to Agatha Christie to Daphne du Maurier – are in Collins’ debt.

2. Charles Dickens – The Pickwick Papers
When people say that they don’t much like Dickens my usual response is that I never get invited to that kind of party. Like most these days I had only ever read Dickens because his novels featured on school examination syllabi. I read parts of ‘Great Expectations’ as a schoolboy; ‘Hard Times’ for my own A-levels; and picked up ‘Oliver Twist’ as it was a set book for my sister’s O-levels. Only later did I realize how much I had enjoyed these, so after having read Dan Simmons’ ‘Drood’ and Wilkie Collins’ ‘The Woman In White’ (see elsewhere in this essay) I felt I needed a Dickensian education. ‘The Pickwick Papers’ was Dickens’ first novel, a rollicking portrait of England in the 1820s, before Victoria and most of all before the railways completely changed the face of England, when people got around by stagecoach and stayed at coaching inns. The social background was, for me, quite an education (helped immeasurably by editorial notes from Mark Wormald: mine was the Penguin Classics edition.) The story – it’s more a soap opera – is basically one damn thing after another as Mr Pickwick and his friends get themselves in and out of various comedic scrapes. It starts a bit chaotically and only gets into its stride when we meet Sam Weller, Mr Pickwick’s valet, confidant, source of homespun wisdom and everyday superhero, a cross between Falstaff and Sam Gamgee. The various meetings between Sam and his father are pure comedy gold, like Peter Cook and Dudley Moore, Ronnie Barker and Ronnie Corbett – or Johnny Depp’s Jack Sparrow and Keith Richards in the ‘Pirates of the Caribbean’ movies. Amazing to think that Dickens wrote it when he was twenty-four. With little of the tub-thumping social crusading of his later works, Pickwick is pure enjoyment.

1. Dan Simmons – Drood
Another tale – qua ‘The Dreyfus Affair’ – done as an imagined memoir of real characters and events. This time the narrator is the novelist, hypochondriac and opium addict Wilkie Collins, and the tale is of the last years of his friend Charles Dickens. Brilliantly researched, it manages to be playful and fantastical within the confines of history. Simmons uses Collins’ opium addiction to give the fabric of reality a thorough work-out: reminiscent of Peter Shaffer’s treatment of Mozart and Salieri in his play ‘Amadeus’. Only more gothic. Lots more gothic. (I do LOVE gothic.) This novel got me back into the classics – I read ‘The Woman In White’ immediately afterwards, and ‘The Pickwick Papers’ soon after (though I have yet to tackle Dickens’ last, unfinished novel ‘The Mystery of Edwin Drood’.) I award ‘Drood’ the enviable accolade of my Read of the Year.

 

Posted in Book Review, Science & Art | 3 Comments

ICYMI No.1: Preprints for biologists

Since I have developed a habit of writing elsewhere, which necessarily takes time and words away from the blog here at Reciprocal Space, I thought I would try to make amends by developing the habit of linking to the pieces that appear in other corners of the internet. 

To kick off, therefore, permit me to alert you to a short article published this week in The Biologist, the house magazine of the Royal Society of Biology. The piece – The power of preprint – follows from an earlier article on this topic in the Guardian and reprises the call, by myself and others, for more of us in the life sciences to adopt the practice of publishing our research quickly in preprint form. It’s worked in many branches of physics, maths and computer science for many a long year and I see no reason that we should deny ourselves the benefits of preprints in other research disciplines. And nor does Ron Vale, as I mentioned here a few months back.

Posted in ICYMI, Open Access, Science | Comments Off on ICYMI No.1: Preprints for biologists

Jolly good fellows: Royal Society publishes journal citation distributions

Full marks and a side order of brownie points for the Royal Society: they have started publishing the citation distributions for all their journals.

This might seem like an unusual and rather technical move to celebrate but it matters. It will help to lift the heavy stone of the journal impact factor that has been squeezing the life out of the body scientific. The Royal Society has now joined the EMBO Journal in committing to be more transparent about the origins of this dubious and troubling metric.

I don’t wish to rehearse the details in full since I have previously described the pernicious effects of scientists’ and publishers’ obsession with journal impact factors and the value of making citation distributions available. But in brief: I hope that the ready availability of these distributions – to show the skew and variation of the citation performance of the papers in any journal – will enable researchers to develop a more sophisticated approach to the evaluation of the work of our peers.

The image below shows the citation distribution for the Royal Society’s Proceedings B journal – you can find the original by clicking on the ‘Citation Metrics’ link in the ‘About Us’ tab. As is the case for every academic journal, it shows that the impact factor, an arithmetic mean, is an indicator that over-estimates typical performance and conceals a huge range in citation counts.

[Figure: citation distribution for Proceedings B]

As I wrote back in June:

…the IF is a poor discriminator between journals and a dreadful one for papers. Publishing citation distributions therefore directs the attention of anyone who cares about doing evaluation properly back where it belongs: to the work itself.

So three cheers for the Royal Society for having the courage to be so open with these data!

Now: who’s next?

I have been discussing the idea of making citation distributions available with a number of other publishers and have heard some encouraging noises. It seems likely that there will be further significant moves in this direction in the new year, which I will be happy to report.

I dare to be optimistic that before too long the practice may become widespread, and that we may have at our disposal a tool that helps us to do a better job of assessing research and researchers. This is by no means a revolution and we all know that old habits die hard. Even so, this is a step in the right direction and I will take what I can get.

Update (05/12/15, 00:55): Better late than never, but I really should have thanked the Royal Society’s publishing director, Dr Stuart Taylor, for taking this initiative forward.

Update (05/12/15, 01:05): Well I didn’t have to wait long to find out which would be the next journal to join the club. Nature Chemistry’s editor Dr Stuart Cantrill crunched the numbers, posted the distribution and has written up his analysis on the Sceptical Chymist blog. He tells me there’ll be a link to his post from the journal homepage when it’s next updated.

Update (11/12/15, 11:26): Stuart Cantrill has clearly caught the citation distribution bug. He’s now also done a very nice comparative analysis of a selection of chemistry journals. It shows, as expected, that the distributions are all approximately of the same shape and helps to reinforce the message that journals with low impact factors still publish papers that attract large numbers of citations.

Posted in Open Access, Science | 4 Comments

Structural Biology: a beginner’s guide?

I got impatient waiting for my latest review article to come out, so here it is. The scheduled publication date has slipped twice now without the publisher getting in touch to explain why. The latest I’ve heard, after querying the editor who commissioned the piece, is that it will be out by the end of the month. But I’ve paid my £500 fee to make the work open access and don’t see any good reason to delay further.

My review, titled ‘Structural Biology: a century-long journey into an unseen world’, is a contribution to an upcoming issue of Interdisciplinary Science Reviews that will commemorate the centenary of the 1915 Nobel prize in physics awarded to William and Lawrence Bragg, the father and son team that first used X-rays to ‘see’ the atomic structure of matter. It traces the developments in structural biology that over the past 100 years – with and beyond X-rays – have revealed to us the fascinating molecular world that lies beneath our senses.

As befits an interdisciplinary journal, I tried to write my review for a general readership, which I hope to broaden further by making it available here. I doubt I have freed myself from all the bonds of the scholarly habit of writing but I hope the article might appeal to the interested amateur. As a taster, here are the opening paragraphs:

When Orville Wright took off in the Flyer on a grey morning in December 1903 and flew for all of 12 seconds across the sands near Kitty Hawk in North Carolina, little could he have suspected that by 1969 powered flight would land Neil Armstrong and Buzz Aldrin on the Moon. Humankind’s first foray onto another world remains for many people one of the greatest technological achievements of the 20th Century. But within the 66 years it took to get from Kitty Hawk to Tranquillity Base another equally remarkable technological – and scientific – journey took place, one that has brought us to a very different destination.

In 1912, in experiments initiated by Max von Laue in Germany and successfully analysed by William and Lawrence Bragg in England, X-rays were first used to peer into the atomic structure of crystalline matter. By the end of the 1950s X-ray crystallography had leapt from physics to chemistry to biology and the atomic architecture of DNA and several proteins had been revealed, giving us the first glimpses of a molecular landscape that was no less surprising and no less strange than the surface of the Moon. It had taken just five decades for structural biology to emerge as a fledgling discipline. In the five that have since elapsed the field has grown vigorously, thanks not only to developments in X ray crystallography but also to the emergence of complementary techniques that have used other physical phenomena to lift the veil on an unseen world – the atomic and molecular matrix of life.

If you want to know more, you can download the PDF (8.6 MB).

Update (11 Dec, 15:46): This article was finally published by Interdisciplinary Science Reviews at the end of November. It’s open access so you can now download the journal-formatted version for free.

 

Posted in Protein Crystallography, Science | 5 Comments

Lunacy and sanity

It’s been less than 24 hours, so this still counts as a timely post.

I guess I had been primed because I had been thinking about it. But although I hadn’t set my alarm I found myself awake at 02:52 on Monday morning – I can still see the digital display – and so I got up, checked out the window that the moon was visible (it was – and already mostly eclipsed), dressed and hurried downstairs, grabbing my camera and binoculars on the way.

To my disappointment I realised after a quick search that my camera tripod had been left at the office and resigned myself to hand-held, or at least fence-supported, photography.

No matter. The point was to enjoy the moment. We drove to France to catch the solar eclipse in 1999, and I rose at dawn on a sunny day in June 2012 to witness the last transit of Venus of my lifetime. Having woken, I was determined to experience the eclipse of the supermoon.

What is it about these instances of celestial mechanics that is so appealing? I think it may be that they lay bare the machinery of the cosmos. They establish a connection that is understood, known – but otherwise invisible. They are special moments of conjunction – not in the astrologers’ sense of fantastical nonsense – but in the sense that reality is (as we too often forget) astonishing and fantastic.

And so, last night, I watched the curve of the Earth slide across the face of the moon. I participated in the geometry of our place in the solar system, an experience made more real, but also more surreal, by the red, angry aspect the moon took on as it descended into full eclipse. For all the world, it might last night have been made of molten iron. In a magical and scientific moment the familiar was made unfamiliar.

But still and all it made perfect sense. The blood moon is no more than the product of Newton’s cosmic clockwork, whatever Einstein might protest. As am I a cog in the wheel of my life, the workings of which I discern less clearly than I might wish. But if every now and then I can apprehend the unfathomable mystery of the heavens by observing its predictable machinations, then perhaps there is also something to hold on to, for the sake of my own sanity.

Posted in Science | Comments Off on Lunacy and sanity

Ch-ch-ch-changes…

There’s a very real chance this could turn out to be an actual blogpost. In the original sense of the word: a web-log of what’s been happening.

Posts have been rather sparse on Reciprocal Space of late. That’s not for a want of words. It’s just that they have been expended elsewhere – over at the Guardian, in pieces about photography, preprints, the latest Science is Vital campaign (please join in) or Nicole Kidman’s performance as Rosalind Franklin in Anna Ziegler’s intelligent Photograph 51; or in the Times Higher writing about my term of office as Director of Undergraduate Studies (DUGS); or the article I wrote on the use and abuse of metrics (PDF) in research evaluation for the Indian journal, Current Science (my first foray onto the sub-continent); or in the latest paper to come from my lab, currently in revision but already available on the bioRxiv.

This dispersion of words away from my original blog home, which passed its seventh anniversary earlier this month by the way, is symptomatic of the calls of scientific duty, but also of the churn of events. Unintended outcomes that flow from the simple fact of having set out in a particular direction. I’m not complaining (or apologising – I said I would never apologise), just observing, though I didn’t intend for things to become so fallow at Reciprocal Space.

These are not the only changes in the offing. Yesterday evening I trudged home through a grey wet veil – Autumn’s warning to Summer that it is time to go. The sense of transition was reinforced this past weekend as we delivered our youngest to university. All three of our children have now flown the nest. My wife and I looked at each other. “What are we going to do now?”

Plenty, we hope. For myself I look forward to having more time for science. As I wrote in the Times Higher, at the end of this month I will step out of the heavy harness I have been wearing as DUGS. Truth be told, even as the contours of the new term take shape on the horizon, I already feel the administrative burden slipping from my shoulders. Not only will I not have to deal with the myriad tasks demanded by that role, but I will be entering a sabbatical year that I hope more than anything will give me a chance to think.

That thinking time is long overdue thanks to the familiar but irregular movements known to anyone absorbed by a life in science. Another research grant has just come to an end, so another postdoc is leaving the lab and moving to pastures new. This is a good move for her but my task now is to refill the funding pot, a far from trivial endeavour. Money’s tight and the noises coming from the government are not encouraging. I need to dive once more into the waves of innovation and discovery. For crystallographers like myself, the spectacular rise of new techniques in cryo-electron microscopy is a challenge, but also an opportunity.

These moments of transition come around again and again. The scientific life is one of motion. The plates shift on the hot mantle. Looking up, I can see that the landscape has changed. But that’s OK: it is something to explore.

 

Posted in Scientific Life | 4 Comments

Pre-prints: just do it?

There is momentum building behind the adoption of pre-print servers in the life sciences. Ron Vale, a professor of cellular and molecular pharmacology at UCSF and Lasker Award winner, has just added a further powerful impulse to this movement in the form, appropriately, of a pre-print posted to the bioRxiv just a few days ago.

If you are a researcher and haven’t yet thought seriously about pre-prints, please read Vale’s article. It is thoughtful and accessible (and there is a funny section in which he imagines the response of modern-day reviewers to Watson and Crick’s 1953 paper on the structure of DNA). His reasoning is built on the concern that there has been a perceptible increase over the last thirty years in the amount of experimental data – and therefore work – required for PhD students to get their first major publication. Vale argues that this is a result of the increased competition within the life sciences, which is focused on restricted access to ‘top journals’ and is in turn due to the powerful hold that journal impact factors now have over people’s careers. Regular readers will know that the problems with impact factors are a familiar topic on this blog (and may even be aware that their mis-use was one of the issues highlighted in The Metric Tide, the report of the HEFCE review of the use of metrics in research assessment that was published last week).

Michael Eisen has written a sympathetic critique of Vale’s paper. He takes some issue with the particulars of the arguments about increased data requirements but nevertheless expresses strong support for the drive – which he has long pursued himself – for more rapid forms of publication.

I won’t bother to rehearse Eisen’s critique since I think it is the bigger picture that warrants most attention. This bigger picture – the harmful effect of the chase after impact factors on the vitality and efficiency of the scientific community – emerged as a central theme at the Royal Society meeting convened earlier this year to discuss the Future of Scholarly Scientific Communication. At that gathering I detected a palpable sense among attendees that the wider adoption of pre-print servers would be an effective and feasible way to improve the dissemination of research results (for more background, see proposal 3 towards the bottom of my digest of the meeting).

Vale’s article does an excellent job of articulating the support for pre-prints that bloomed at the Royal Society meeting. I urge you again to read it. But for the tl;dr (“too long; didn’t read”) crowd, here’s the key section on the arguments for pre-prints (with my emphases in boldface)*:

1) Submission to a pre-print repository would allow a paper to be seen and evaluated by colleagues and search/grant committees immediately after its completion. This could enable trainees to apply for postdoctoral positions, grants, or jobs earlier than waiting for the final journal publication. A recent study of several journals found an average delay of ~7 months from acceptance to publication (33), but this average depended upon the journal and the review/revision process can take longer on a case-by-case basis. Furthermore, this time does not take into account rejections and the potential need to “shop” for a journal that will publish the work.

2) A primary objective of a pre-print repository is to transmit scientific results more rapidly to the scientific community, which should appeal to funding agencies whose main objective is to catalyze new discoveries overall. Furthermore, authors receive faster and broader feedback on their work than occurs through peer review, which can help advance their own studies.

3) If widely adopted, a pre-print repository (which acts as an umbrella to collect all scientific work and is not associated** with any specific journal) could have the welcome effect of having colleagues read and evaluate scientific work well before it has been branded with a journal name. Physicists tend to rely less on journal impact factors for evaluation, in part, because they are used to reading and evaluating science posted on arXiv. Indeed, some major breakthroughs posted on arXiv were never published subsequently in a journal. The life science community needs to return to a culture of evaluating scientific merit from reading manuscripts, rather than basing judgment on where papers were published and hence outsourcing the career evaluation process to journals.

4) A pre-print repository may not solve the “amount of data” required for the next step of journal publication. However, it might lower the bar for shorter manuscripts to be posted and reach the community, even if an ensuing submission to a journal takes longer to develop.

5) A pre-print repository is good value in terms of impact and information transferred per dollar spent. Compared to operating a journal, the cost of running arXiv is low (~$800,000 per year), most of which comes from modest subscription payments from 175 institutions and a matching grant from the Simons Foundation. Unlike a journal, submissions to arXiv are free.

6) Future innovations and experiments in peer-to-peer communication and evaluation could be built around an open pre-print server. Indeed, such communications might provide additional information and thus aid journal-based peer review.

7) A pre-print server for biology represents a feasible action item, since the physicists/mathematicians have proof-of-principle that this system works and can co-exist with journals.

The last point is perhaps the most important. Publishing pre-prints is a feasible step. I have started to do it myself in the past year (partly motivated by deals offered by PeerJ) and it is a practice that I intend to continue.

But the key will be to get more and more life scientists to adopt the pre-print habit. Leadership on this from senior figures – academicians, fellows of the Royal Society, prize winners and the like – will help. Institutional support from funders and universities, by which I mean putting in place incentives for rapid communication of results, could also be important. The rest of us need to take the idea seriously and at the very least be willing to debate the pros and cons – I would welcome that discussion.

Or you could see the sense of publishing preprints and just do it.

 

*Thanks to Ron Vale for permission to reproduce this section of his paper.

**In Vale’s article this phrase is ’not unassociated’ but I suspect that’s a typo.

Posted in Open Access, Science, Scientific Life | 16 Comments

Data not shown: time to distribute some common sense about impact factors

It’s that time of year when all clear-thinking people die a little inside: the latest set of journal impact factors has just been released.

Although there was an initial flurry of activity on Twitter last week when the 2015 Journal Citation Reports* were published by Thomson Reuters, it had died down by the weekend. You might be forgiven for thinking that the short-lived burst of interest means that the obsession with this damaging metric is on the wane. But this is just the calm before the storm. Soon enough there will be wave upon wave of adverts and emails from journals trumpeting their brand new impact factors all the way to the ridiculous third decimal place. So now is the time to act – and there is something very simple that we can all do.

For journals, promotion of the impact factor makes a kind of sense since the number – a statistically dubious calculation of the mean number of citations that their papers have accumulated in the previous two years – provides an indicator of the average performance of the journal. It’s just good business: higher impact factors attract authors and readers.
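For anyone hazy about the arithmetic, the calculation behind, say, a 2014 impact factor is essentially:

\[
\mathrm{IF}_{2014} \;=\; \frac{\text{citations received in 2014 by items the journal published in 2012 and 2013}}{\text{number of citable items the journal published in 2012 and 2013}}
\]

Part of the statistical dubiety lies in how ‘citable items’ are counted for the denominator, but the deeper problem is that a mean is a poor summary of a highly skewed distribution – as the citation data discussed below make plain.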

But the invidious effects of the impact factor on the business of science are well-known and widely acknowledged. Its problems have been recounted in detail on this blog and elsewhere. I can particularly recommend Steve Royle’s recent dissection of the statistical deficiencies of this mis-measure of research.

There is no shortage of critiques but the impact factor has burrowed deep into the soul of science and is proving hard to shift. That was a recurrent theme of the recent Royal Society meeting on the Future of Scholarly Scientific Communication which, over four days, repeatedly circled back to the mis-application of impact factors as the perverse incentive that is at the root of problems with the evaluation of science and scientists, with reproducibility, with scientific fraud, and with the speed and cost of publishing research results. I touched on some of these issues in a recent blogpost about the meeting (you can listen to recordings of the sessions or read a summary).

The Royal Society meeting might have considered the impact factor problem from all angles but discovered once again – unfortunately – that there are no revolutionary solutions to be had.

The San Francisco Declaration on Research Assessment (DORA) and the Leiden Manifesto are commendable steps in the right direction. Both are critical of the mis-use of impact factors and foster the adoption of alternative processes for assessment. But they are just steps.

That being said, steps are important. Especially so if the journey seems arduous.

Another important step was made shortly after the Royal Society meeting by the EMBO Journal and is one that gives us all an opportunity to act. Bernd Pulverer, chief editor of EMBO J., announced that the journal will from now on publish its annual citation distributions, which comprise the data on which the impact factor is based. This may appear to be merely a technical development but it marks an important move towards transparency that should help to dethrone the impact factor.

 

[Figure: citation distributions for The EMBO Journal]

 

The citation distribution for EMBO J. is highly skewed. It is dominated by a small number of papers that attract lots of citations and a large number that garner very few. The journal publishes many papers that attract only 0, 1 or 2 citations in a year and a few that have more than 40. This is not unusual – almost all journals will have similarly skewed distributions – but what it makes clear are the huge variations in citations that the papers in any given journal attract. And yet all will be ‘credited’ with the impact factor of the journal – around 10 in the case of EMBO J.
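To see how that plays out in numbers, here is a toy calculation in Python. The citation counts are entirely invented – they are not the EMBO Journal’s actual data – but they are chosen to mimic that kind of skew:

```python
# Toy example with invented citation counts, illustrating how a skewed
# distribution pushes the mean (the impact-factor-style figure) well above
# what the typical paper achieves. These numbers are made up.
from statistics import mean, median

citations = (
    [0] * 30 + [1] * 40 + [2] * 40 + [3] * 30 + [5] * 25 + [8] * 20 +
    [12] * 15 + [20] * 10 + [40] * 8 + [90] * 5 + [250] * 2
)

avg = mean(citations)      # the impact-factor-style figure
mid = median(citations)    # what the typical paper actually gets
below = sum(c < avg for c in citations) / len(citations)

print(f"papers: {len(citations)}")
print(f"mean (IF-style): {avg:.1f}")
print(f"median: {mid}")
print(f"share of papers below the mean: {below:.0%}")
```

With these made-up numbers the mean comes out near 10 while the median paper collects just 3 citations, and roughly four papers in five sit below the ‘average’.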

By publishing these distributions, the EMBO Journal is being commendably transparent about citations to its papers. It is a useful reminder that behind the simplicity of reducing journal performance to a single number is an enormous spread in the citations attracted by individual pieces of work. As Steve Royle’s excellent analysis reveals, the IF is a poor discriminator between journals and a dreadful one for papers. Publishing citation distributions therefore directs the attention of anyone who cares about doing evaluation properly back where it belongs: to the work itself. The practice ties in nicely with principles 4 and 5 of the Leiden Manifesto.

So what can you do? Simple: if in the next few weeks and months you come across an advert or email bragging about this or that journal’s impact factor, please contact them to ask why they are not showing the data on which the impact factor is based. Ask them why they are not following the example set by the EMBO Journal. Ask them why they think it is appropriate to reduce their journal to a single number, when they could be transparent about the full range of citations that their papers attract. Ask them why they are not showing the data that they rightly insist authors provide to back up the scientific claims in the papers they publish. Ask them why they won’t show the broader picture of journal performance. Ask them to help address the problem of perverse incentives in scientific publishing.

*The title is somewhat confusing since the 2015 JCR contains the impact factors calculated for 2014.

Posted in Open Access | 10 Comments