To build a successful career in scientific research you need to understand the scientific publishing system. It is going through a period of change and innovation but has so far remained largely intact. Recently a colleague and I ran some ‘Disruptive Publishing’ coffee break sessions to highlight some of the changes in science publishing to our researcher community. I produced a factsheet summarising interesting journal developments and my colleague created a colourful ‘snapper’ that gave us a way to open up conversations with unsuspecting researchers.
Wikipedia defines disruptive innovation as “innovation that creates a new market or value network and eventually disrupts an existing market and value network, displacing established market-leading firms, products, and alliances” and it notes that “disruptive innovations tend to be produced by outsiders rather than existing players”.
Michael Clarke, in his 2010 Scholarly Kitchen blogpost, pointed out that Tim Berners-Lee created the Web in 1991 with the aim of “better facilitating scientific communication and the dissemination of scientific research … [it] was designed to disrupt scientific publishing”.
Clarke observed, however, that there had as yet been no significant disruption: change and innovation, yes, but not disruption. He was writing in 2010, but that remains true today. The main players – the large multinational commercial publishing houses – still dominate science publishing. Indeed, the biggest open access publisher, Springer Nature, is one of the top four science journal publishers.
Clarke went on to list five different functions of the publishing system and suggested that the functions which are more ‘social’ – validation, filtration and designation – are less amenable to disruption than those that are more administrative – dissemination and registration.
We are starting to see more change in those areas. Validation (peer review) has been shaken by the advent of megajournals such as PLOS ONE and Scientific Reports. These journals are based on the idea that articles should be published if they are scientifically valid, leaving aside issues of novelty or newsworthiness. Notions of peer review are also being stretched by increasing posting of preprints (e.g. bioRxiv) and publishing platforms like Wellcome Open Research.
Filtration (deciding what is worth reading) is still heavily influenced by journal branding and thus by editorial selection, but altmetrics provide another tool while work on recommendation engines is proceeding in several companies. Machine learning approaches to search tools are also starting to appear.
Designation (research assessment) is perhaps the thorniest problem, and typically journal branding remains important when assessing a portfolio of research. Many institutions and funders have signed up to the principles of DORA but fewer have taken steps to put them into practice, and practical problems remain. It is good to see DORA taking steps to engage more with the research community and spread good practice.
Coffee break sessions
We don’t have a revolution in publishing yet, but there is plenty of change and innovation, and I think it’s hard for a busy researcher to keep on top of everything that’s going on. Our coffee break sessions and the factsheet about Disruptive Publishing were intended to brief researchers about some of the more interesting developments. During five separate sessions (one on each lab floor plus one on the ground floor) we talked to more than 50 people. The topics that generated the most interest were:
- the scooping policy of PLOS
- preprints and bioRxiv
- publishing different research outputs, not just articles
- Frontiers for Young Minds – the science journal for school students.
The ‘snapper’ or ‘fortune teller’ that my colleague created provided a useful gambit to start conversations. We asked people to choose a number between one and eight and then talked about the issue corresponding to that number in the snapper.
It was definitely worth running these sessions, but we learnt that the only way to make them work was to ‘ambush’ people while they were making a cup of coffee, washing their cup, or just walking along. Nearly everyone was happy to talk to us and interested in what we had to tell them.
Scientific journals started in 1665 with the Royal Society’s Philosophical Transactions. The format of research articles has changed little since then – fonts changed, different languages gained the ascendancy, colour started to appear – but the outline of a journal article remained instantly recognisable. There was a big change in delivery in the 1990s when the internet came into play, but not much underlying change. The growth of open access in the past decade and the advent of new online-only publishing ventures have accelerated the pace of change.
Pure OA journals
Journals like PLOS Biology, Nature Communications, eLife and … publish only fully open access articles. By publishing in these journals you can be sure that all OA obligations are met and your research is as open as possible (though you still need to choose the CC BY licence). PLOS Biology will consider for publication manuscripts that confirm or extend a recently published study (“scooped” manuscripts, also referred to as complementary).
Megajournals

Megajournals typically have a wide subject scope and focus on ‘technical soundness’ rather than criteria such as ‘importance’ or ‘interest’. The two leading examples, PLOS ONE and Scientific Reports, are the two largest journals of any kind, each publishing over 20,000 articles a year. Most other megajournals are somewhat smaller, but still use soundness as the main criterion for publication.
New approaches to peer review
Several journals have developed new systems to improve peer review. The Frontiers in journals have an initial independent review phase followed by a second phase in which reviewers and author interact. Reviewers’ names are published alongside the article. eLife delivers fast peer review decisions and consolidates all revision requests into one set of revisions. Post-review decisions and author responses for published papers are available for all to read. F1000Research publishes all submissions as preprints and then invites referees to judge the papers. Their reports and names are published alongside the article, together with the authors’ responses and comments from registered users. Wellcome Open Research follows a similar process. There is a Crick gateway on Wellcome Open Research.
Preprints
bioRxiv is a free online archive and distribution service for unpublished preprints in the life sciences. Authors are able to make their findings immediately available to the scientific community and receive feedback on draft manuscripts before they are submitted to journals. Most funders now accept preprints in grant applications. There is a Crick channel on bioRxiv highlighting our preprints.
Publishing different outputs
Wellcome Open Research accepts a wide range of submissions, including software, data notes, study protocols, negative or null studies, and replication and refutation studies. BMC Research Notes publishes scientifically valid research outputs that cannot be considered as full research or methodology articles. The Research Ideas and Outcomes (RIO) journal publishes all outputs of the research cycle, including project proposals, data, methods, workflows, software and project reports. The Journal of Brief Ideas publishes citable ideas in fewer than 200 words. Science Matters publishes single, validated observations – the fundamental unit of scientific progress.
Data papers

Several journals specialise in data papers – articles that describe a particular online, openly accessible data set or group of data sets. Examples include Scientific Data, GigaScience and Data in Brief.
The protocols.io website allows you to create, copy, modify and evolve laboratory protocols, describing the critical details of experimental procedures that are often overlooked in the Methods sections of articles.
Frontiers for Young Minds is an open access science journal aimed at school students. It invites scientists to write about their research in language that is accessible for young readers. Articles are reviewed before publication by a board of kids and teens – with the help of a science mentor.
Some journals, notably eLife, ignore journal impact factors and do not use them to promote the journal. The San Francisco Declaration on Research Assessment (DORA) develops and promotes best practice in the assessment of scholarly research, and argues against the use of journal-level metrics such as the impact factor.