An Open Letter on Open Access to UK Research Councils

Short Version

Please read the Wellcome Trust’s policy on open access. And then adopt it. Thank you.

 

Long Version

Please read the Wellcome Trust’s policy on open access. It’s short so I’ve pasted it below. The policy states (with my emphases in purple):

The mission of the Wellcome Trust is to support the brightest minds* in biomedical research and the medical humanities.

The main output of this research is new ideas and knowledge, which the Trust expects its researchers to publish in high-quality, peer-reviewed journals.

The Wellcome Trust believes that maximising the distribution of these papers – by providing free, online access – is the most effective way of ensuring that the research we fund can be accessed, read and built upon. In turn, this will foster a richer research culture.

The Wellcome Trust therefore supports unrestricted access to the published output of research as a fundamental part of its charitable mission and a public benefit to be encouraged wherever possible.

Specifically, the Wellcome Trust:

  • expects authors of research papers to maximise the opportunities to make their results available for free
  • requires electronic copies of any research papers that have been accepted for publication in a peer-reviewed journal, and are supported in whole or in part by Wellcome Trust funding, to be made available through PubMed Central (PMC) and UK PubMed Central (UKPMC) as soon as possible and in any event within six months of the journal publisher’s official date of final publication
  • will provide grant holders with additional funding, through their institutions, to cover open access charges, where appropriate, in order to meet the Trust’s requirements
  • encourages – and where it pays an open access fee, requires – authors and publishers to license research papers such that they may be freely copied and re-used (for example for text and data-mining purposes), provided that such uses are fully attributed
  • affirms the principle that it is the intrinsic merit of the work, and not the title of the journal in which an author’s work is published, that should be considered in making funding decisions.

As a policy it is clear and purposeful. It is built upon the principle that OA is good for science and good for the public. The policy is backed by funding that is available to Trust-funded researchers even after their grant support has finished. And, most pleasingly of all, although there is an expectation that the scientists it funds should publish in ‘high-quality’ journals, at the same time the Trust makes it clear that chasing after impact factors is not the point — there is a commitment in its funding decisions to judge work on its merit. If this goal can be realised, it will break one of the heaviest chains tying scientists to the status quo.

Now contrast the Wellcome Trust’s statement with a typical open access policy, say the one from the BBSRC (the funder that I am most familiar with – but the MRC policy looks rather similar). If you click on the link you will be taken to a web-page that contains two small PDFs. Each contains just a few paragraphs of text. The first is an initial statement from June 2006 of their OA policy and the second is an update that was published in late 2008. The organisation of these texts is a bit haphazard — and so are the declared aims.

In my view neither document has the drive or focus of the Wellcome Trust’s policy. The aim is to ‘encourage’, rather than oblige. The financial support on offer is more limited and more difficult to access (no pun intended but it is ironic). As outlined in the notes for BBSRC applicants (PDF), OA charges expected within the term of the grant should be put down as direct costs. Those expected after the end of the grant term should be charged to the overheads paid to the university holding the grant. Good luck with forecasting your need for OA funding accurately. This system is complex and it doesn’t work.

As a case in point, consider my most recent paper, submitted to the Elsevier journal Structure. Since the work had been funded by the BBSRC, that meant I had to pay Structure’s $5000 OA fee. Yes, $5000. I know. But the grant had finished between submission and acceptance so the BBSRC told me funds to cover that charge should be provided by my institution from the fEC payment (the overhead). My institution informed me this money was spent and when I relayed this to the BBSRC, I was instructed to take the green OA option — self-archive my version of the revised, peer-reviewed manuscript (not the Structure reprint). This is very much a second-best option. I have put the PDF on my personal web-site and in my institution’s repository (Spiral) but am not allowed by Elsevier’s terms and conditions to upload my paper to UKPMC, where it would be much easier for the user community to find.

This was, in my view, an avoidable shambles, brought on by the combination of Elsevier’s high OA charge (at least for its Cell Press titles) and the Research Council’s lack of a proper funding mechanism.

But change is afoot, albeit slowly. In May last year, Science Minister David Willetts announced a renewed commitment to OA, to be implemented by Research Councils UK (RCUK) and the Higher Education Funding Council for England (HEFCE). This intention was reaffirmed in December upon publication of the government’s “Innovation and Research Strategy for Growth” document (PDF). This states:

“The Government, in line with our overarching commitment to transparency and open data, is committed to ensuring that publicly-funded research should be accessible free of charge. Free and open access to taxpayer-funded research offers significant social and economic benefits by spreading knowledge, raising the prestige of UK research and encouraging technology transfer.”

Well that’s good. However, the document also revealed plans to establish “an independent working group chaired by Janet Finch to consider how to improve access to research publications, including publicly-funded research.”

It’s not clear to me why this is necessary. In a 2009 report (PDF) RCUK had already identified the main obstacles to the wider adoption of OA:

  • a perception amongst many disciplines that OA journals lack impact
  • limited awareness amongst researchers of funding sources for pay-to-publish models
  • non-compliance with institutional policies which mandate self-archiving in institutional repositories.

The first of these points would be addressed by adoption of a statement identical to that in the Wellcome Trust policy, providing proper funding to show that OA matters and, crucially, emphasising that the Research Councils would base funding decisions on quality, not impact factors (though this would have to be backed by similar expressions by promotion panels in universities). The two remaining difficulties could be dealt with by stipulating that researchers who did not comply would soon find themselves ineligible for grant funding. Such a position would increase the awareness and compliance of scientists. I guarantee it.

The RCUK report even made the point that the push for Gold OA — publication in OA journals that do not charge subscription fees — could still permit commercial publishers to make a decent living, though perhaps not an exorbitant one, as is the case at present for the major publishing companies. The drive for OA engendered by a proper policy seems likely to stimulate greater interest among scientists in OA enterprises like PLoS and could even enhance the marketplace by creating some real competition and so driving down costs.

So please adopt the Wellcome Trust policy on Open Access. Thank you.

 

*Brightest minds – alas the Wellcome Trust doesn’t get everything right. Their funding policy means concentrating their resources in fewer hands, choking off a mixed economy of research funding that has worked well and is better able to sustain university-based research (as I have written elsewhere).


60 Responses to An Open Letter on Open Access to UK Research Councils

  1. deevybee says:

    Well said, Stephen.
    And as someone who is involved in Wellcome funding decisions, please note that I absolutely endorse the last para, i.e. it is “the intrinsic merit of the work, and not the title of the journal in which an author’s work is published, that should be considered in making funding decisions.”
    Too many junior scientists delay publishing by chasing after high impact journals when they’d be better off getting the work out promptly in a decent, and preferably open access, journal.

  2. Stephen, this is great!

    One of the somewhat disturbing things about the Finch committee is that there is no-one represented on that committee that has actually run a financially sustainable pure Open Access operation. Robert Kiley is on the committee from the Wellcome Trust and that’s great but there’s no-one there from PLoS or BMC, and there is no-one representing the experience of the SCOAP3 process by which High Energy Physics journals are being taken OA.

    http://www.researchinfonet.org/publish/wg-expand-access/

The committee also seems to have interpreted its role as negotiating some sort of national UK licence to read the world’s literature, rather than how to expand access to UK-generated literature. I don’t quite know where this came from as it’s not how I would read the published terms of reference:

    http://www.researchinfonet.org/wp-content/uploads/2012/01/Working-Group-ToR.doc

…but such an approach seems to be set up so as to fail. Negotiating patchwork licences for lots of different people across the UK is exactly the wrong approach to take. Indeed it illustrates how backwards we have allowed the whole system to get.

    • Stephen says:

      Cheers Cameron – trust you to be so well informed! That’s somewhat disturbing news about the composition of the committee.

      Would be good to know how the Elsevier Boycott has impacted on their deliberations.

  3. I’ve always wondered why the institutional repositories aren’t working with, e.g. PubMed etc. to make sure a link to their version is displayed with the search results. I mean, how difficult can this be?

    • Stephen says:

Me too. I wonder whether publishers prevent it (as Elsevier has prevented me depositing my manuscript with UKPMC)?

Would be useful if PubMed searches returned links to institutional repositories. But maybe the reason is that it’s too complex for them to verify each source?

They already link several sources. If there were a single front-end for all repositories, it’d only be one additional button. And then green OA would all of a sudden become the threat to TA it ought to be.

      • Mike Taylor says:

        Bjoern: “I’ve always wondered why the institutional repositories aren’t working with, e.g. PubMed etc. to make sure a link to their version is displayed with the search results.”

        Stephen: “Me too. I wonder do publishers prevent it (as Elsevier has prevented me depositing my manuscript with UKPMC)?”

        Surely not. It’s well established now that merely linking to something is non-infringing.

        • Neil Stewart says:

          So the idea here is to expose IR content via e.g. PubMed, ArXiv etc.? Some aggregation services do exist, e.g. Economists Online http://www.economistsonline.org/home and of course IR content shows up very well in Google, Google Scholar etc. The trick for repository managers would be identifying relevant content to go into relevant subject repositories. I’m going to put a message out to the UKCORR user group list and see what fellow repository managers think.

          • Stephen says:

            Neil – Would be good to hear if you get any response.

            • Neil Stewart says:

              Reporting back on this issue of aggregating IR content- I’ve posted on this here: http://ukcorr.blogspot.com/2012/03/unfulfilled-promise-of-aggregating.html

            • Here’s what I wanted to comment on Neil’s post, but the blog there didn’t let me post:

The fragmentation of the IR landscape, as well as the policy issues mentioned by ostephens, are well taken and serve as important points of entry for killing a lot of birds with one, or at least very few, stones.

              If the landscape is fragmented, in part due to “the usual open access and service awareness-raising and advocacy” constraining time of the involved staff, I wonder if some priorities might be in need of re-setting. A PubMed link to IR content is probably the biggest possible OA advocacy effect for the BioMed research community, who are largely completely clueless. Allowing a significant fraction of those easy full-text access from anywhere would bring the usefulness of OA all of a sudden into the minds of everyone in that community, without any PR efforts.

              This last point is actually an all too often neglected issue: OA advocacy is, in effect, PR and lobbying to do the ethically ‘right’ thing. This tends to be seen sort of like recycling: if you’re not stressed and there’s a recycling bin close by, you recycle, otherwise you don’t. In other words, it’s optional. There are far more pressing things, especially in the BioMed community where basically everybody is afraid of having to leave science. OA is the very last thing on people’s minds when they decide what to read or where to publish.

If OA is to be more than just a way of feeling morally superior, it needs to gain access into the workflow of people, to become an integral part of their daily life. PubMed is central to all BioMed researchers. One of the main problems of BioMed researchers is that we have to use many different search tools to find our literature. If you want to be sure you’re not missing anything, you have to use GScholar, PubMed, Thomson WoS and Scopus (depending on your exact field, you might need more or get away with fewer). If libraries worked with PubMed to allow people to get access to more literature, this would be a great step in a direction to have people find their literature in a single place.

As future steps I could see closer collaboration with libraries to improve the very fast and widely used PubMed, providing wider coverage than just BioMed and superior access to full text (also for text-mining). Eventually, fully integrated IRs (when some other contingencies were met) have the potential to become the single go-to place for scholarly literature and a very deep wedge between scientists and publishers, demonstrating that the money spent on lining the publishers’ shareholders’ pockets would be better spent on libraries providing superior and cheaper services to their faculty.

When I meet librarians I often (not always!) have the impression many of them are too modest, fail to see the potential for all the obstacles and thus seem to be unable to ‘think big’. Now is an excellent time for libraries and faculty to take back control of the scholarly works they have ceded to publishers. Now is the time librarians could become again the custodians of scholarly works, instead of mere intermediaries. Now is the time to lay the foundation for the World Library of Science, the greatest archive of scholarly work this planet has ever seen. There is more than enough money to build this utopia – if we stop handing it over to shareholders.

            • Neil Stewart says:

Thanks for this excellent comment Bjoern. I have posted a link back to it on the UKCoRR blog. I agree that “thinking big” can be an issue, and I think that my post does call upon colleagues to start thinking in this way too; hopefully it will gain some traction. The problem remains that we (IR managers) collectively would have to negotiate individually with each subject repository on behalf of all IRs. However, this shouldn’t stop us trying, perhaps by targeting one subject repository to begin with – and solutions found for one would probably scale across many. I’m going to try and synthesise this debate so it’s a bit more readable, and will post back again with a link when I’ve done so.

            • I’d love to comment more, Neil, but this is Stephen’s blog and I won’t abuse it any more. OpenID didn’t work on your post (using Firefox) and for my Google account it wanted me to create a blogger account, which I certainly won’t do – so I can’t comment there.

• I should’ve mentioned that it works well in GScholar, which makes it even more unexpected that there isn’t an IR button in PubMed.
  More of a tangent: I use PubMed, GScholar, Thomson WoK and to some extent Scopus to find literature and PubMed beats them all hands down in terms of speed and general functionality (but lacks some specific functionality). If others have similar experiences, most people in this field will use PubMed and thus IR content at PubMed would have most impact.

Which is why there is so much effort going into getting people to agree contractually that it is infringing. See the recent kerfuffle around Access Copyright and U Toronto for instance.

          • Stephen says:

            Ah – you mean this. In my head this sort of thing falls into the category of ‘does not compute’.

Yep, sorry I should have provided the link. That’s a classic case of why we need to pay up front. So that all of that nonsense just goes away. It’s sucking up vast quantities of energy and money that should be going into research. There is simply no need for a vast bureaucracy to manage all of that copyright anymore.

  4. It might remain a challenge to convince the referees of grants to disregard the journals in which the applicants have published. The WT can stress this as much as it wants, but certain people will still be more impressed by a high-impact journal title than a specialist one, without bothering to delve into the caliber of the science. I wonder if routinely reporting the number of citations a paper has received would help? Some people do this but I still see lots of CVs without.

    • There is by now pretty good empirical evidence that high journal rank is associated more with unreliability (in a broad sense) than with any notion of quality (as measured by citations or expert opinion). I’m in the process of writing a review of that literature.

      • DrugMonkey says:

        How are you dealing with the common rejoinder that it is detection/scrutiny that differs with IF, not the quality/fraud, BB?

I’m citing two studies, one that shows that the decline effect starts in hi-IF journals and the other that shows that effect sizes are overestimated preferentially in hi-IF journals with lower than average sample sizes.

          No idea about the quantitative contribution, but if visibility were the key factor in retraction detection, why is the citation rate of hi-IF papers so low? Clearly, visibility has an effect both ways, but the data suggest that it’s the (almost) exclusive driver in citations and a more modest driver in retractions/decline effect as the latter is much larger (i.e., the relative contribution differs, but the absolute, of course, is identical).

          First para is data, second is my interpretation/discussion. Once it’s not in an embarrassing state any more, I’ll open the manuscript up to some review before we submit it. Plan is to submit it to Nature and Science and solicit quotable rejection statements in case they don’t want it and then publish it in an OA venue with the quotes.

    • I think we need a multi-pronged approach here. The first is to encourage people to put more info on their CVs. Number of citations/bookmarks etc is a good start (but we could use tools to help us automatically build those CVs).

      Secondly we need to make the case at a higher level. As Björn points out there is now really good evidence that using journals as an assessment point is actually bad practice and bad science. There’s an organisational argument – you demonstrably can’t pick the best (whatever that means) people by using that measure. We need to keep pushing that case at lots of different levels. But equally we can expect that those who continue to use these measures will be outcompeted over the long term by those who take seriously the responsibility for making probably the most important strategic decision an institutional manager can make rather than try and farm it out to someone else.

    • Stephen Moss says:

      Jenny, you’re absolutely right. Judging the merit of a piece of work rather than the journal in which it is published is clearly a fine aspiration. Who could argue otherwise? But does anyone really believe this is what happens on grant panels or REF committees? If I were on such a panel and had to make an informed evaluation of Stephen’s recent paper in Structure, I would not find it straightforward. I could probably read and understand the paper, but I’m not sure I could honestly assess its ‘intrinsic merit’, particularly alongside a clutch of other structural papers. And for recently published work there are likely to be no/few citations.

      Veiled assurances from David Willetts that impact factors will not influence REF evaluations, as discussed here by RPG, are not convincing.

    • Stephen says:

I agree Jenny (and Stephen M) that it still does seem a risky strategy for people to avoid impact factors when trying to publish. The WT’s statement is no solution but is at least a step in the right direction. If we can get RCUK and universities to sign up to the principle, we can travel even further along that road.

I think that the more OA publishing comes to seem ‘mainstream’, the more respectable it will become in people’s eyes.

    • Stephen says:

      Overnight, I’ve been reflecting further on the notes of hesitation sounded by Jenny and Stephen M (which echo points made in a recent post by DrugMonkey).

The concern about abandoning the high IF journals remains widespread even though most people are aware of the inherent flaws in the system (see the good points made above by Björn and Cameron). So how do we get movement on this within the scientific community? I think the WT statement certainly helps — and would be strengthened if other funders adopted it. And so would a move to including citations in CVs (albeit that is beset with timescale problems) or looking at other metrics (such as downloads). I’m also in favour of pushing Nobel Laureates, FRSs and HHMI investigators to lead the way in abandoning the slavery to IF and championing full OA.

      But what else could we do?

      • The data show that journal rank is doing the opposite of what people think – so we’d need to echo what Cameron said: Do you really want to risk your institution’s future by hiring someone with CNS papers? Changing culture will go a long way and data, at least for me, is what changes my mind the quickest.

More to your point, though, what alternatives do we have? Well, as long as we don’t have a TCP/IP for science, as Cameron likes to put it these days, not very many. We need standards that apply across the board so innovation can take hold across the board. DOIs are one thing, the ORCID initiative another, but these kinds of technologies can only leverage a fraction of their full potential in a Balkanized information multiverse where literature is spread among thousands of universes and data is in another dimension altogether. Compounding the problem is that some of those universes and dimensions are financially unstable. The glacial pace at which ORCID is moving (but hey, it’s moving!) is testament to that fact and if ORCID is any reference, Cameron’s great-grandchildren might be old and gray before they see anything like a TCP/IP happen for science, let alone a Github.

Those are some of the considerations why I suggest getting together with libraries, who are already publishing our theses and ancient texts, storing some of our data and running our institutional repositories of scholarly journal articles, to just expand these activities a little, using the funds they currently give to publishers. This would reduce Balkanization and further the development and adoption of standards while potentially also saving a shitload of money in the process. The know-how is there, the infrastructure is largely there and there is more than enough funds, so why not do it? Within a few years a few libraries would be able to offer the NIH to take over PubMed for free with complete coverage of all the scholarly literature and associated metrics and tools which would perform all the functions journals do today (they could even copy them, if demand were there), plus many, many more. All it takes is 10-20 libraries with the will to cut subscriptions on the order of KIT’s ten most expensive journals. That wouldn’t even disrupt access too much. With a total budget of around 1-2 million per year in addition to the infrastructure and know-how already available, one can easily come up with something the potential of which is so obvious that many other institutions are likely to follow suit and exponentially increase the speed of development.

      • DrugMonkey says:

        BB is on the right track. It requires confrontation of the high IF mystique. Deconstruction of the process by which articles are accepted/ not. Illustration of the way labs that continually pursue high IF as an end in itself operate. Evaluation of the circularity of the “quality = high IF” argument.

        It will not be enough simply to say “Great papers are also found in society level journals”.

        • The data in our manuscript show two measures with increasing unreliability with IF, one strong and one not so strongly correlated (effect size overestimation and retractions). The data also show two rather weakly correlated measures (citation and expert opinion), suggesting that reader visibility alone is a weak predictor of anything. Instead, publication bias emerges as a main driving force behind scientific unreliability with journal rank (i.e., author visibility/prestige) as a main driving force.

      • Stephen Moss says:

        Extricating ourselves from the culture of Impact Factor will be an almighty task. Take the guidelines to applicants for European Research Council grants, who in their CV pages must detail their top ten papers “in major international peer reviewed multidisciplinary scientific journals and/or in the leading international peer-reviewed journals”. What kind of journals do we think they mean in this statement? And this is explicitly a scheme in which the initial appraisal of proposals is done by reviewers who almost certainly have no knowledge of your area of expertise. Now put yourself in the place of a reviewer faced with a stack of 50 such proposals who has to numerically score the relative strengths of each applicant. It is clearly impossible for the reviewer, who is intelligent but not a specialist, to read those 500 papers and determine the intrinsic merit of each.

The same sort of evaluation is surely looming for UK scientists when it comes to REF submissions. On this comment thread I feel slightly guilty in confessing that for the time being my decisions as to where to publish will continue to be driven at least partly by impact factor. Not until the Wellcome Trust and Research Councils publish data showing that the average number of Nature/Cell/Science papers published by applicants of failed grants is the same as for those whose grants are successful will I start to believe that merit has become the decisive factor.

        • IF is so bad that an undergrad would be flunked if he/she used it, even if it were an actual measure and not negotiated between Thomson and the publishers. Almost anything out there today would be better. Pick at random and you can’t really get any worse. If I were a politician and got wind of that, I’d cut all funding until this trivial issue was resolved…

  5. Yes, very well said. It is without doubt the funders who will drive the move to an open access research literature, and it will be successful only if they provide the funding for this. And it could all happen very quickly if they do. But it’s crucial that author awareness goes hand in hand with research funder policies – in my experience there are issues here.

    When I was a journal managing editor we rigorously checked every submission to make sure all sources of funding were included. If they weren’t, we’d get back to authors for this information. All journals should do this, but unfortunately not all do. At the same time we’d note those submissions which included research funding from an organisation with a public access policy. For those that ended up accepted we’d very often have to remind the authors that they were eligible for funding to make their articles open access and we’d send them the link to the relevant page on their funder’s web site. It made me wonder whether there is any oversight of this by the research funders.

    Another issue is what happens with work resulting from multi-group, multi-disciplinary, multi-funded collaborations? HHMI has a bit of guidance on authorship in its Public Access Publishing Policy http://www.hhmi.org/about/research/QA_papp.pdf (general info at http://www.hhmi.org/about/research/journals/main?action=search)
    but I can see many potential problems if various situations aren’t anticipated and set out in research funder documentation and guidance. Authorship issues and disputes are already amongst the most common problems dealt with by journal editorial offices, and the range might be a surprise to many. I’ve seen everything from the not uncommon first-author disputes to the very unusual. Often problems arise because of lack of communication and agreement from the time groups start a research project, and by the time they get to publication there has been a serious breakdown in communication and sometimes in relationships.

    • Stephen says:

      Good point Irene – authors can be lackadaisical about checking acknowledgement of funding. In part it probably seems like a tiresome chore at the end of a long process of doing and writing up the research and getting it through peer-review. But funders simply have to lay down the law: that grant ref numbers should be included or there’ll be penalties.

The point about author disagreements is also pertinent. There is the wider issue of disputes over authorship and problems of communication (for another day perhaps). But divergent views on the importance of OA could also be a source of dispute between collaborators (especially if they are at different stages of their career).

6. Excellent post and good discussion.
    I want to stress (to the extent of boredom) the need to allow automatic text-, data- and image-mining. The Wellcome policy makes it clear how important this is. For example we now have software that can extract chemical structure diagrams and index documents on that basis, work out reactions, analyse recipes, etc. But the publishers are putting every possible obstacle in our way. If we simply accept human-eyeballs-only then we shall fail to put in place the major innovations which are now possible.

    It’s mainly anecdotal but I understand that publishers put huge restrictions on what Pubmed and UKPMC can do. There are only certain places where documents can be put, you can’t add links, can’t add annotations, etc. Graphical abstracts can’t be posted. The ONLY way to challenge this is by CC-BY or CC0 and the only current way to do this is Open-Access journals. The hybrid options are awful – the 5000 USD you pay will NOT buy full rights for the reader. It’s a disgrace that the community has failed to fight for full value from Hybrid – though I don’t like hybrid anyway.

    I am collating a response to Hargreaves (on changing UK law for science) and would be glad to hear from anyone who is interested.

  7. Stephen (M.), that’s something I hadn’t considered…grants tend to ask you for recent publications, and you won’t have high numbers of citations racked up yet. It’s a real problem, isn’t it?

    • Stephen says:

      It’s a real problem but there are real solutions. Firstly, any assessment of the number of citations should take account of the time since publication. Such numbers can be rather stochastic and shouldn’t in any case be relied upon. There is much talk about alternative metrics – but it’s not something I have looked into in great detail yet. I’ll bet Cameron Neylon has, though.

      The other way is to read the paper or — in the case of my application for promotion to Prof — ask the candidate to identify their 4 most significant papers and to explain why (in around 200 words).

      • I was going to hold fire on that one 🙂 But yes there are some emerging options that give you some information much more rapidly than citations. Bookmarks and social media engagement are showing some promising results with numbers in the weeks following publication showing varying degrees of correlation with citations at 2 years. Of course we don’t yet really understand what they mean and they have some distinct biases…a bit like citation counting really…

        But in the end, nothing will tell you what the value of a piece of work is except waiting for 50-100 years and tracking its influence in all the different parts of life it had an effect on. In the meantime, we’re always going to be stuck with imperfect measures of “quality” or “impact”, but as long as we recognise that and work with the data in the way we do in our day-to-day work to decide what conclusions we are comfortable basing on them, we should be ok.

  8. rpg says:

    Of course, this has nothing to do with Wellcome’s shiny new OA journal, has it?

    • Stephen says:

      Actually – I realise now I’m not sure how old their policy is. Though it’s true to say that Wellcome’s practice of fully funding OA charges is a few years old. They are well ahead of the pack.

      • Frank Norman says:

        Wellcome’s policy goes back to 2005. MRC’s pretty-much identical policy came a year later in 2006. For some reason that delay means that people often forget that MRC has the same policy as Wellcome on Open Access!

        • Stephen says:

          Thanks for the clarification Frank, though I believe I’m right in saying that the MRC adopts the same clunky and ineffectual funding model as the BBSRC. Part of the strength of Wellcome’s position derives from it putting its money four-square behind its intentions.

          • Frank Norman says:

            Stephen

            Yes, with regard to funding mechanisms, the RCs are hampered by the Treasury rules (citation needed). Wellcome has the great advantage of being free to do whatever it likes with its money. The dead hand of Government weighs down the RCs.

            It is also true though that some institutions (e.g. Nott Univ) have managed the funding that is available from RCs better than others, so Universities need to share best practice about managing OA income and payments.

    • No. The OA policy is a good few years old now, I think about five, possibly even more. The new journal is actually a response to their policy – a way of delivering on it better than they are so far. In many ways I see the journal as a signal that Wellcome is bored with waiting for the research community and publishers to get their act together and make OA happen.

  9. Joy Davidson says:

    This new report from the SURF Foundation in the Netherlands may be of interest with regards to new ways of measuring research impact.

    ‘Users, narcissism and control – tracking the impact of scholarly publications in the 21st century’
    http://www.surffoundation.nl/nl/publicaties/Documents/Users%20narcissism%20and%20control.pdf

  10. Stephen says:

    To reply to Neil and Bjorn above, please feel free to continue the conversation here. I’m only sorry I’ve been too busy to keep up. But I am interested in this issue of accessing institutional repositories since I have run into difficulty in getting my latest paper put up onto my university repository. There is some confusion about what Elsevier allows. More anon…

    • Neil Stewart says:

      Thanks Stephen! I’d be interested to hear the Elsevier story. They’re actually pretty liberal when it comes to green self-archiving: no embargo period before the paper is made openly accessible, unlike many other big publishers!

      • Mike Taylor says:

        Elsevier’s position on self-archiving is a bit strange, actually. As I understand it, it amounts to “you’re allowed to self-archive unless your institution requires you to, in which case you’re not allowed to”.

        Hmm, now I write that down in black and white it seems too dumb to be true. Have I misinterpreted it? If not, can anyone explain it?

      • Stephen says:

        I have a query with the publisher at the moment asking for clarification on what my posting rights are. I agree with Mike that the position is potentially anomalous.

        No reply yet but I’ll chase them today. If I don’t hear soon, maybe a blogpost will help to elicit a response…

Comments are closed.