Big journals acting as gatekeepers to discoverability

From the LSE Impact Blog

One of the proposed advantages of open access publication is that it increases the impact of academic research by making it more broadly and easily accessible. Reporting on a natural experiment on the citation impact of health research published in both open access and subscription journals, Chris Carroll and Andy Tattersall suggest that subscription journals still play an important role in making research discoverable and useful, and thus still have a place even in open access publication strategies.


‘Sometimes you have to do what you don’t like, to get to where you want to be.’ 

Tori Amos

We recently completed a citation analysis of 10 years of randomised controlled trials published by the UK National Institute for Health Research (NIHR). The NIHR, the biggest public funder of trials in the UK, publishes its own monograph series, the wholly open-access Health Technology Assessment (HTA) journal, and we wanted to analyse the reach and impact of these trials when republished in subsequent commercial journals. Such trials cost a great deal of money, so there is a clear interest in determining the impact of this research.

Impact can of course be assessed in many ways, but we decided to assess the health policy impact of these trials using citation analysis: looking at how many times they were cited in key policy documents and in the types of research known to inform policy, namely systematic reviews and meta-analyses. We also looked closely at how the trials were used in these documents. After all, trials are conducted to help decision-making by health professionals and policy-makers, so they need to be easily discoverable and useful.

The sample was 133 trials published by the NIHR from 2006 to 2015. As noted above, these trials were all published in the NIHR’s own open-access HTA journal (a model of making publicly-funded research available to the public since its first volume in 1997). The HTA monograph is a peer-reviewed journal in which each issue is dedicated to a single project – such as a randomised controlled trial – and contains the full report of that project. This might include not only the trial’s effectiveness findings, but also an economic evaluation and, in some cases, additional but related work, such as a qualitative study. These separate elements of the project might also be published in other peer-reviewed journals, which have paywalls and more restrictive word counts, but also have the potential to increase the visibility and discoverability of the research.

Some of the trials in our sample (82 of 133) had elements of the trial research published separately in traditional ‘subscription’ journals, such as The Lancet and the British Medical Journal (BMJ). These are the ‘big’ journals in medicine and public health, with massive readerships and impact factors (from around 25 to 60, compared with approximately 4 or 5 for the HTA journal). When conducting the analysis, we included these additional publications of the trials in the ‘impact’ assessment.

The citation analysis findings for these additional publications outstripped the figures for the HTA journal. We found that these related publications achieved roughly twice the mean number of citing reviews and more than four times the mean number of citing policy documents compared with the HTA journal publications: 125 vs 25 citations per trial; 7.16 vs 3.32 citing reviews per trial; and 3.59 vs 0.80 citing policy documents per trial. These additional publications therefore appeared to generate much larger numbers of key citations, in policy documents and reviews, than their equivalent HTA journal publications.
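For readers who want to check the arithmetic, here is a minimal Python sketch that recomputes the ratios from the mean figures reported above (it uses only the summary numbers quoted in this post, not the underlying per-trial data):

# Reported mean figures per trial (from the analysis summarised above).
hta = {"citations": 25, "citing_reviews": 3.32, "citing_policy_docs": 0.80}
additional = {"citations": 125, "citing_reviews": 7.16, "citing_policy_docs": 3.59}

for metric in ("citations", "citing_reviews", "citing_policy_docs"):
    ratio = additional[metric] / hta[metric]
    print(f"{metric}: {additional[metric]} vs {hta[metric]} per trial "
          f"(about {ratio:.1f} times higher for the additional publications)")

Run as written, this reproduces the comparison quoted above: roughly a two-fold difference for citing reviews and around a four-and-a-half-fold difference for citing policy documents.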

So, can we conclude that the additional publication of elements of these trials in subscription journals, which may also have involved paying open access charges, enhanced their impact? Not with certainty. Good quality research and guidance documents should have found the HTA publication and its data anyway (a proportion cited both the HTA publication and its additional publication), and a direct comparison of citation data that would answer this question unequivocally is not possible. However, the numbers for the additional publications, including unique citations in policy documents, are large enough to be compelling.

Publishing trial data in big journals such as The Lancet and the BMJ might make the data far more ‘discoverable’, and thus enhance the potential impact of publicly-funded research. There might therefore be value for researchers, policy-makers and the public in a publishing model that combines full open-access publication (a must for publicly-funded research, surely) with selective additional publication in a small number of influential subscription journals (while being aware that ‘salami slicing’ publication strategies do not necessarily represent ‘good practice’). Indeed, public funders could maintain their own lists of appropriate journals for publication of their research, in addition to the open-access versions, though of course, for some, that raises questions concerning academic freedom.

The ideal would be for funders to develop their own high-impact, wholly open-access journals that can compete with the ‘big’ journals but, even if this could be realised, it is a way off. In the meantime, public funders of research should very selectively exploit the system that is there, by ‘using’ the publishers (and certain journals only). This is arguably a novel ‘reversal’ of the current perceived relationship between publishers and academics, where the author supplies what is in effect ‘second hand’ content to be considered by the journal. In normal circumstances, such works could be declined because the publisher did not have first access to the research. Yet it is the high intellectual value of these works that not only ensures they are published again, but also secures them a place in high-impact journals.


About the authors

Chris Carroll is Reader in Systematic Review and Evidence Synthesis at The School of Health and Related Research at The University of Sheffield.

Andy Tattersall is an Information Specialist at The School of Health and Related Research (ScHARR) and writes, teaches and gives talks about digital academia, technology, scholarly communications, open research, web and information science, apps, altmetrics and social media, in particular their applications for research, teaching, learning, knowledge management and collaboration. Andy received a Senate Award from The University of Sheffield for his pioneering work on MOOCs in 2013 and is a Senior Fellow of the Higher Education Academy. He is also Chair of the Chartered Institute of Library and Information Professionals – Multi Media and Information Technology Committee. Andy was listed as one of Jisc’s Top Ten Social Media Superstars for 2017 in Higher Education. He has edited a book on altmetrics for Facet Publishing, aimed at researchers and librarians. He tweets @Andy_Tattersall and his ORCID ID is 0000-0002-2842-9576.
