UCL is currently developing a policy on the responsible use of bibliometrics.
The draft "Principles on the responsible use of bibliometrics in the era of Open Science" was presented and discussed at the Bibliometrics Town Hall meeting in December 2018. It sets out the background to the need for a policy and a set of principles to guide and support the appropriate use of metrics at UCL. The consultation on the policy is currently ongoing, and comments or feedback are welcome at firstname.lastname@example.org.
Further guidance, currently under development, will offer practical advice on putting these principles into practice - taking into account the needs of UCL authors, researchers and colleagues, their use of current bibliometric tools, and existing best practice elsewhere.
The wider context
Bibliometrics are becoming a common way to analyse and assess research, but they are frequently used inappropriately or misleadingly, often with good intentions. Bibliometric data are very sensitive to the assumptions made in interpreting them, and many factors complicate their use. For example, because bibliometrics are focused on citation data from journal articles, they are less relevant in disciplines that rely less heavily on journal publishing, such as the arts and humanities, or computer science and engineering.
A number of initiatives have been put forward to guide the responsible use of bibliometrics, several of which have been used to develop the draft UCL policy.
- DORA (San Francisco Declaration on Research Assessment)
The San Francisco Declaration on Research Assessment (DORA) was created in 2012 by the American Society for Cell Biology (ASCB) and a group of editors and publishers of scholarly journals. The declaration recognises the need to improve the ways in which the outputs of scientific research are evaluated, including placing less reliance on the Journal Impact Factor as a measure. Individuals and organisations concerned about the appropriate assessment of scientific research are encouraged to sign the declaration. UCL was one of the first UK universities to sign DORA.
- Leiden Manifesto for Research Metrics
The Leiden Manifesto for research metrics, published in Nature in 2015 by a number of bibliometric specialists, sets out ten principles for the responsible use of metrics. It goes beyond the initial focus of DORA on journal-level metrics to address broader themes, such as the need to protect locally significant research, avoid false precision, and to account for the systemic effects of using specific metrics. Many institutional metrics policies, including that of UCL, draw heavily on the principles in the manifesto.
- HEFCE Metric Tide report
The Metric Tide report was produced by an expert group for HEFCE (now Research England) in 2015. The group was established to investigate the roles that quantitative indicators can play in the assessment and management of research, particularly in the context of the REF. The report concluded that although it is not feasible to assess the quality or impact of research outputs using quantitative indicators alone, the approach used in REF2014 of using quantitative data to complement peer/expert review should be continued and enhanced in future assessments.
Alongside this, the report established the concept of "responsible metrics" to define the appropriate uses of quantitative indicators in the governance, management and assessment of research, setting out five key aspects of responsible use: robustness, humility, transparency, diversity, and reflexivity.
- Snowball Metrics
Snowball Metrics is an international initiative in which research-intensive universities from around the globe agree on shared methodologies for metrics, enabling confident comparisons between institutions.
Bibliometrics and the REF
Research outputs submitted for REF2014 were assessed by expert review sub-panels, who could use citation data to inform their peer review judgements. Citation data were requested by around 30% of the sub-panels, mainly in the life or physical sciences, primarily to inform borderline cases. The Metric Tide report reviewed this practice and recommended that in future assessments quantitative data should have a place "in informing peer review judgements". A subsequent independent review, the Stern report (Research Excellence Framework (REF) review: Building on success and learning from experience, 2016), recommended that "Panels should continue to assess on the basis of peer review. However, metrics should be provided to support panel members in their assessment, and panels should be transparent about their use."
Research England subsequently confirmed that for REF 2021 some sub-panels would be provided with citation indicators for submissions, to be used to inform but not dictate the assessment of outputs, in line with these recommendations (see paras 274-82 of REF 2019/2).