UCL has developed a policy on the responsible use of bibliometrics.
UCL policy consultation
UCL has developed a policy to govern the responsible use of bibliometrics. The policy outlines principles for the use of bibliometrics at UCL, setting out that, while the use of metrics is not mandatory, where they are used they must meet certain requirements. For example, the impact factor of journals should never be used to assess individual publications.
Detailed information about the policy, and guidance to give practical advice on putting these principles into practice, is available.
The policy was developed through a consultation process in 2018-19. It was presented and discussed at the Bibliometrics Town Hall meeting in December 2018, followed by consultation with academic departments through early 2019. From June to September 2019, we held a wider consultation on the proposed policy, to which all UCL staff were invited to contribute. The consultation showed broad support for the proposal, but identified several points which required further revision. An overview report on the consultation is available, outlining the responses and the changes made as part of the consultation, and a detailed case study of the process was produced for the DORA community.
The draft policy and guidance were approved by Academic Committee as a UCL policy in early 2020.
The wider context
Bibliometrics are becoming a common way to analyse and assess research, but they are frequently used inappropriately or misleadingly, often with good intentions. Bibliometric data are highly sensitive to the assumptions made in interpreting them, and many factors complicate their use - for example, because they focus on citation data from journal articles, they are less relevant in disciplines that rely less on journal publishing, such as the arts and humanities, or computer science and engineering.
A number of initiatives have been put forward to guide the responsible use of bibliometrics, several of which informed the development of the UCL policy.
- DORA (San Francisco Declaration on Research Assessment)
The San Francisco Declaration on Research Assessment (DORA) was created in 2012 by the American Society for Cell Biology (ASCB) and a group of editors and publishers of scholarly journals. The declaration recognises the need to improve the ways in which the outputs of scientific research are evaluated, including placing less reliance on the Journal Impact Factor as a measure. Individuals and organisations concerned about the appropriate assessment of scientific research are encouraged to sign the declaration. UCL was one of the first UK universities to sign DORA.
- Leiden Manifesto for Research Metrics
The Leiden Manifesto for research metrics, published in Nature in 2015 by a number of bibliometric specialists, sets out ten principles for the responsible use of metrics. It goes beyond the initial focus of DORA on journal-level metrics to address broader themes, such as the need to protect locally significant research, avoid false precision, and to account for the systemic effects of using specific metrics. Many institutional metrics policies, including that of UCL, draw heavily on the principles in the manifesto.
- HEFCE Metric Tide report
The Metric Tide report was produced by an expert group for HEFCE (now Research England) in 2015. The group was set up to investigate the roles that quantitative indicators can play in the assessment and management of research, particularly in the context of the REF. The report concluded that although it is not feasible to assess the quality or impact of research outputs using quantitative indicators alone, the approach used in REF2014 of using quantitative data to complement peer/expert review should be continued and enhanced in future assessments (see below).
Alongside this, it established the concept of "responsible metrics" to define the appropriate uses of quantitative indicators for the governance, management and assessment of research, setting out five key aspects of responsible use (robustness, humility, transparency, diversity, and reflexivity).
- Snowball Metrics
Snowball Metrics is an international initiative in which research-intensive universities from around the globe agree on methodologies for metrics, enabling confident comparisons between institutions.
- Bibliometrics in the REF
Research outputs submitted to REF2014 were assessed by expert review sub-panels, which could use citation data to inform their peer review judgements. Citation data were requested by around 30% of the sub-panels, mainly in the life and physical sciences, primarily to inform borderline cases. The Metric Tide report reviewed this practice and recommended that in future quantitative data should have a place "in informing peer review judgements". A subsequent independent review, the Stern report (Research Excellence Framework (REF) review: Building on success and learning from experience, 2016), recommended that "Panels should continue to assess on the basis of peer review. However, metrics should be provided to support panel members in their assessment, and panels should be transparent about their use."
Research England subsequently confirmed that for the 2021 REF some sub-panels would be provided with citation indicators for submissions, which would be used to inform but not dictate assessment of papers, in line with these recommendations. (see paras 274-82, REF 2019/2)
There is also a growing awareness among institutions of the need for a responsible metrics policy, and UCL is not alone in developing one. As of May 2019, around a dozen British universities had made some form of clear policy statement on their use of metrics.