Responsible metrics at Kent – so what?

We have, as an institution, signed the San Francisco Declaration on Research Assessment and adopted the Leiden Manifesto – what do I need to do?

Both of these initiatives come from groups of researchers who share concerns about the way research outputs are evaluated by funders, academic publishers, institutions and other parties. While appreciating the value of quantitative research metrics, they are concerned about their inappropriate use in decision making, for example in the allocation of funds and in academic career progression.

The Leiden Manifesto

The Leiden Manifesto, presented in Nature in 2015 (Nature 520, 429–431, 23 April 2015; DOI: 10.1038/520429a), brings together accepted but disparate principles of good practice in research evaluation. The manifesto represents the “distillation of best practice in metrics-based research assessment so that researchers can hold evaluators to account and evaluators can hold their indicators to account”.

1) Quantitative evaluation should support qualitative, expert assessment.

We recognise the value of quantitative indicators to support qualitative, expert peer review. Indicators may be used in a variety of processes, but they will not supplant expert assessment of both research outputs and their environment.

2) Measure performance against the research missions of the institution, group or researcher.

While research is one key strand of the University Plan, we recognise that many research outputs are aimed at a specific, and often non-academic, audience. We celebrate the value of research outputs in public engagement, creating impact, and disseminating research findings to research users. We commit to considering research outputs within the context of the centre, school, faculty, and University research environment and the original goals of the researcher.

3) Protect excellence in locally relevant research.

Many citation-counting tools and other quantitative indicators are inherently skewed towards English-language publications. We celebrate the diversity of our researchers at Kent and the international nature of much of our research, and we encourage publication in the language appropriate for the research users, whether academic, practitioner or the general public.

4) Keep data collection and analytical processes open, transparent and simple.

Where quantitative research quality indicators are used, we will aim for a balance between simplicity and accuracy. We will aim to use tools with published calculation methods and rules. A list of relevant measures, with their advantages, disadvantages and potential uses, is provided as part of the Metrics Toolkit.

5) Allow those evaluated to verify data and analysis.

We encourage researchers to question the indicators used in relation to their research outputs, and to be empowered, with both the necessary understanding and appropriate processes, to offer alternative positions. Researchers should register for an ORCID iD to ensure consistent, reliable attribution of their work, and should work with the Research Excellence Team and Office of Scholarly Communication to ensure their details in KAR and external databases, such as Scopus, are accurate.

6) Account for variation by field in publication and citation practices.

We recognise that scholarly communication methods vary widely between disciplines, and that quantitative metrics work better for some forms of research output than others. We also recognise that the time frames involved vary according to both output and discipline, and that there are often significant variations within fields, especially for research at the intersection of faculties. We will not promote the use of one measure over another, and the availability or otherwise of bibliometric or other data will not drive our decision making about research activities and priorities.

7) Base assessment of individual researchers on a qualitative judgement of their portfolio.

Research quality indicators are affected by career stage, gender, and discipline, and we will take these factors into account when interpreting metrics. We recognise that academic staff undertake a wide range of research communication activities, not all of which can be easily measured or benchmarked. When assessing the performance of individuals, consideration will be given to as wide a view of their expertise, experience, activities and influence as possible.

8) Avoid misplaced concreteness and false precision.

Quantitative research metrics will be used as guides at Kent, not as decisive measures of the quality of a research output. Multiple sources will be used to provide a broader and more robust picture, taking into account the wide variations in the quality and scope of the data. We will establish the context for the metrics used, highlighting the factors taken into account, and we will avoid using over-precise numbers that give an illusion of accuracy.

9) Recognize the systemic effects of assessment and indicators.

In order to account for the inevitable incentives established by measuring particular aspects of research, a range of indicators will be used, and we will aim to be transparent about the biases associated with particular sources. We will encourage researchers to use available tools and guidance relating to indicators, with their advantages, disadvantages and potential uses, such as the Metrics Toolkit.

10) Scrutinize indicators regularly and update them.

As the range and appropriateness of quantitative research indicators evolve, the ones we use will be revisited and revised.

The San Francisco Declaration on Research Assessment (DORA)

The San Francisco Declaration on Research Assessment (DORA) is a worldwide initiative involving all key stakeholders in the research process in finding ways to improve the evaluation of scholarly research outputs. DORA puts forward a range of practices which endorse the principles of the Leiden Manifesto. DORA’s recommended practices are grouped by the key communities involved in producing and assessing research outputs.

The particular recommendations for researchers are:

15. When involved in committees making decisions about funding, hiring, tenure, or promotion, make assessments based on research content rather than publication metrics.

16. Wherever appropriate, cite primary literature in which observations are first reported rather than reviews in order to give credit where credit is due.

17. Use a range of article metrics and indicators on personal/supporting statements, as evidence of the impact of individual published articles and other research outputs.

18. Challenge research assessment practices that rely inappropriately on Journal Impact Factors and promote and teach best practice that focuses on the value and influence of specific research outputs.

What do I do…?

Generally:

Attend training

Over the next few months and years, we will be providing training on responsible research indicators. We are looking at structuring these sessions with some training specific to discipline, some to career stage, and some to roles (such as promotions committee member or REF co-ordinator), as well as some general training. Please do come to these sessions – we are keen that researchers at Kent are empowered, with both the necessary understanding and appropriate information and processes, to use quantitative research indicators appropriately and to offer alternative positions on the indicators used in relation to their work.

Let us know

Where you find practice at Kent that contravenes the principles contained in the Leiden Manifesto or DORA, please contact us – we are keen to provide a route for researchers and professional services staff to report policies, procedures and behaviours so that we can prioritise reviewing them.

Single authoritative source

When you are sharing your research, where possible, consistently link to the record of the work on the publisher’s site – it helps to point all readers to the same source to download the output, keeping the figures concentrated in one place.
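
Where the output has a DOI, the https://doi.org/ form of the link is a natural single source, since it resolves to the publisher’s record. As a minimal illustrative sketch (assuming the Crossref REST API, which indexes most but not all DOIs; the DOI shown is the Leiden Manifesto paper cited above), you can confirm the title and persistent link before sharing:

    import json
    import urllib.request

    doi = "10.1038/520429a"  # example: the Leiden Manifesto paper cited above
    url = f"https://api.crossref.org/works/{doi}"

    # Fetch the Crossref metadata record for this DOI.
    with urllib.request.urlopen(url) as response:
        record = json.load(response)["message"]

    print("Title:   ", record["title"][0])
    print("Share as:", record["URL"])  # the persistent doi.org link for the output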

Tools such as Kudos can also help you track your engagement and dissemination, both in open forms of communication (such as media or policy reports) and through measuring clicks on links from mailing lists or emails.

Make sure your data are correct!

Check KAR to ensure all your research works are recorded there correctly. Register for and use an ORCID iD to ensure consistent, reliable attribution of your work, and work with the Research Excellence Team to ensure your details in Scopus are accurate – this is particularly important if you have recently changed name or institution.
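
If you want to review your ORCID record programmatically, here is a minimal sketch, assuming the public read-only ORCID API at pub.orcid.org (the iD shown is ORCID’s example record, not a real researcher – substitute your own). It lists the year and title of each work attached to an iD, so you can spot missing or misattributed items:

    import json
    import urllib.request

    # Placeholder ORCID iD (ORCID's example record) – substitute your own.
    orcid_id = "0000-0002-1825-0097"
    url = f"https://pub.orcid.org/v3.0/{orcid_id}/works"
    req = urllib.request.Request(url, headers={"Accept": "application/json"})

    with urllib.request.urlopen(req) as response:
        groups = json.load(response)["group"]

    # Each group collects summaries of the same work from different sources;
    # the first summary is enough to list year and title.
    for group in groups:
        summary = group["work-summary"][0]
        title = summary["title"]["title"]["value"]
        date = summary.get("publication-date") or {}
        year = (date.get("year") or {}).get("value", "n.d.")
        print(f"{year}: {title}")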
