The University of Dundee has signed and expressed its commitment to DORA, a set of principles designed to ensure that research assessment is undertaken in a responsible, fair and transparent manner.
Read the University of Dundee statement here.
What do you need to know about DORA?
The vision of DORA is to “advance practical and robust approaches to research assessment globally and across all scholarly disciplines”. This approach places value on all forms of scholarly outputs and activities, research outcomes, impact, and on the individual researchers themselves by focussing on the merits of the work and reducing reliance upon quantitative methods of assessment such as Journal Impact Factors (JIFs).
To be clear, DORA does not say that JIFs should never be used. They should not, however, be used in isolation or as a shortcut to assessing either the research itself or the researcher, and they certainly should not be used as a quick fix during hiring, promotion, or funding decisions. Instead, they should form only a small part of a broad and responsible approach to assessment.
Consider these points:
- JIFs were originally created to help librarians decide which journals to purchase, not as a scientific measure for assessment, nor as an indication of the quality of individual articles.
- JIFs are flawed, with imbalances caused by the types and quantity of content in each journal.
- JIFs do not accurately reflect differences in subject matter and discipline and should not be used to compare journals from different topic areas.
- The time frame used to calculate JIFs (two to five years) does not present an accurate picture of overall citations, particularly those garnered over a longer period.
- JIFs can be gamed.
Responsible Use of Metrics and our approach to assessment:
Beyond reducing reliance upon JIFs, a responsible approach to the use of metrics covers a range of principles intended to ensure transparency and equality. Broadly speaking, these principles mean that we should be transparent about which metrics are used in each assessment process; metrics should not replace qualitative expert assessment; no single metric should be used in isolation; assessment should be carried out by a diverse panel of experts; different types of outputs should be considered; and the metrics that are used should be regularly re-assessed to make sure they remain fit for purpose.
To put this into practice:
- Record all of your research outputs and activities in Discovery, including Datasets, Public Engagement, and additional roles that relate to your research.
- Seek advice from the University Research Impact Manager about the best approach to recording evidence of your impact.
- Engage with new methods of research dissemination and open research practices to extend the reach of your work to as wide an audience as possible.
- Consider the use of CRediT, a taxonomy designed to ensure credit is given in research publications to all types of contributors.
- Maintain an awareness of good practice with regard to the responsible use of metrics. More information can be found in the summary of the 2015 report, The Metric Tide.