Results from the 2021 Responsible Metrics State of the Art Survey

The results of the seventh annual Responsible Metrics State of the Art survey are in! With a little delay to our usual publishing schedule, please find below a summary of the 2021 results and some reflection on trends since the survey was first run by the LIS-Bibliometrics community in 2015. Prior years' survey results are also available.

Response demographics

A total of 91 responses to the 2021 survey were received. This is fewer than in 2020 (139 respondents), which was itself a substantial decline from 2019 (218 respondents). Whilst we don't know the reason for this decline (pandemic workloads? survey fatigue?), we are still very thankful to those who did respond, and we hope many will find these insights useful.

The range of countries represented in the responses shows that 73% of respondents reported working outside of the UK.

Figure 1 Distribution of respondents by reported country of work

86% of respondents work in the university sector, with the minority of responses split between publishers, private research organisations, non-profits, companies and hospitals. Within the university group, it is interesting to see the split between reported departments: Research Office (26%); Library (21%); University Management (20%); Academic Staff (17%); and smaller numbers spanning HR, union roles, planning and quality assurance. This range of departments and role types suggests a diversity in university approaches towards implementing responsible metrics, as well as the way that these issues cross academic, information services and people services in organisations.

DORA

The share of respondents who report that they have already signed or are likely to sign DORA continues to increase year on year: from 30% in 2019 and 32% in 2020 to 40% in 2021. Contextual factors around this might include the Wellcome Trust explicitly naming DORA in 2018 as a route for funded organisations to comply with its open access policy, and the explicit naming of DORA by cOAlition S. Of the 17% of respondents who reported actively considering but not yet having decided, comments regarding consideration of “disciplinary perspective(s)” and potentially signing DORA at a “‘unit’ level rather than a university level” indicate potentially different perspectives on DORA across faculty or school lines.

Figure 2 Responses to DORA
Figure 3 Adoption of DORA 2015 – 2021

Although it was a small cohort of responses (7), 100% of respondents who classified themselves as publishers or companies reported that they had either already signed or were likely to sign DORA. DORA has five principles for publishers and four for organisations that supply metrics (and not all of the company respondents might be this type of company), compared to only two principles for research-performing organisations. It is interesting to consider whether the pressures and motivations for publishers and metrics organisations to meaningfully engage with DORA are different from those of research organisations, and to consider the position of the vendors in this sector that choose not to.

Development of institutional principles

Organisations can also develop their own set of responsible metrics principles, and the percentage of respondents reporting that they have created or are developing their own principles has increased consistently, rising to 44% of respondents in 2021. Organisations may also both be signatories of DORA and develop their own principles.

Figure 4 Development of institutional principles 2015-20

Institutions that have developed or are in the process of developing their own principles were asked whether their policies were based on existing documentation. Organisations may have based their own principles on more than one existing document, and the mixture of guidance that many reported suggests organisations have a rich mix of inspiration and experience to draw from. Looking at 2021’s responses in detail, we see the continued popularity of the Leiden Manifesto in influencing other organisations’ principles: it was cited as a basis by 71% of respondents in 2021, up from 68% in 2020 and 42% in 2019. DORA is also a substantial reference point, cited by 68% of respondents as an influence on their institutional principles in 2021, and its share has increased consistently year on year since 2016. Bearing in mind the limitations of this survey and its international respondent base, it was interesting to note the decline from 2018 to 2021 in respondents citing The Metric Tide as an influence – possibly the forthcoming Metric Tide Revisited will prompt change in this direction. A new influence for responsible metrics principles is the INORMS SCOPE Framework, with 10% of respondents citing it as an influence in 2021, two years after its introduction in 2019.

Figure 5 Cited influences on institutional principles
Figure 6 Cited influences on institutional principles 2016-2021

Changes effected by statements or policies

Respondents were invited to describe in free text how they perceive that a responsible metrics policy or statement has affected the use of metrics in practice at their institution.

Of the 34 respondents to this question, five reported that principles had not yet affected metrics practice at their institution, with one commenting “we’re only just beginning”, and three describing that reviews or statements were in progress, with action to follow. This could be an indicator of the slowness of this kind of institutional culture and process change, with some respondents commenting that “progressively they (the principles) are being incorporated into different procedures” and noting “slow adoption”.

The remaining usable responses to this question, however, all point to tangible change in metrics use at the institution. 24 respondents described positive responsible metrics use:

  • “(We) have productive conversations with senior academics about the appropriateness of research assessment activities… top level improvements”
  • “No use of Impact Factors”
  • “No reports are issued without metrics advice and guidance, requests for Impact Factor or Cite Score are politely refused and alternatives offered. Lots of education on metrics in courses offered to staff and postgraduate students”

A further four institutions stated that the changes have directly affected actual hiring, appraisal or promotion processes, with an example from one response: “(the responsible metrics policy) governs all of metrics usage for KPIs (Key Performance Indicators), appraisal, management information, recruitment and promotion”.

Response by academic staff

The survey asked respondents how, in their opinion, responsible metrics principles have been received by the academic community. Remembering that, as stated above, 17% of respondents to the survey declared themselves members of academic staff, these responses should be read in the context that the majority of respondents are not academic staff; they may work closely with academics, or may not work directly with them at all.

Figure 7 Perception of academic response to principles

Almost 40% of respondents perceive the academic response to responsible metrics principles as ‘mixed’, with comments referring to the difficulty of changing culture and adherence to the Journal Impact Factor: “they understand it (the policy) but it’s going to take a long time to change culture and move away from the Impact Factor” and “unease regarding having to change quite embedded practices such as the Impact Factor”. More than one respondent raised the persistent use of journal lists in Business/Management faculties.

A note of caution for anybody involved in implementing a responsible metrics policy is raised by one comment that negative responses had been, in the respondent’s opinion, “exacerbated by incorrect use of research metrics in redundancies”, and by another comment describing negative responses to the responsible metrics policy due to an incorrect understanding of its relationship to REF scoring.

Positive (29%) and neutral (25%) responses suggest acceptance, understanding and engagement, in the form of support for the policy, attendance at responsible metrics education offerings, and engagement with bibliometrics or library staff on these issues.

Monitoring compliance

27 usable responses were returned for the question ‘How do you monitor adherence to your policy?’, with 17 broadly defined as ‘No’ or ‘Not currently’, one of which stated “We are working on this currently but it is very hard to police”.

The remaining 10 responses indicate the presence of some kind of compliance monitoring. These range from “a centrally managed research reporting mechanism” to multiple mentions of “a whistle blowing process”, “an inbox”, and “staff named to be responsible for monitoring the policy”; examples of named staff from different respondents include a member of Library Services responsible for research publications, Research Office staff, and two members of research committees.

Other responses indicate a more informal, shared sense of responsibility, describing that it is “self-policed” or that “we hold each other accountable”.

The relatively small proportion of overall respondents who answered this question indicates that this is perhaps an area that institutions are less willing or able to share information on. It is certainly a topic that, from my anecdotal experience, is much less openly discussed in the community, for example online or at conferences, in contrast to how much the development or application of responsible metrics principles is shared.

Data sources and tools

The data sources that respondent institutions reported using to derive metrics show the sustained popularity of Scopus (23%) and Web of Science (20%) seen in prior years’ reporting, as well as institutional platforms (the CRIS and institutional repository) and Google Scholar.

Figure 8 Data sources used for accessing data for metrics

Respondents were also asked which tools they use, showing a wide variety of tool types and analysis styles in use across the sector. The popularity of tools such as SciVal, Altmetric and InCites shows a preference for tools that do not need coding skills to retrieve data and that can create indicators with minimal construction and maintenance. SQL, VOSViewer and Bibliometrix/Biblioshiny demonstrate the expertise and resourcing in the bibliometric sector for data science and programming skills to create custom solutions. The presence of Google Big Query and PowerBI is particularly interesting, as neither appeared in the 2020 survey results; their increasing use suggests that, alongside Tableau, institutions are increasingly interested in creating their own business intelligence and analytics dashboards, perhaps for self-service access to data, connected to institutional data warehouses and merging together multiple data sources (a sketch of this kind of merge follows below).

Figure 9 Tools used to construct metrics
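To make the dashboard idea a little more concrete, here is a minimal, hypothetical Python sketch of the kind of multi-source merge such dashboards rely on. It assumes pandas and two illustrative CSV exports (a CRIS export and a Scopus export, with invented file and column names, not drawn from any respondent), joined on DOI to produce a single publication list that could then be loaded into a tool such as PowerBI or Tableau.

```python
# Hypothetical sketch: merging a CRIS export with a Scopus export on DOI.
# File names and column names are illustrative assumptions, not a real institution's schema.
import pandas as pd

# Load the two exports (assumed CSVs, each containing a DOI column).
cris = pd.read_csv("cris_export.csv")      # e.g. columns: doi, title, department
scopus = pd.read_csv("scopus_export.csv")  # e.g. columns: DOI, citations, year

# Normalise DOIs so the join keys match (strip whitespace, lower case).
cris["doi"] = cris["doi"].str.strip().str.lower()
scopus["doi"] = scopus["DOI"].str.strip().str.lower()

# Left-join: keep every CRIS record, attach Scopus data where a DOI matches.
merged = cris.merge(scopus[["doi", "citations", "year"]], on="doi", how="left")

# Deduplicate on DOI and write a dashboard-ready file.
merged = merged.drop_duplicates(subset="doi")
merged.to_csv("publications_dashboard.csv", index=False)
```

In practice the same join could be done directly in the dashboard tool or the data warehouse; the point of the sketch is simply that a shared key (here the DOI) is what allows institutional and external sources to be combined for self-service reporting.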

Conclusions

Two really positive continued trends from 2015 to 2021 are the increased share of respondents stating that their institution has created its own responsible metrics policy (40% of respondents in 2021) and the increased share stating that it has already signed or will commit to DORA (40% of respondents in 2021). This, in my interpretation, suggests that there is drive in institutions to meaningfully construct their own value-led evaluation approaches as well as to engage with and seek strength from the global DORA programme. The intersection with other frameworks such as the Leiden Manifesto, and adaptation from other institutions’ policies, also suggests a culture of flexibility and sharing.

Some of the impacts of responsible metrics policies on institutional processes cited by respondents are quite remarkable, with a few individuals describing positive, tangible change having an effect on HR and assessment procedures. Some, of course, also admitted the slowness of these changes and the persistent attachment to the Impact Factor in evaluation. Areas where there is perhaps not yet action to report, or an unwillingness to report, include compliance mechanisms. This may be connected to the 25% of respondents who reported that the academic response to responsible metrics policies was ‘neutral’, suggesting that, actually, this is difficult to capture and categorise.

The declining number of respondents to this survey from 2019 to 2021 is a cause for the authors to contemplate whether this is still a useful resource created from the voluntary time of the LIS-Bibliometrics Committee. If you are a reader and feel either way, please let us know! Comments, as ever, are open, as is the listserv and our Twitter account.

Contributions: 2021 survey constructed and distributed by Nicolas Robinson Garcia, based on historic surveys constructed by Lizzie Gadd; with 2021 analysis and writing by Robyn Price.

Robyn Price (@robyn_price_) established a responsible bibliometric analysis and education service at Imperial College London. Robyn is currently Co-Chair of LIS-Bibliometrics. She is interested in open and equitable research culture. Previously, Robyn worked in the editorial teams of open access and subscription journals.

Unless it states otherwise, the content of the Bibliomagician is licensed under a Creative Commons Attribution 4.0 International License.
