Results from the 2019 Responsible Metrics State of the Art Survey

By Nicolas Robinson-Garcia and Lizzie Gadd

Since 2015, the LIS-Bibliometrics Committee has run annual Responsible Metrics State-of-the-Art surveys in which we analyse the penetration of responsible metrics awareness and implementation in universities. Every year, the number of participants and the number of countries involved has increased: last year's survey reported a total of 115 respondents. This year we have again seen a notable increase, with 218 respondents from 42 different countries and 72% of respondents coming from outside the UK. This increase is partly due to the fact that, alongside the LIS-Bibliometrics, ARMA Metrics SIG and INORMS Research Evaluation Working Group lists, the survey was also advertised on the Spanish-speaking lists IWETEL (a Spanish mailing list for information professionals) and INCYT (Indicadores de Ciencia y Tecnología, Science and Technology Indicators). This has resulted in a high number of Spanish and South American participants, which allows us, for the first time, to compare the UK with other countries. We plan to look at this in a separate blog post. But first let's have a general overview of what the respondents indicated about the inclusion of responsible metrics practices in their institutions.





Figure 1. Distribution of respondents by country of origin

DORA signatories

Regarding institutions' adherence to the DORA declaration, around 45% of our respondents indicated that their institutions had not yet considered signing DORA, 23% had already signed it, almost 30% were considering signing it, and 12% had decided against it. One respondent in the latter group pointed to indifference from university managers, stating that "I pitched it to a leader who told me, don't hold my breath". Surprisingly, four respondents indicated they were not familiar with DORA. When you compare these results to previous years (figure 3), you can see that the trajectories observed from 2017 to 2018 have continued almost exactly into 2019.

Which of the following would describe your institution the best?

Figure 2. 2019 responses regarding adherence to DORA
Figure 3. Adoption of DORA over time

Bespoke principles

Roughly half of the respondents (52%) declared that their institutions have developed, or are considering developing, their own set of research metrics principles, but only 16% have already developed them (figures 4a and 4b). These figures are very similar to those for signing DORA. Of course, some institutions both sign DORA and develop their own principles, so the two may grow alongside each other to an extent. Interestingly, this year we saw an increase in the proportion of institutions that considered developing their own principles and decided against it (figure 5).

In developing their own principles, most institutions have based them, or are considering basing them, on more than one set of existing statements or principles. The most influential is DORA, followed closely by other universities' principles and the Leiden Manifesto. When you compare this data to previous years (figure 6), you can see that the influence of DORA on the development of bespoke principles has grown, whilst the influence of the Leiden Manifesto and The Metric Tide seems to have decreased. However, this may well be due to the broader geographical demographic responding to this year's survey.

A) Which of the following would describe your institution the best?

Figure 4a. Responses regarding the setup of responsible metrics principles

B) Have you based your principles in any other documentation?

Figure 4b. Responses regarding the influence of other documentation in the development of responsible metric principles
Figure 5. Development of bespoke principles over time
Figure 6. Influence of other documentation on the development of bespoke principles over time

Who’s involved?

We also asked our respondents who was involved in the development of the responsible metrics principles. As figure 7 shows, there are many different models of institutional organisation, indicating a wide range of functions involved in research evaluation. While University Research Offices are the most common group involved, many universities include their academic staff in the development of the principles. Libraries and senior university managers also play an important role, with some respondents indicating that the setup of the principles was an executive decision made by the university board. Figure 8 shows how the involvement of different parties has changed over time. The most notable change is the reduced involvement of senior managers over the past few years. This is something of a concern and undoubtedly has an influence on the subsequent impact of the policy. It was interesting to see that the proportion of library staff involved had also reduced, perhaps indicating that this is beginning to be seen as more of a research policy decision than a publication support activity. It was good to see the proportion of academic staff remaining steady.

Who has been involved in the development?

Figure 7. Functions involved in the development of responsible metrics principles
Figure 8. Functions involved in the development of responsible metrics principles over time

Perceptions, monitoring and reception of responsible metrics principles

The questionnaire included three open questions regarding the changes that the introduction of responsible metrics principles has brought to daily practice, whether adherence is being monitored, and how the principles have been received by the scientific community.

Impact of policy on practice
There were 45 usable responses to our question as to how the institution’s policy “has affected their use of metrics in practice”. It was fascinating to see the range of interpretations. At one end of the scale, one institution said they no longer used metrics at all and four had banned certain metrics. Some were using metrics only in line with their policy (5), or with peer review (3). At the other end of the scale, some were developing new metrics (2) and using even more than they originally were (1). Of course, all these responses are entirely legitimate, depending on the institutional context.

Other institutions in response to this question talked about assessing a wider range of outputs now (2), introducing a new academic promotion framework and developing training for those running and interpreting metric-based analyses.  An unexpected, but telling, finding was the reference by three respondents to the fact that their policy enabled them to influence senior managers’ metric requests.  One wrote that their policy helps “mainly in terms of how we treat requests from senior figures for metrics data. We’re able to point to the policy when we tell them that what they want isn’t really in the spirit of responsible use of metrics”.

Figure 9. How has the policy affected the use of metrics in practice? Responses focussed only on metrics use.

Monitoring the policy

There were 39 usable responses to the question asking respondents how they monitored adherence to their responsible metrics policy. Of these, thirteen said they did not monitor adherence and nine said it was too early to say. However, an interesting range of approaches was adopted by the other seventeen organisations. The most common approach (5) was to give responsibility for monitoring to some form of group. This could be a high-level group such as a Research Committee, or a dedicated Working Group supported by a dedicated responsible metrics post. One institution had set up a 'University-wide Expertise Network in Research Analytics' where they shared best practice. Two institutions monitored adherence by limiting the running of analyses to a small controlled group. Two others had a whistle-blowing facility. Three relied on self-policing and a further two ran formal policy reviews and reports.

Although there is some way to go across the sector in terms of engagement with monitoring activities, this represents a considerable step forward from last year where only five respondents had any detail to offer in response to this question.

How has the policy been received by academics?

A final qualitative question asked respondents how their academic community were responding to the introduction of the responsible metrics policy (figure 10).  Forty-six gave a usable answer to this question.  Whilst nine said it was too early to say, a large proportion (18) said their academics had responded positively with comments such as “very much appreciated” or “quietly positive”.  However, the same proportion (18) said their academic communities were more neutral about it.  Some described the response as “mixed” with comments like:

part of the community does not see anything controversial in the principles, others welcome them, others think they do not go far enough, and yet others believe they may encourage the use of metrics where they are currently not being used.

Another pointed out, "It is complicated, when the national entity of the country values other criteria."

Only three spoke of negative responses in their communities, and these were mainly borne of scepticism, “particularly surrounding ‘ideals’ and ‘reality’”.

Figure 10. The response of the academic community to the policy

Conclusions

This year's survey documents a few interesting shifts in the sector's approach to responsible metrics. Firstly, we are seeing a broader range of countries responding to the survey which, it is to be hoped, reflects a wider engagement with this important agenda. We are seeing another huge leap in the number of institutions actively developing, or having already adopted, a formal responsible metrics approach. We are still seeing DORA and bespoke approaches growing up together, although the influence of DORA on bespoke approaches seems to be growing. And we are seeing a perhaps slightly worrying trend towards the reduced involvement of senior managers in the development of responsible metrics principles. This may account for the three institutions reporting, for the first time, how their policy helps them to influence senior managers upwards into making more responsible decisions.

I think the most pleasing outcome of this year’s survey is the proportion of respondents having positive stories to tell about the impact of their policies on the use of metrics in their institutions.  Yes, these responses may range from banning metrics to using even more metrics, but the point is that this is being done in a considered and controlled way. The generally positive response from academic staff is also to be welcomed.  However, there are things we need to keep an eye on as a sector.  Relatively few respondents had any stories to tell as to how their policy was to be monitored, which may leave us questioning whether the policy has any teeth.  There is also growing awareness that whilst institutional policies might be a good thing, their influence is limited in a broader national and international environment which doesn’t evaluate responsibly. We have to remember that institutions cannot single-handedly hold off the ‘metric tide’, and if this movement is to be successful, we need engagement from all participants in the research evaluation ‘food chain’.   

Nicolas Robinson-Garcia is a social scientist specialized in bibliometrics and research evaluation. He is currently a Marie Curie fellow at the department of Applied Mathematics (DIAM) at TU Delft under the Leading Fellows programme. He is also a member of the Steering Committee of the European Summer School for Scientometrics. He has published over 40 articles and book chapters in the field of bibliometrics and research evaluation on topics such as scientific mobility, altmetrics, and social sciences and humanities research assessment, among others.

Elizabeth Gadd is the Research Policy Manager (Publications) at Loughborough University. She is the chair of the LIS-Bibliometrics Forum and co-champions the ARMA Research Evaluation Special Interest Group. She also chairs the INORMS International Research Evaluation Working Group.


 
Unless it states otherwise, the content of the Bibliomagician is licensed under a Creative Commons Attribution 4.0 International License.
