Responsible metrics: where it’s at?

Lizzie Gadd reflects on two recent responsible metrics events: Lis-Bibliometrics' "Responsible Bibliometrics in Practice" and HEFCE/UUK's "Turning the Tide: Changing the Culture of Responsible Metrics", both held within ten days of each other in London.

Responsible metrics events are like buses, it would seem.  Nothing for nine months, and then three within the space of ten days.  I managed to get to two of them (alas, the HESPA responsible metrics workshop clashed – fancy, two buses on the same day).  The first I attended, on 30 January, was the Lis-Bibliometrics Responsible Bibliometrics in Practice event.  The second, on 8 February, was UUK and HEFCE's Turning the Tide: Changing the Culture of Responsible Metrics.  David Price, outgoing Chair of the Forum for Responsible Metrics, provided the keynote at the former, and David Sweeney opened the latter, with the highlight of the show being Paul Ayris delivering the findings of the Forum for Responsible Metrics' survey on how the UK HE sector is engaging with this agenda.

Abacus – Scott Kidder. CC-BY-NC-SA

It’s not my purpose to provide a blow-by-blow account of who said what, as that would be dull for both of us.  However, I did want to offer my reflections as to where I think we’re at and where, in my unsolicited opinion, we might go next.

1 UK HE institutions are at an early stage with this – still.

If the Forum’s survey showed us anything, it was that UK HE is largely yet to engage with frameworks for the responsible use of metrics.  I don’t want to steal their thunder, as they plan to publish the results of their survey on their website.  However, 75 of the 96 HE respondents (78%) said they did not have a research metrics policy, and in the Forum’s opinion only four respondents had undertaken a “comprehensive set of actions” towards implementing responsible research metrics in their institutions.  The Lis-Bibliometrics event showed a similar trend.  When asked to cluster into ‘Birds of a Feather’ groups, those who were just ‘packing their bags’ to embark on a responsible metrics journey formed by far the largest group.  Having said all that, there is clearly a huge amount of interest in this.  Both events were fully booked with waiting lists, and I understand the Turning the Tide event sold out in 24 hours.

2 When people talk about responsible metrics, they are mainly talking about bibliometrics.

The Lis-Bibliometrics event was, unsurprisingly, focussed on responsible bibliometrics.  So I was keen to see how conversations would differ when discussing the broader theme of responsible research metrics at the UUK/HEFCE event.  Actually, the conversations hardly differed at all – particularly when the early career researchers took the stage.  There was very little discussion about responsible grant income targets, or about TEF or NSS.  The main non-biblio metrics that got an airing were the world university rankings, but of course publication and citation metrics play a considerable part in those.  This was particularly interesting to me, as Loughborough deliberately focused on responsible bibliometrics, building on the Leiden Manifesto, when developing their policy.  Other institutions, such as Bath, opted to go broader, in line with The Metric Tide’s approach, which covered all forms of research evaluation.  However, all three frameworks were somewhat conflated in discussions.  It would seem that, to most people, it is the use of bibliometrics for research evaluation which can cause the most damage.  However, as Evelyn Welch pointed out, it’s not just bibliometrics, or even research metrics, that can lead to poor outcomes; teaching and knowledge exchange metrics can be just as bad.

3 The indefatigable rise of DORA.

The day before the Turning the Tide event, RCUK announced that it had become the latest in a long line of increasingly high-profile signatories to DORA.  On the day itself, David Sweeney confirmed that UKRI wouldn’t be far behind.  The survey data showed that 31 other institutions were considering signing DORA, although 12 had considered it and decided not to proceed.  I don’t think the survey asked whether HEIs were considering alternative approaches such as Leiden or the Metric Tide, although respondents were asked if they agreed with their principles (answer: broadly yes).  As someone who has long expressed concerns about the comprehensiveness of DORA (it aims just two principles at institutions, where Leiden has ten), I must confess to being somewhat baffled by this.  To my mind, DORA lacks the depth of the Leiden Manifesto and the breadth of the Metric Tide.  Penny Andrews, PhD student at the University of Sheffield, expressed concern about DORA’s science-based roots (it was developed by a group of cell biologists, with an inevitable focus on journal metrics) and challenged its relevance to non-STEM disciplines.

I can only really put the rise of DORA down to high-profile protagonists and investment (DORA has its own marketing person, Leiden doesn’t).  It’s not bad.  It’s WAY better than nothing.  And I think most signatories are seeing it as a high-level commitment to a direction of travel on responsible metrics rather than a detailed itinerary.  (Indeed, in a recent tweet, Stephen Curry suggested HEIs sign first and think later!). However, to my mind it will never be the best product on the market, and that’s that.  But I’m aware that I might be starting to sound like a lonely proponent of the Betamax, in danger of ending my days rocking on an office chair, sobbing “but it was a much better solution…”

4 It is not just HEIs that need to engage with this.  Funders, rankers, and suppliers all need to come under scrutiny.

OK – hands up, this was my point.  But it certainly got some airtime in both actual and virtual discussions.  Maybe it’s because I work for an institution that I feel sensitised to the finger-wagging the HE sector has been subject to for not engaging with this agenda, not signing DORA, and not publishing promotion criteria.  However, universities only measure because they are measured.  If funders are using odd metrics or – worse – world ranking performance to allocate funding or studentships, universities can end up doing odd things.  Similarly, bibliometric tools can be very easy to use irresponsibly, and this is outside our jurisdiction.  At the Lis-Bibliometrics event, Altmetric Founder and Director Euan Adie talked about two supplier “cop-outs” in this space, one of which was “We’ll just make data available and let the community decide what to do with it”.  They won’t, he said.  Suppliers need to ensure their products are in line with responsible metrics principles and to be transparent about their limitations.  There was a theme at both events about the importance of metrics and service suppliers co-designing with end-users.  Lis-Bibliometrics plan to start gathering some feedback on this from the community in the near future.

5 We need to measure what we value

Related to the previous point, both events quickly alighted on the importance of measuring what we value, and the fact that not everything we value can currently be measured.  In terms of research impact, citations are not the only fruit.  Members of the early career panel at Turning the Tide were quick to point out that their ‘best’ work was not in the ‘best’ journals, and that academic outputs often had an impact on industry that would never result in citations.  Penny Andrews gave an example of a piece she had recently written for the publication Prospect, with a circulation of thousands, which was tweeted by Harriet Harman – but which was not valued by senior figures in her institution in the same way as an article in a high-impact journal.  At the Lis-Bibliometrics event, Katie Evans raised the important question of how we can encourage openness in early-career colleagues when they face such pressure to publish in usually closed ‘high impact’ journals.  David Price said that he felt senior colleagues had to lead the way.  At UCL, Paul Ayris pointed out, promotion criteria now include openness metrics.  The challenges both of measuring openness and of using open measures were acknowledged.  Interestingly enough, Lis-Bibliometrics plans to take a look at this in more detail at a future event.

6 The increasing importance of responsible peer review

All three of the existing frameworks for responsible metrics talk about the need for metrics to support, not supplant, expert peer assessment – and it’s hard to argue with that.  However, this gives rise to two questions: 1) what should the balance between metrics and peer review actually be?  And 2) just how responsible is peer review anyway?  We recently held a publication strategy workshop at Loughborough University for probationary academics, many of whom were new to the UK.  When we explained the REF’s peer review approach, there were literally peals of laughter at the thought that it could really deliver expert peer review in the time available, on the range of disciplines it covered.  ‘Give us metrics!’, one cried, on the basis that at least they would know what they were dealing with.  When we asked whether any of them had ever had any guidance as journal peer reviewers, very few raised their hands.  Not surprisingly, therefore, we have situations like Daniel Graziotin’s, where a single journal peer review outcome ranged from reject, through major corrections, to minor corrections.  If we’re relying on peer review as the ‘gold standard’ of research evaluation to which metrics must bow the knee, I think it should be subject to the same level of responsible-use scrutiny as metrics.

At the Lis-Bibliometrics event, David Price talked about the challenge REF panels faced in dealing with unconscious bias including the unconscious influence of journal reputation, and suggested that perhaps they should just be supplied with the DOI rather than the bibliographic reference.  An excellent idea, I think. As a result of the REF Open Access policy we have thousands of branding-free Author-Accepted Manuscripts sitting in Institutional Repositories – why don’t we make use of them?

7 It is very important to get this right.

I’m a big fan of this point.  The Royal Society’s Adam Wright reminded us very clearly of the mental health challenges faced by academics.  He cited a study at the University of Leeds in which 26% of PhD students reported mental health concerns in year 1 of their PhD, a figure which rose to 48% by year 3.  Evelyn Welch talked about the prestige economy in which academics sit, and how they are constantly evaluating themselves and each other both explicitly and implicitly. We are all acutely aware of the workload pressures, and the four-fold burden (research, teaching, impact and admin) that academics bear.  In this highly charged, highly pressured environment, the consequences of getting metrics wrong can, literally, be fatal.

8 People are the key to getting it right.

My final point at both the Lis-Bibliometrics and the HEFCE/UUK event was, in a nutshell, that metrics can’t be responsible, only people can.  This sentiment clearly resonated as it ended up the most retweeted tweet after Cameron Neylon picked it up.  To use the Metric Tide’s five principles: we need robust, humble, diverse, transparent and reflexive people doing metrics.  And the owner of any responsible metrics policy (and they do need an owner) should be the most responsible of them all.  As I said at the Lis-Bibliometrics event, an organisation can only be as responsible as its most senior decision maker in this space. I have another blog post brewing on this so I won’t linger on the topic here, but suffice to say, responsible metrics is as important as it is hard.  This needs to be the responsibility of well-informed, well-connected, wise people – in universities, funders, ranking organisations, and suppliers – who really care about the precious lives that sit behind the numbers they are churning on their spreadsheets.

Elizabeth Gadd

Elizabeth Gadd is the Research Policy Manager (Publications) at Loughborough University. She has a background in Libraries and Scholarly Communication research. She is the co-founder of the Lis-Bibliometrics Forum and the ARMA Metrics Special Interest Group Champion.

Unless it states otherwise, the content of The Bibliomagician is licensed under a Creative Commons Attribution 4.0 International License.

