Reflections on the 28th Nordic Workshop on Bibliometrics

Jeffrey Demaine, a Canadian bibliometrician, reflects on the 28th Nordic Workshop on Bibliometrics and Research Policy in Gothenburg, Sweden, providing a thoughtful summary of the state of the field in the EU.

In October 2023, I attended the 28th Nordic Workshop on Bibliometrics and Research Policy in Gothenburg, Sweden. Hosted by Chalmers University of Technology, only a short tram ride from the city center, the workshop brought together librarians and bibliometricians from across Scandinavia (and a few other countries) to share the results of their work. As the lone Canadian, I justified my attendance by underlining how our snowy climates and love of ice hockey bring us together. The field of bibliometrics is so small that there are really only a few options for getting together with like-minded colleagues. So I was pleased to meet some of the people whose work I have read over the years, and to be part of the larger community. While it is not possible to summarize all of the presentations, I will highlight a few of the talks that caught my imagination.

Further details are available at:

[Photo: workshop attendees gathered in the auditorium]

UK’s Research Excellence Framework (REF)

Mike Thelwall presented a fascinating machine-learning approach to replicating the UK’s Research Excellence Framework (REF). The REF occurs every seven years and is a national peer-review exercise to rank (and thereby allocate funding to) universities. Thousands of publications are peer-reviewed, and the opinion of the REF evaluators is rolled up into a score ranging from one star (“recognised nationally”) to four stars (“world-leading”). This manual process is fantastically expensive, with the 2014 REF estimated to have cost 250 million pounds (Else, July 13, 2015).

…we know that the peer-review process is plagued with bias and vagueness.

In any case, studies have shown that it is fairly straightforward to approximate the REF's ranking using bibliometrics, and Mike Thelwall's machine-learning approach produced encouraging results. While the algorithm performed better in some subjects than others, it demonstrated that one could at least approximate the REF rankings at very little cost. However, because the evaluators assumed that their peer-review process had resulted in a perfect ranking of institutions, any deviation by the algorithmic approach was judged to be a failure. Instead, as we know that the peer-review process is plagued with bias and vagueness, it is just as likely that the machine-learning ranking is closer to the truth. Either way, it would have been best to define the expectations and assumptions of the evaluators before they judged the results.

Adoption of the Coalition for Advancing Research Assessment (CoARA)

There was a panel discussion about what research assessment will look like without the “Nordic (publication) indicator”. It seems that universities were financially rewarded for publishing in journals on a ‘white list’ of high-impact journals maintained by their country’s science-funding agency (so there seem to have been five similar indicators, one from each Nordic country).

While bibliometricians applaud the emphasis placed on the responsible use of metrics, at the same time they are skeptical about the reliance on peer-review in the context of CoARA.

The point is simply that while CoARA has been widely adopted across the Nordics, there is no clear idea of how to proceed. While bibliometricians applaud the emphasis placed on the responsible use of metrics, at the same time they are skeptical about the reliance on peer-review in the context of CoARA. Is it realistic or even responsible to do away with metrics? Without the Nordic indicator, are we back to chasing the ephemeral “research excellence”? In any case, the utility of journal rankings may be coming to an end. Björn Hammarfelt (University of Borås) made the insightful observation that what constitutes an academic journal has become quite fluid, in that an article can now be associated with a pre-print, open review, and a dataset. This makes it difficult to define where a journal begins and ends, and undermines the utility of any white list.

Some notable items on this issue:

  • Denmark decided to stop using the Nordic indicator in 2022. There is no replacement lined up. 
  • Finland uses its “JUFO” system to evaluate publications. As of 2021, 14% of university funding was based on publication metrics.
  • Iceland just introduced an incentive system based on Scopus. The Nordic indicator has been used for internal distribution of funds between departments, but not for government funding of universities.
  • Norway will stop using the Nordic indicator in 2025, but record-keeping of publication output will continue. There are concerns over the quality of reporting post-2025 if there is no incentive for researchers to record their publications.

The Best Presentation – Sleeping Beauties

Undoubtedly the best presentation of the conference was mine (just a hint of bias there…). Using the vintage HistCite software developed by Eugene Garfield, I uncovered the reason behind the recent spike in citations of Kessler’s 1963 paper on bibliographic coupling. As bibliometricians, we take this fundamental technique for granted. But in 2015 it was introduced to the field of “Marketing, Commerce, Tourism and Services”, and it has since been cited by hundreds of bibliometric studies of the marketing field. As a result, Kessler’s paper became a Sleeping Beauty, awakening after 2017.
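For readers outside the field: bibliographic coupling is simple to compute. Two papers are coupled when their reference lists overlap, and the coupling strength is the number of references they share. A minimal sketch in Python (the paper IDs and reference lists below are invented for illustration, not taken from any real dataset):

```python
# Bibliographic coupling (Kessler, 1963): two papers are "coupled"
# when they cite common references; the coupling strength is the
# size of the overlap between their reference lists.
# All paper and reference IDs here are hypothetical.

from itertools import combinations

references = {
    "paper_A": {"ref_1", "ref_2", "ref_3"},
    "paper_B": {"ref_1", "ref_3", "ref_4"},
    "paper_C": {"ref_4"},
}

def coupling_strength(p1: str, p2: str, refs: dict) -> int:
    """Number of references shared by two papers."""
    return len(refs[p1] & refs[p2])

# Compare every pair of papers in the toy corpus.
for p1, p2 in combinations(sorted(references), 2):
    print(p1, p2, coupling_strength(p1, p2, references))
# paper_A and paper_B share two references (strength 2),
# while paper_A and paper_C share none.
```

In practice this pairwise count is computed over thousands of papers and used to cluster them into research fronts, which is exactly why the technique proved so popular for mapping the marketing literature.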

Always a Great Keynote with Cassidy Sugimoto

The conference wrapped up on Friday, October 13th with a keynote by Cassidy Sugimoto. Her whirlwind talk covered the global structure of science, looking at such features as collaboration between countries, the mobility of researchers, and a comparison of national funders. We will certainly hear much more from her team in the near future. 

Wrap-up

Let me end with a note about the location. Gothenburg is a lovely city with a relatively dense old city center that is encircled by a canal and parks. Just as charming was the network of white-and-blue trams running in all directions. For 2024, the Nordic Workshop on Bibliometrics will be hosted by Iceland. I hope to attend that event and to see my European colleagues again. If you feel Iceland is too far and/or volcanic, the North American Bibliometrics and Research Impact Community Conference 2024 (BRIC conference; http://www.bric.ca) will be held in Vancouver, B.C. from June 4th to 6th. Perhaps some Nordic friends will be there too.

About the Author

Jeff Demaine is a bibliometrician located in Ottawa, Canada. By combining an interest in scientometrics with coding skills, he specializes in finding patterns in metadata. His recent publications cover topics such as Sleeping Beauties, global trends in collaboration, and gender balance. He has worked for the Universities of Ottawa, Waterloo, and McMaster. His very first bibliometric study used the Science Citation Index in the original printed(!) format. He is the co-organizer of the annual BRIC conference (bric-conference.ca), which will soon be held in Vancouver, B.C. His motto is “Narratives, not numbers!”
