Brace for impact: thoughts on the 2022 THE Impact Ranking

Robyn Price from Imperial College London briefs us on details of the THE Impact Ranking methodology

The fourth edition of the Times Higher Education (THE) Impact Ranking is set to publish in April 2022. It will surely be followed by participating institutions communicating their successful performance in it, while non-participating institutions consider whether to take part in future editions.

THE’s Impact Ranking is framed around the Sustainable Development Goals (SDGs), a set of 17 goals defined by the United Nations in 2015[1] that relate to complex international development and environmental issues. Whilst universities are not directly addressed in the UN’s SDGs, they are essential to the SDG endeavour through their core subfunctions of research, education, operations, governance and external leadership[2].

THE have translated the UN’s SDGs into targets that they can measure universities against[3]. These are a mixture of traditional bibliometric indicators and indicators that assess organisational operations and policies. I broadly agree with this approach: a university wanting to be recognised for its research on the climate crisis (SDG 13: Climate Action), for example, is also assessed on its carbon footprint, its climate change disaster planning, its commitments to carbon-neutral operations and more. Prompting organisations to document their responses to the SDGs, in both their research and their values and management policies, is welcome. However, unpicking the methodology and reflecting on the nature of the ranking, I am left unsure whether this is really happening.

Prepare for Impact, image from 9866112 on Pixabay

The primary bibliometric indicators

  • Number of publications (worth between 7% and 13.55% of every SDG’s score)

The total number of outputs matching terms relevant to the SDG in a title, abstract or keyword search. It is not clear why the weight of this metric varies between SDGs: research publications contribute 7% towards SDG 2: Zero Hunger, for example, but 13% towards SDG 8: Decent Work and Economic Growth.
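To make the mechanism concrete, here is a minimal sketch of such a count. The records, query terms and matching logic are entirely illustrative; THE’s real queries run against Scopus and are considerably more elaborate[4].

```python
# Toy version of matching publications to an SDG by a title, abstract or
# keyword search. All records and query terms below are invented.
SDG13_TERMS = {"climate change", "global warming", "carbon emission"}

publications = [
    {"title": "Modelling climate change impacts on crop yield",
     "abstract": "We project yield losses under warming scenarios.",
     "keywords": ["agriculture", "adaptation"]},
    {"title": "A survey of graph database systems",
     "abstract": "We compare storage engines.",
     "keywords": ["computer science"]},
]

def matches_sdg(pub: dict, terms: set) -> bool:
    """True if any query term appears in the title, abstract or keywords."""
    text = " ".join([pub["title"], pub["abstract"], *pub["keywords"]]).lower()
    return any(term in text for term in terms)

sdg13_count = sum(matches_sdg(p, SDG13_TERMS) for p in publications)
print(f"Publications matched to SDG 13: {sdg13_count}")  # -> 1
```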

From the perspective of valuing all forms of research output, it is positive that this publication count is not limited to journal articles.

THE state that the count will also include books, conference proceedings and trade publications, but are these output types indexed comprehensively enough by the source database (Scopus) to contribute? The query terms for matching outputs to SDGs are made available[4], but the actual publication sets defined by THE for the rankings are not.

Joint authorship is respected, so an output with co-authors at different institutions will count towards each institution’s score. For hyper-authored papers, each participating institution receives the same full credit, however small its contribution. This is the opposite of the THE World Rankings, which use a fractional method to apportion authorship credit; the sketch below contrasts the two.
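A minimal sketch of the difference between the two crediting models. The full-credit rule is what the Impact Ranking methodology describes; the equal-split fractional rule shown alongside is one common scheme, not necessarily THE’s exact World Rankings formula. Institution names and papers are invented.

```python
# Contrast full counting (every institution gets 1 per paper) with an
# equal-split fractional scheme (credit divided among distinct institutions).
from collections import Counter

papers = [
    ["Imperial", "UCL"],                        # two co-authoring institutions
    ["Imperial", "Oxford", "UCL", "Imperial"],  # toy 'hyper-authored' paper
]

full, fractional = Counter(), Counter()
for affiliations in papers:
    institutions = set(affiliations)
    for inst in institutions:
        full[inst] += 1                            # Impact Ranking: full credit
        fractional[inst] += 1 / len(institutions)  # one fractional alternative

print(full)        # Imperial: 2, UCL: 2, Oxford: 1
print(fractional)  # Imperial: ~0.83, UCL: ~0.83, Oxford: ~0.33
```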

  • CiteScore (worth between 10% and 14% of some SDGs’ scores; absent from others)

The proportion of publications (as determined by the publication count above) that are published in the top 10% of all journals in Scopus by CiteScore, normalised by the total number of publications in the same period. CiteScore is Elsevier’s equivalent of the Journal Impact Factor[5]: it counts the citations received by a journal over the preceding four-year period and divides this by the number of publications in the journal over the same four years. Some support CiteScore as a more robust alternative to the Impact Factor because of its longer citation window and more inclusive article types and journal indexing, but it is still a journal-level metric and should not be used to assess the quality of individual outputs or authors.
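For concreteness, a minimal sketch of the CiteScore arithmetic and of the top-decile share it feeds into. All journal figures and the top-10% threshold are invented.

```python
# Toy versions of the two calculations described above.

def citescore(citations_4yr: int, documents_4yr: int) -> float:
    """Citations received over a four-year window, divided by the number
    of documents the journal published in that same window."""
    return citations_4yr / documents_4yr

def top_decile_share(journal_citescores: list, threshold: float) -> float:
    """Proportion of an institution's SDG-matched outputs whose journal
    sits at or above the top-10% CiteScore cut-off."""
    in_top = sum(cs >= threshold for cs in journal_citescores)
    return in_top / len(journal_citescores)

print(citescore(12_000, 3_000))  # a journal with 12,000 citations -> 4.0
# Five outputs in journals with these CiteScores, against an invented
# top-10% cut-off of 6.0:
print(top_decile_share([4.0, 7.2, 1.1, 9.5, 3.3], 6.0))  # -> 0.4
```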

  • Field-Weighted Citation Impact (FWCI) (worth 10% of some SDGs’ scores; absent from others)

The number of citations received by an item, normalised by publication type, year of publication and subject. The mean FWCI of all an institution’s publications for the SDG is found, and this is then passed through the cumulative distribution function of a normal distribution, granting each institution a score from 0 to 100. FWCI is an unstable metric for small sample sizes and recently published outputs[6][7], which can result in skew by outliers, as the sketch below illustrates.
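A minimal sketch of that scaling, with one deliberate outlier to show the skew. THE do not fully specify the distribution used, so I assume here that the institutional mean is standardised against the mean and standard deviation of all participating institutions; every number is invented.

```python
# FWCI of one output: actual citations divided by the expected citations
# for outputs of the same type, year and subject; the institutional mean
# is then mapped to 0-100 via a normal CDF (parameters assumed here).
from statistics import NormalDist, mean

def fwci(citations: int, expected_citations: float) -> float:
    return citations / expected_citations

# Hypothetical outputs: (citations received, expected citations).
outputs = [(10, 5.0), (0, 4.0), (200, 8.0)]       # the last is a 25x outlier
inst_mean = mean(fwci(c, e) for c, e in outputs)  # -> 9.0, dragged up by one paper

all_institutions = NormalDist(mu=1.2, sigma=0.6)  # invented population parameters
score = 100 * all_institutions.cdf(inst_mean)
print(round(inst_mean, 2), round(score, 1))       # -> 9.0 100.0
```

Three papers, one of them unusually highly cited, and the institution maxes out the indicator.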

  • Scopus as the bibliometric data source  

Scopus is not a comprehensive database of research outputs. Studies estimate that it contains approximately 58% of Google Scholar’s content[8], of which roughly 93% is written in English[9], with coverage skewed towards STEM disciplines[10]. This bias will advantage some universities over others. Scopus is also a separate, paid-for database, so only subscribers will be able to attempt to replicate or model the bibliometric performance. Are these limitations acceptable for a measure of global development?

The non-bibliometric indicators

The other indicators relate to organisational management and policies that THE has deemed related to each SDG. For example, an indicator for THE’s SDG 16: Peace, Justice and Strong Institutions is ‘academic freedom policy’.

It makes sense to me that academic freedom is something that could be asked of a university wishing to be evaluated under this SDG; however, the methodology for awarding points on the non-bibliometric indicators seems arbitrary and insufficient.


Using the same SDG example, an institution can score up to four points on ‘academic freedom’ by saying it has written a policy (one point), showing THE the policy (up to one point), putting the policy on the internet (one point) and telling THE it was created or reviewed between 2015 and 2020 (one point). The methods give only a brief evaluation schema for deciding how much of the ‘up to one point’ a piece of evidence earns.
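Encoded as a toy function, the scheme looks something like this. The parameter names and the continuous 0–1 evidence score are my own illustration, not THE’s submission format.

```python
# Toy encoding of the four-point 'academic freedom policy' scheme.

def academic_freedom_points(has_policy: bool, evidence_quality: float,
                            policy_public: bool, reviewed_2015_2020: bool) -> float:
    """Score the indicator out of 4 points."""
    points = 0.0
    points += 1.0 if has_policy else 0.0            # says a policy exists
    points += min(max(evidence_quality, 0.0), 1.0)  # 'up to one point' for evidence
    points += 1.0 if policy_public else 0.0         # policy is on the internet
    points += 1.0 if reviewed_2015_2020 else 0.0    # created/reviewed 2015-2020
    return points

print(academic_freedom_points(True, 0.5, True, True))  # -> 3.5
```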

Is this a meaningful and contextual appraisal of, in this example, the highly nuanced issue of academic freedom? Stephen Curry has described this perfunctory scoring method as “approximate and incomplete evaluations of a rich spectrum of endeavour”[11]. In addition, THE do not provide any indication of who performs this evaluation, so we are left to wonder whether assessment is by a panel of experts, a data entry team or a computer.

The composite score and rank

This method of aggregating scores for different things is a bit ‘pick and mix’. Institutions can submit to as many or as few SDGs as they like. On one hand, it is sensible that universities only submit in the areas where they have efforts and activities, and I agree with THE that this approach enables participation from institutions without the resources to return data on all 17. On the other hand, it means that institutions can opt out of any SDG they do not wish to draw attention to.

Those that submit to at least four SDGs, including SDG 17: Partnerships for the Goals, receive an overall score and a place in the ranking. This composite aggregation of scores from separate and different SDGs into one overall number is a fundamental weakness, as the sketch below makes plain.
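As I read the 2022 methodology document[3], the overall score combines the SDG 17 score at a 22% weighting with an institution’s three best remaining SDG scores at 26% each; treat those weights as my reading rather than gospel, and the scores below as invented.

```python
# Sketch of the composite aggregation: SDG 17 plus the best three other SDGs.

def overall_score(sdg_scores: dict) -> float:
    """Combine per-SDG scores (0-100) into one overall score."""
    sdg17 = sdg_scores["SDG17"]
    others = sorted((v for k, v in sdg_scores.items() if k != "SDG17"),
                    reverse=True)[:3]
    return 0.22 * sdg17 + sum(0.26 * s for s in others)

# Two institutions scored on entirely different baskets of SDGs still land
# on the same scale, which is the comparison problem quoted below.
print(round(overall_score({"SDG17": 80, "SDG6": 90, "SDG11": 70, "SDG15": 60}), 1))  # 74.8
print(round(overall_score({"SDG17": 75, "SDG8": 85, "SDG9": 80, "SDG10": 65}), 1))   # 76.3
```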

Richard Holmes writes, “The University of Sydney… is ranked for clean water and sanitation, sustainable cities and communities, and life on land… (this) includes supporting water conservation off campus and the reuse of water across the university. RMIT University, in third place, is ranked for decent work and economic growth, industry innovation and infrastructure and reduced inequalities… So, essentially THE is trying to figure out whether Sydney is better at reusing water than RMIT is at announcing policies that are supposed to reduce discrimination”[12].

The methodological weakness of the composite score has been commented on since the first instalment of the ranking in 2019[13], yet THE have chosen to retain it in each edition, allowing them to dish out awards for the most impactful institutions[14]. The essentially competitive nature of a ranking system doesn’t exactly lend itself to ‘Partnerships for the Goals’ either.

Access to evidence and data

None of the submitted data or evaluation evidence is made available. Subscribers to the paid-for THE DataPoints analytics product can see a very limited summary of the scores to benchmark between institutions[15]. This lack of access to supporting data would not be considered acceptable research practice within many of the universities submitting to this ranking.

Concluding thoughts

I recognise that this ranking gives institutions a framework through which to communicate their achievements on really important issues, but it is a disservice to the genuine intentions and investments of many participating universities that a flawed and competitive ranking has commoditised them.

As an interesting aside, THE’s owner, Inflexion Private Equity Partners LLP, details its environmental, social and governance management[16], but THE itself has no publicly available information on its own contributions towards the SDGs. This must make us question whether THE are in the best position to judge others.

I hope that THE’s new Impact Rankings Advisory Board will contribute to meaningful methodological improvement.


From examining the 2021 and 2022 methodologies, I do observe some improved rigour in the indicators of the 2022 edition, but the essential issues, the lack of access to evidence and data and the flawed composite score, remain unchanged since the first edition.

Participation increases every year: from the 450 institutions entered in 2019 to 1,200 in 2021, with reportedly more than 1,500 entering for the 2022 edition[17]. With every edition, THE become more and more successful in conflating performance in their Impact Ranking with actual impact in the world.


[1] https://sdgs.un.org/goals

[2] https://resources.unsdsn.org/getting-started-with-the-sdgs-in-universities

[3] https://the-impact-report.s3.eu-west-1.amazonaws.com/Impact+2022/THE.ImpactRankings.METHODOLOGY.2022_v1.3.pdf

[4] https://data.mendeley.com/datasets/87txkw7khs/1

[5] https://en.wikipedia.org/wiki/CiteScore

[6] https://thebibliomagician.wordpress.com/2017/05/11/scivals-field-weighted-citation-impact-sample-size-matters-2/

[7] https://www.elsevier.com/research-intelligence/resource-library/research-metrics-guidebook

[8] Martín-Martín, A., Thelwall, M., Orduna-Malea, E. et al. Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations’ COCI: a multidisciplinary comparison of coverage via citations. Scientometrics 126, 871–906 (2021). https://doi.org/10.1007/s11192-020-03690-4

[9] Vera-Baceta, MA., Thelwall, M. & Kousha, K. Web of Science and Scopus language coverage. Scientometrics 121, 1803–1813 (2019). https://doi.org/10.1007/s11192-019-03264-z

[10] Mongeon, P., & Paul-Hus, A. (2016). The journal coverage of Web of Science and Scopus: A comparative analysis. Scientometrics, 106(1), 213–228. https://doi.org/10.1007/s11192-015-1765-5

[11] http://occamstypewriter.org/scurry/2020/04/26/still-unsustainable-university-rankings/

[12] https://rankingwatch.blogspot.com/2021/08/thes-caucus-ranking.html  

[13] https://occamstypewriter.org/scurry/2019/05/20/unsustainable-goal-university-ranking/

[14] https://www.manchester.ac.uk/discover/news/manchester-named-worlds-best-university-for-action-on-sustainable-development/

[15] https://www.timeshighereducation.com/our-solutions/data-and-insights/sdg-impact-dashboard

[16] https://www.inflexion.com/responsible-investing/environmental-social-and-governance/

[17] https://www.timeshighereducation.com/world-university-rankings/more-1500-universities-submit-data-impact-ranking

Robyn Price established a responsible bibliometric analysis and education service at Imperial College London. She is also interested in open and equitable research models. Previously, Robyn worked in the editorial teams of open access and subscription journals. 

Unless it states otherwise, the content of the Bibliomagician is licensed under a Creative Commons Attribution 4.0 International License.

