Journal Metrics in Context: How Should a Bibliomagician Evaluate a Journal? Guest post by Massimo Giunta

In the world of academia and research, “publish or perish” has become more complicated than ever. It is not enough merely to publish; one has to publish in a high-impact journal, in the hope of getting noticed and, perhaps more importantly, of getting funded for further research.

Image credit: david_17in, CC BY

Institutions are urging their researchers to publish in high-impact journals. Library collections are on tight budgets, so librarians want only the best journals for their collections. Emphasis on impact and quality has given rise to a whole new realm of metrics by which to measure a journal. But which metric is best? What’s the magic bullet to definitively name a journal as The Best?

One of the most well-known journal metrics is the Journal Impact Factor (JIF). It seems like the JIF has invaded every aspect of the academic researcher’s world, but did you know it was developed for a very specific use?

JIF is defined as “a ratio of citations to a journal in a given year to the citable items in the prior two years.” It was intended as a simple measure for librarians evaluating the journals in their collections. In fact, the entirety of the Journal Citation Reports (JCR) was developed for this purpose in the 1970s. Over the years, its utility to other markets has emerged – most importantly to publishers and editors of journals. It has also been misused to evaluate researchers, but Clarivate Analytics, formerly the IP & Science business of Thomson Reuters, has always been quite clear that JCR data, and the JIF in particular, should not be used as proxy measures for individual papers or people.
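The two-year ratio in that definition can be made concrete with a small, purely hypothetical calculation (the citation and item counts below are invented for illustration):

```python
# Hypothetical sketch of the two-year Journal Impact Factor (JIF).
# All numbers are invented for demonstration purposes.

def journal_impact_factor(citations_to_prior_two_years: int,
                          citable_items_prior_two_years: int) -> float:
    """JIF for year Y = citations received in Y to items published in
    Y-1 and Y-2, divided by the number of citable items published in
    Y-1 and Y-2."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# e.g. a journal whose 2015-2016 output drew 450 citations in 2017,
# having published 150 citable items across those two years:
jif_2017 = journal_impact_factor(450, 150)
print(jif_2017)  # 3.0
```

Note that the denominator counts only “citable items” (articles and reviews), while the numerator counts citations to anything in the journal, which is one reason the JIF can be gamed and needs context.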

So is JIF the be-all and end-all of journal evaluation? No. The truth is, there is no one metric that can be used to name the best journals. Why not? “Best” is subjective, and so are metrics.

Sticking with the JIF for now, anyone seeking to evaluate a journal’s place in the research world should not simply look at its JIF; that number, on its own with no context, has limited meaning. Even in context, the JIF is just one number; the JCR contains an entire suite of metrics for journal evaluation, and other parties also offer journal evaluation metrics, such as the SCImago Journal Rank, or the Eigenfactor metrics, which are produced by Clarivate Analytics in partnership with the University of Washington.

Both the Eigenfactor and Normalized Eigenfactor scores look at the data differently from the JIF: they measure the total importance of a scientific journal in the context of the entire body of journals in the JCR. While the JIF uses two years of data and is limited to the field in which a journal is classified, Eigenfactor scores draw on the entire corpus of journals and five years of data. A journal could therefore be ranked lower by its JIF than by its Eigenfactor (or Normalized Eigenfactor).
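The Eigenfactor is computed from a PageRank-style eigenvector of the journal-level citation network, so citations from influential journals count for more than citations from obscure ones. The following is a simplified sketch over an invented three-journal citation matrix; it omits details of the real algorithm (five-year windows, self-citation exclusion, article-count weighting and teleportation damping):

```python
import numpy as np

# Toy journal-to-journal citation matrix: C[i, j] = citations from
# journal j's articles to journal i. Diagonal is zero, echoing the
# real Eigenfactor's exclusion of journal self-citations.
C = np.array([[0, 4, 1],
              [2, 0, 5],
              [3, 1, 0]], dtype=float)

# Column-normalise so each citing journal distributes one "vote".
H = C / C.sum(axis=0)

# Power iteration: converge on the leading eigenvector, in which a
# journal's influence depends on the influence of its citers.
v = np.full(3, 1 / 3)
for _ in range(100):
    v = H @ v
    v /= v.sum()

print(np.round(v, 3))  # each journal's share of total influence
```

The resulting vector sums to one, so each entry can be read as a journal’s share of total network influence, which is the intuition behind “total importance in the context of the entire body of journals”.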

So which is better: Journal A with a higher JIF or Journal B with a higher Eigenfactor? Looking at just these two metrics will not answer the question. Perhaps Journal B also has a higher Article Influence Score—a score greater than 1 shows that a journal’s articles tend to have an above-average influence. Perhaps Journal A also has a higher Percent Articles in Citable Items, meaning it tends to publish more original research than reviews. Looking outside the JCR, perhaps Journal A has had a higher citation count in the past year, whereas Journal B skews more favorably looking at Altmetrics like page views or social media mentions.

Therefore, any statements about a journal’s impact need to include context. When you evaluate a journal, you should look at all of its metrics for the most complete picture, and this picture will vary by field and year.

Bottom line? While there is no magic bullet to determine the best journals, with the wealth of journal metrics out there, and whatever might come down the pipeline in the future, evaluating journals in context is not as difficult as you might think!


Further Reading:

  1. Best Practices in Journal Evaluation
  2. All About the Eigenfactor
  3. JCR Blog Series
  4. JCR product information


Massimo Giunta is Account Manager UK & Ireland for Clarivate Analytics


What are you doing today?

The Lis-Bibliometrics-commissioned, Elsevier-sponsored bibliometric competencies research project seeks to develop a community-supported set of bibliometric competencies, particularly for those working in libraries and other related services. You can take part by completing the bibliometrics competencies survey at:

To get a flavour of the variety of bibliometric work going on, I asked fellow Lis-Bibliometrics Committee members what they’re doing today:

“Today I’m helping a researcher clean up his very muddled and duplicated Scopus Author IDs and link his outputs to his ORCID iD. I’m also thinking about how best to benchmark the output of our Law school against our competitors for undergraduate students.” Karen Rowlett, Research Publications Adviser, University of Reading

“Today I’m discussing the release of our Responsible Metrics Statement (now approved by Senate) with our PVCR; running some analyses on SciVal which look at the impact of Loughborough’s conference publications on our overall citation performance; and presenting at a cross-university meeting aimed at exploring how to improve the visibility of our research.” Elizabeth Gadd, Research Policy Manager (Publications), Loughborough University

“Today I am preparing a presentation on Metrics for one of the teams, and working on analysing the Leiden Ranking data.” Sahar Abuelbashar, Research Metrics Analyst, University of Sussex

Meanwhile, I’m advising researchers on using citation metrics in grant applications.  What are you doing today?

Katie Evans

Research Analytics Librarian, University of Bath

Job Opportunity: Bibliometric Analyst at Chalmers University of Technology

Chalmers University of Technology in Sweden is looking for a bibliometric analyst.

Chalmers, a highly progressive technological university, is situated on the west coast of Sweden in beautiful Gothenburg.

Information about the research/the project/the department
Chalmers Library is expanding its work with bibliometrics and university ranking analytics. The objective is to support the university management, departments and areas of advance with strategic advice for development. A dedicated team currently carries out this work, reporting to the Library Director. The work consists of analysing, developing strategies and reporting on publishing, impact and university rankings, in close collaboration with researchers and relevant units within the university. We also collaborate with other universities by being active in national and international networks.

Job description
Do you want to contribute to Chalmers’ development and research impact? If so, an exciting and stimulating position awaits! Your work will consist of studying and analysing Chalmers’ publications, impact and research collaboration. Data sources include the local research information system, Web of Science, Scopus and SciVal. You are also expected to contribute to our university ranking analytics. Based upon your results and experience, you will discuss development strategies with university management, departments and areas of advance on a daily basis. Participating in relevant networks and monitoring the development of methodologies is also expected.

Read more and apply:

Application deadline: 17 January 2017

For questions, please contact:
Daniel Forsman, Chalmers bibliotek, 031 7723751


Job Opportunity: Bibliometrician and Data Analyst at University of Leicester

Leicester is recruiting a permanent, full-time bibliometrician and explains the role thus:

You will provide high-level expertise and advice to the University on the use of bibliometrics and related indicators in support of the University’s research objectives, ensure that the University maintains a comprehensive record of its research outputs, and ensure that these are correctly indexed by the major citation services.

You will also ensure that the Library anticipates and meets the needs of the University’s research community through user engagement with researchers and with colleagues in the Research Enterprise Division and the Library, identifying opportunities for, and new approaches to, using bibliometrics and analytics to support research, helping to maximise the quality and impact of research outputs, and enabling researchers and professional staff to use altmetrics and other tools for their own analysis.

In addition, the role will require expertise in learning and library analytics, exploiting bibliometrics, data analytics and best research practice to inform the Library’s strategic planning.

Click here to see the Job Summary for this position.

Informal enquiries are welcome and should be made to

The Bibliomagician seeks honorary Blog Manager

The Bibliomagician blog seeks to provide comment and guidance for practitioners engaging with bibliometrics. We are now seeking an enthusiastic volunteer to take The Bibliomagician blog to the next level! The post-holder would sit on the ‘light-touch’ Lis-Bibliometrics Committee and be responsible for:

  • Soliciting relevant content for the Blog
  • Reviewing contributions
  • Scheduling blog posts
  • Managing the resources pages
  • Reporting back to the Committee

This would be a great opportunity for someone seeking to make their mark on the world of bibliometrics, whilst developing their bibliometric knowledge, social media and editorial skills within a friendly and supportive environment. For more information or to express an interest in this role, please get in touch with Elizabeth Gadd, Chair of the Lis-Bibliometrics Forum.

Job Opportunity: Research Information Analyst & Open Access Officer, LSE

The London School of Economics and Political Science (LSE) is currently advertising for a Research Information Analyst & Open Access Officer in the Research Support Services team, LSE Library:

  • Salary from £33,784 to £40,867 pa inclusive (pro-rata)
  • This is a fixed term appointment for 11 months (maternity cover) and is part time for 21 hours per week
  • Closing date 17 October 2016

 The Library is at the heart of LSE, one of the world’s greatest social science universities, and serves a vibrant community of students and staff in the centre of London. The Library has a high reputation both nationally and internationally for its extensive collections, its involvement in innovative projects and its high quality services.

An exciting opportunity has arisen to join our Research Support Services team to lead bibliometric initiatives in the Library and oversee the LSE Institutional Publication Fund. You will lead on bibliometrics activities, providing expertise on citation analysis and producing bibliometric reports to support the evaluation of research. You will maintain an in-depth knowledge of this area, providing advice on the use of a range of traditional and emerging publication metrics such as citations and altmetrics.

In this post you will also be responsible for administering the LSE Institutional Publication Fund and overseeing the payment of Article Processing Charges (APCs) for paid Open Access papers. This involves promoting the fund to researchers and managing workflows between authors, publishers and funders. This role will involve close liaison with the repository manager, academics, the Research Division and other groups in the School.

You will be a graduate with a CILIP-recognised professional qualification in librarianship or information science with post qualification experience of working in a library or similar customer service environment, preferably in higher education. You should be an excellent communicator with strong organisational skills. You will be able to initiate service developments as well as having the ability to work as part of a team. Experience of working with bibliometrics or supporting research would be an advantage.

 We offer an occupational pension scheme, generous annual leave and excellent training and development opportunities.

For further information about the post, please see the job description and the person specification.

To apply for this post, please go to the online system. If you have any queries about applying on the online system, or require an alternative format for the application, please e-mail:

The closing date for receipt of applications is 17th October (23.59 UK time). Regrettably, we are unable to accept any late applications.

Informal enquiries can be made to Nancy Graham at

UK universities respond to the Metric Tide’s call to do metrics responsibly

In September 2015 I ran a short survey to see how institutions were responding to the Metric Tide report’s call to take a responsible approach to metrics and to consider signing the San Francisco Declaration on Research Assessment (DORA). Only three of the survey respondents had signed DORA (although five were considering it), but nine respondents were thinking about developing their own set of principles for the responsible use of metrics. In the year that followed there was very little movement on signing DORA. However, support for one of the key elements of DORA – a backlash against the Journal Impact Factor – has grown, and with it further condemnation of HEIs who have not signed. Nevertheless, as indicated in an earlier blog post, not every institution that has failed to sign DORA has failed to do anything. With an ear to the ground via the Lis-Bibliometrics and ARMA Metrics SIG lists, I was aware of a growing movement towards the development of in-house Statements of Responsible Metrics. So, I put together a second survey to capture some of this activity.

Image credit: Laineys Repertoire, CC BY

The second survey had slightly more respondents (26 versus the original 22), but the low response rate still indicates that this is a fledgling area. This was further confirmed by the fact that none of the respondents were yet at a stage where they could say they had completed their own set of published principles (although one said they’d agreed some principles internally and were not intending to publish them). However, compared to a year ago, when just nine were thinking of developing their own principles, this year seven were actively developing them and a further five were thinking about it. Only two had considered developing their own principles and rejected the idea, compared to five who had considered signing DORA and rejected it last year.

Of the 13 who were developing their own set of principles (or had considered this internally), no-one said they were basing these on DORA. Instead, over half (eight) said they were using the Leiden Manifesto as the basis, and two indicated they were basing their work on the principles of another university. One respondent said the Metric Tide’s responsible metrics framework was informing their thinking. As last year, the development of principles was being guided by a range of staff, most of whom were from the Research Office, senior management and the Library, although in different proportions.

So, the sector is stirring, and The Bibliomagician blog is starting to document all the Statements of Responsible Metrics they can find. Other moves are also afoot to help universities to do metrics responsibly and well. The Association for Research Managers and Administrators (ARMA) is offering a course to its members on the Responsible Use of Metrics in May 2017, and the Lis-Bibliometrics Forum is planning an event on this theme in September 2017. The Lis-Bibliometrics Forum are also facilitating some Elsevier-sponsored work on developing a set of bibliometric competencies for use by practitioners to ensure the staff supporting metrics in their institutions have the skills and knowledge they need. This should report in January 2017 and a workshop is planned at the UKSG conference in April. Of course the biggest news in this area is the recent launch of the Responsible Metrics Forum by a group of research funders, sector bodies and infrastructure experts.  We’re on our way! And 2017 should see an even greater set of resources available to practitioners wanting to establish responsible metrics principles and practices at their institution.