Altmetrics as an Information Evaluation Tool

Rachel Miles, Research Impact Librarian, and Amanda MacDonald, Undergraduate Research Services Librarian, discuss how to use altmetrics as a critical evaluation tool in undergraduate and graduate instruction: tracing mentions of research and delving into the context surrounding those mentions.

Alternative metrics (altmetrics) measure the volume of online attention to research, which can come from a variety of sources, such as mainstream news media, social media, public policy, patents, post-publication peer review platforms, Wikipedia, videos, reference managers, and more [1,2]. Notice that they measure volume, not sentiment, quality, or impact, though sentiment can sometimes be inferred from news headlines and social media posts. As with any research impact indicator, the context surrounding the quantitative measure is crucial to understanding its underlying meaning.

When altmetrics are paired with appropriate literacy frameworks, such as the ACRL Framework for Information Literacy and JISC's digital literacies [3,4], instructors can take a synthesized approach that we believe can guide undergraduate and graduate students toward an understanding of the complex scholarly ecosystem.

With information literacy instruction, students are often taught how to evaluate sources of information, such as through the traditional CRAAP test [5]. With altmetrics, and with the more advanced skill of bibliometrics, students can evaluate a source that mentions a research output, locate the output itself, find other online sources that mention it, and evaluate those sources for consistency, authority, and context. Taken further, students can evaluate the altmetrics themselves to discover which types of outputs are more likely to receive online attention and what kinds of attention those outputs receive.

Figure 3. An example of how to trace mentions to a research output, discover other online attention via altmetrics, and then evaluate those sources against one another. Image by Rachel Miles, CC-BY.

Several approaches can be taken when incorporating altmetrics into information literacy instruction. It is possible to dive right into altmetrics data via tools and databases, such as Dimensions (freemium), the Altmetric Bookmarklet (free), and the subscription-based Altmetric Explorer and Plum Analytics databases. However, when we began teaching with these tools, we found that exploring attention to research via Altmetric Explorer for Institutions (to which we subscribe) could be overwhelming, especially on a first encounter. As a result, we created metric source cards, printed them out, and found them useful for introducing students to the concept. Students are more engaged in the activity and have a limited set of sources and context to explore. Each card gives examples of attention to a research output, along with some caveats about that attention. Including activity questions with each source card can help guide students through the activity and the discussion of their findings.
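For instructors who want to pull the same attention data programmatically rather than through a subscription interface, Altmetric also offers a free, rate-limited public API that returns attention counts for a single DOI. Below is a minimal sketch in Python; the v1 lookup endpoint is Altmetric's public API, but the specific response fields we print (such as cited_by_msm_count for news outlets) are drawn from typical responses and may vary by record.

```python
import json
import urllib.error
import urllib.request

# Altmetric's free, rate-limited public API: look up attention data for a DOI.
# Here we use the Mediterranean diet article discussed below as an example.
DOI = "10.1056/NEJMoa1200303"
URL = f"https://api.altmetric.com/v1/doi/{DOI}"

try:
    with urllib.request.urlopen(URL) as response:
        record = json.load(response)
except urllib.error.HTTPError as err:
    # A 404 means Altmetric has tracked no attention for this DOI.
    raise SystemExit(f"No Altmetric record found ({err.code})")

# A few volume indicators; remember that these measure attention, not quality.
print("Title:", record.get("title"))
print("Altmetric Attention Score:", record.get("score"))
print("News outlets:", record.get("cited_by_msm_count", 0))
print("Policy sources:", record.get("cited_by_policies_count", 0))
print("Wikipedia pages:", record.get("cited_by_wikipedia_count", 0))
print("Details page:", record.get("details_url"))
```

Because the API returns volume counts only, the script deliberately prints no sentiment or quality judgment; that interpretive work is exactly what the source-card activity asks students to do.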

For one particularly complex metric source card, Metric Source Card 1, which details the attention and metrics surrounding the research article “Primary prevention of cardiovascular disease with a Mediterranean diet” [6], students would likely discover that the article has been retracted but also that it has been picked up by 198 news outlets. News headlines differ in sentiment depending on the date of publication. For example, a 2013 NPR headline states, “Spanish Test: Mediterranean Diet Shines in Clinical Study” [7], but five years later another NPR headline states, “Errors Trigger Retraction of Study on Mediterranean Diet’s Heart Benefits” [8]. Several more headlines from 2018 and 2019 reflect the public’s confusion about dietary science and advice. Although there were major errors in the research methodology, the enormous public attention to the research prompted the editors to invite the researchers to revise and resubmit the article in 2018; the abstract of the revised article is included on the source card and states that “results were similar after the omission of 1588 participants whose study-group assignments were known or suspected to have departed from protocol.” A Fortune news article on the revised version explains: “People in the study [ . . . ] had fewer strokes and heart attacks than those who weren’t put on such a diet. However, the study no longer says this applies generally to reducing such risk for this class of people—it’s applicable to participants but not more broadly without additional research” [9]. Covering an example like this in the classroom is doubly valuable: it lets students trace the online attention and evaluate the sources, and it provides real-life examples of research ethics and the retraction process, which can elicit active, engaged discussion.
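Because retraction status is central to this source card, it is worth noting that retractions can also be traced programmatically. Crossref's public REST API supports an updates filter that returns editorial updates (retraction notices, corrections, errata) pointing at a given DOI; the sketch below is a minimal example using this article's DOI, though the exact shape of the update-to metadata can vary by record.

```python
import json
import urllib.request

# Crossref's public REST API: the `updates` filter returns records (retraction
# notices, corrections, errata) that declare themselves updates to a given DOI.
DOI = "10.1056/NEJMoa1200303"
URL = f"https://api.crossref.org/works?filter=updates:{DOI}"

with urllib.request.urlopen(URL) as response:
    items = json.load(response)["message"]["items"]

if not items:
    print("No editorial updates recorded for this DOI.")

for item in items:
    # Each `update-to` entry names the DOI being updated and the update type.
    for update in item.get("update-to", []):
        if update.get("DOI", "").lower() == DOI.lower():
            print(f"{update.get('type')} notice: https://doi.org/{item['DOI']}")
```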

While teaching altmetrics as a tool for evaluating information, even in the undergraduate classroom, is a relatively novel approach, it is important to note that some academics have been critical of altmetrics. The authors of one BMJ blog post argued that altmetrics should not be used as a filter to find “the best” research; they even used the aforementioned Mediterranean diet article as an example (before it was retracted), which at the time had the second-highest Altmetric Attention Score [10]. This is a fair point: no metric should be used in isolation, nor taken at face value. For instance, Andrew Wakefield’s now-retracted and fraudulent research article linking the MMR vaccine to autism has 3500 citations on Google Scholar, but most of those citations are negative and reveal the massive amount of time, energy, and resources the academic and medical communities have spent debunking his claims [11]. In the classroom, further investigation into citation context (for example, through scite) and into post-publication peer review (for example, via PubPeer) can encourage more in-depth critical appraisal of the academic discussion surrounding an individual research output.
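As a rough illustration of what citation-context investigation looks like in practice, the sketch below queries the tallies endpoint behind scite's citation badges, which reports how citing papers engage with an article. The endpoint path and response fields here are assumptions based on scite's badge service and may differ from its current API; PubPeer discussions, by contrast, are most easily found by simply searching an article's DOI on pubpeer.com.

```python
import json
import urllib.error
import urllib.request

# Hypothetical sketch: scite's tallies endpoint (the one behind its citation
# badges) reports how citing papers engage with an article. The endpoint path
# and response fields are assumptions and may differ from scite's current API.
DOI = "10.1056/NEJMoa1200303"
URL = f"https://api.scite.ai/tallies/{DOI}"

try:
    with urllib.request.urlopen(URL) as response:
        tallies = json.load(response)
except urllib.error.HTTPError as err:
    raise SystemExit(f"No scite tallies available for this DOI ({err.code})")

# Print whatever counts come back (typically supporting, mentioning, and
# contrasting citations) rather than assuming exact field names.
for field, value in sorted(tallies.items()):
    print(f"{field}: {value}")
```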

We do not believe metrics should be used as quality or impact filters, which is why it is so crucial to teach students, especially graduate students with ambitions for academic careers, the context, caveats, and limitations of bibliometrics and altmetrics. By understanding this complexity, students can responsibly use online research tools such as altmetrics to critically evaluate and explore the scholarly conversation, and thus engage in informed discussions.

Finally, altmetrics are still new (almost ten years old), and their limitations and caveats should be addressed in the classroom. For example, an article with attention in public policy and Wikipedia is different from an article with excessive Twitter attention. Discussing why certain articles attract excessive online attention, for example via the Altmetric Top 100, can also help students understand media hype and trending topics, such as diet fads. In addition, research outputs published more than ten years ago are unlikely to have as much online attention as newer research, simply because standard practices for linking to research articles had not yet been established. Effective instruction can elicit discussion of these caveats rather than overlook them. We hope our approach is only the beginning of using research impact indicators in information literacy instruction.

Acknowledgment

We would like to extend special thanks to Elizabeth Dobbins and Brooke Taxakis, who taught us that using source cards to teach information and digital literacy skills in the classroom is a great way to guide and engage students learning new research skills and techniques. Their presentation at The Innovative Library Classroom (TILC) last year used a similar approach through problem-based learning. Their source cards show the context surrounding a specific information source, such as a newspaper article or industry report, and their flipped-classroom approach elicits discussion and structured exploration of the information sources [12,13].

  1. Altmetric. (2015, July 9). Altmetric Sources of Attention. Altmetric. https://www.altmetric.com/about-our-data/our-sources/
  2. Plum Analytics. (n.d.). PlumX Metrics—The Five Categories. Plum Analytics. https://plumanalytics.com/learn/about-metrics/
  3. Association of College and Research Libraries. (2015, February 9). Framework for Information Literacy for Higher Education. ACRL. http://www.ala.org/acrl/standards/ilframework
  4. JISC. (n.d.). Developing digital literacies. JISC. https://www.jisc.ac.uk/guides/developing-digital-literacies
  5. Blakeslee, S. (2004). The CRAAP Test. LOEX Quarterly, 31(3). https://commons.emich.edu/loexquarterly/vol31/iss3/4
  6. Estruch, R., et al. (2013). Primary Prevention of Cardiovascular Disease with a Mediterranean Diet. New England Journal of Medicine, 368(14), 1279–1290. https://doi.org/10.1056/NEJMoa1200303
  7. Hensley, S. (2013, February 25). Spanish Test: Mediterranean Diet Shines in Clinical Study. NPR. https://www.npr.org/sections/health-shots/2013/02/25/172872408/spanish-test-mediterranean-diet-shines-in-clinical-study
  8. McCook, A. (2018, June 13). Errors Trigger Retraction of Study on Mediterranean Diet’s Heart Benefits. NPR. https://www.npr.org/sections/health-shots/2018/06/13/619619302/errors-trigger-retraction-of-study-on-mediterranean-diets-heart-benefits
  9. Fleishman, G. (2018, June 14). Mediterranean Diet Study Walks Back Strongest Claim. Fortune. https://fortune.com/2018/06/14/mediterranean-diet-study-walks-back-strongest-claim-heres-what-researchers-got-wrong/
  10. Colquhoun, D., & Plested, A. (n.d.). Why Altmetrics is bad for science—And healthcare. The BMJ. https://blogs.bmj.com/bmj/2014/05/07/david-colquhoun-and-andrew-plested-why-altmetrics-is-bad-for-science-and-healthcare/
  11. IFLS. (n.d.). Data On 23 Million Children Shows No Link Between Autism and MMR Vaccine. IFLScience. https://www.iflscience.com/health-and-medicine/data-on-23-million-children-shows-no-link-between-autism-and-mmr-vaccine/

Rachel Miles is the Research Impact Librarian at Virginia Tech. She assists researchers and administrators with research assessment, researcher profiles, Open Access efforts, and research communication. Her research focuses primarily on how researchers perceive research assessment and use research impact indicators. She received her MLS from Emporia State University in 2015.

http://orcid.org/0000-0002-8834-4304

Amanda B. MacDonald is the Undergraduate Research Services Librarian at Virginia Tech. Her work and research focus on creating openly accessible resources to support students and faculty engaging with formal undergraduate research experiences. Prior to VT, she served as the QEP Undergraduate Research Librarian at Louisiana State University and taught composition as an adjunct faculty member at Coastal Carolina University. She earned an MSLS from UNC Chapel Hill (2014) and an MA in English from Auburn University (2010).

Unless it states otherwise, the content of the Bibliomagician is licensed under a Creative Commons Attribution 4.0 International License.
