The Bibliomagician seeks honorary Blog Manager

The Bibliomagician blog seeks to provide comment and guidance for practitioners engaging with bibliometrics. We are now seeking an enthusiastic volunteer to take The Bibliomagician Blog to the next level! The post-holder would sit on the ‘light-touch’ Lis-Bibliometrics Committee and be responsible for:

  • Soliciting relevant content for the Blog
  • Reviewing contributions
  • Scheduling blog posts
  • Managing the resources pages
  • Reporting back to the Committee

This would be a great opportunity for someone seeking to make their mark on the world of bibliometrics, whilst developing their bibliometric knowledge, social media and editorial skills within a friendly and supportive environment. For more information, or to express an interest in this role, please get in touch with Elizabeth Gadd, Chair of the Lis-Bibliometrics Forum.

UK universities respond to the Metric Tide’s call to do metrics responsibly

In September 2015 I ran a short survey to see how institutions were responding to the Metric Tide report's call to take a responsible approach to metrics and to consider signing the San Francisco Declaration on Research Assessment (DORA). Only three of the survey respondents had signed DORA (although five were considering it), but nine respondents were thinking about developing their own set of principles for the responsible use of metrics. In the year that followed there was very little movement on signing DORA. However, support for one of the key elements of DORA – a backlash against the Journal Impact Factor – has grown, and with it further condemnation of HEIs that have not signed. However, as indicated in an earlier blog post, not every institution that has failed to sign DORA has failed to do anything. With an ear to the ground via the Lis-Bibliometrics and ARMA Metrics SIG lists, I was aware of a growing movement towards the development of in-house Statements of Responsible Metrics. So, I put together a second survey to capture some of this activity.

Measuring tape. Credit: Laineys Repertoire CC-BY

The second survey had slightly more respondents (26 versus the original 22), but the low response rate still indicates that this is a fledgling area. This was further confirmed by the fact that none of the respondents were yet at a stage where they could say they had completed their own set of published principles (although one said they had agreed some principles internally but were not intending to publish them). However, compared to a year ago, when just nine were thinking of developing their own principles, seven were now actively developing these and a further five were thinking about it. Only two had considered developing their own principles and rejected the idea, compared to five who had considered signing DORA and rejected it last year.

Of the 13 who were developing their own set of principles (or had considered this internally), no-one said they were basing these on DORA. Instead, over half (8) said they were using the Leiden Manifesto as their basis, and two indicated they were basing their work on the principles of another university. One respondent said the Metric Tide Responsible Metrics Framework was informing their thinking. As last year, the development of principles was being guided by a range of staff, most of whom came from the Research Office, senior management and the Library, although in different proportions.

So, the sector is stirring, and The Bibliomagician blog is starting to document all the Statements of Responsible Metrics it can find. Other moves are also afoot to help universities to do metrics responsibly and well. The Association for Research Managers and Administrators (ARMA) is offering a course to its members on the Responsible Use of Metrics in May 2017, and the Lis-Bibliometrics Forum is planning an event on this theme in September 2017. The Lis-Bibliometrics Forum is also facilitating some Elsevier-sponsored work on developing a set of bibliometric competencies for practitioners, to ensure that the staff supporting metrics in their institutions have the skills and knowledge they need. This work should report in January 2017, and a workshop is planned at the UKSG conference in April. Of course, the biggest news in this area is the recent launch of the Responsible Metrics Forum by a group of research funders, sector bodies and infrastructure experts. We're on our way! And 2017 should see an even greater set of resources available to practitioners wanting to establish responsible metrics principles and practices at their institutions.

Job Opportunity: Copyright and Scholarly Communications Manager, University of Salford

We are currently advertising for a permanent full time Copyright and Scholarly Communications Manager at the University of Salford.

The University Library is looking to appoint a highly motivated individual to the position of Copyright and Scholarly Communications Manager. You will be responsible for the management and development of our copyright service and the copyright licence for the University of Salford. You will ensure that the University complies with the law through appropriate copyright licensing agreements, and that the Library offers an excellent training, advice and referral service on copyright to members of the University. You will ensure that our research outputs are copyright compliant and that Salford researchers are managing the copyright of their own outputs to the maximum benefit of themselves, the University of Salford and the open access agenda to which the University is committed.

You will work collaboratively to maximise Salford's engagement with the opportunities, challenges and changes taking place in the scholarly communications environment, including open access, research information systems and repositories, research data management, digital scholarship and digital humanities. You will develop and deliver outreach programmes to build engagement and compliance with University of Salford and external funder policies and requirements. This role provides key support towards achieving Salford's strategic aspirations for the next Research Excellence Framework.

You must have excellent communication skills, be confident in working with academic and professional services colleagues, and be able to manage your time and prioritise tasks effectively. You will be highly organised, flexible, innovative and evidence-based, with a passion for improving services and the student and staff experience at Salford. You will thrive in a dynamic academic environment, adapting to new roles and working practices as needs emerge, and contribute effectively and creatively to wider library and university planning and delivery. You will have an excellent understanding of copyright law, fair use, fair dealing, intellectual property rights and specific licensing arrangements, and a good understanding of the wider scholarly communications landscape.

Organisational Unit – The Library

Grade 7

Salary – £31,655 – £37,768

Contract type – Permanent

Hours – Full time

Closing date – 4 Sep 2016

Location – Main Salford Campus

 

For further specific details and to apply, please go to the University of Salford’s job pages.

Informal enquiries can be made to Helen McEvoy on (0161) 2952445 or email h.mcevoy@salford.ac.uk

Regards,

Helen McEvoy 

Academic Support Manager (Research)  |  The Library

Room 210, Clifford Whitworth Library, University of Salford, Salford  M5 4WT

t: +44 (0) 161 2952445

h.mcevoy@salford.ac.uk | www.salford.ac.uk

Why should a bibliometrician engage with altmetrics? Guest Post by Natalia Madjarevic

Last month, Barack Obama published an article in the journal JAMA discussing progress to date with The Affordable Care Act – or Obamacare – and outlining recommendations for future policy makers. Obama's article was picked up in the press and across social media immediately. We can see in the Altmetric Details Page that it was shared across a broad range of online attention sources, such as mainstream media, Twitter, Facebook and Wikipedia, and commented on by several research blogs. We can also see from the stats provided by JAMA that the article, at the time of writing, has been viewed over 1 million times and has an Altmetric Attention Score of 7539, but hasn't yet received a single citation.

Providing instant feedback

Many altmetrics providers track attention to a research output as soon as it’s available online. This means institutions can then use altmetrics data to monitor research engagement right away, without the delay we often see in the citation feedback loop.

If President Obama were checking his Altmetric Details Page (which I hope he was!) he'd have known almost in real time exactly who was saying what about his article. In the same way, academic research from your institution is generating online activity – probably right now – and can provide extra insights to help enhance your bibliometric reporting.


Altmetric, which has tracked mentions and shares of over 5.4m individual research outputs to date, sees 360 mentions per minute – a huge amount of online activity that can be monitored and reported on to help evidence additional signals of institutional research impact. That said, altmetrics are not designed to replace traditional measures such as citations and peer review; it's valuable to report on a broad range of indicators, treating altmetrics as complementary to traditional bibliometrics rather than a replacement for them.

Altmetrics reporting: context is key

Using a single number, “This output received 100 citations” or “This output has an Altmetric Attention Score of 100” doesn’t really say that much. That’s why altmetrics tools often focus on pulling out the qualitative data, i.e. the underlying mentions an output has received. Saying, “This output has an Altmetric Attention Score of 100, was referenced in a policy document, tweeted by a medical practitioner and shared on Facebook by a think tank” is much more meaningful than a single number. It also tells a much more compelling story about the influence and societal reach of your research. So when using altmetrics data, zoom in and take a look at the mentions. That’s where you’ll find the interesting stories about your research attention to include in your reporting.

How can you use altmetrics to extend your bibliometrics service?

Here are some ideas:

  • Include altmetrics data in your monthly bibliometric reports to demonstrate societal research engagement – pull out some qualitative highlights
  • Embed altmetrics in your bibliometrics training sessions and welcome emails to new faculty – we have lots of slides you can re-use here
  • Provide advice to researchers on how to promote themselves online and embed altmetrics data in their CV
  • Encourage responsible use of metrics as discussed in the Leiden Manifesto and The Metric Tide
  • Don’t use altmetrics as a predictor for citations! Use them instead to gain a more well-rounded, coherent insight into engagement and dissemination of your research

Altmetrics offer an opportunity for bibliometricians to extend existing services and provide researchers with more granular and informative data about engagement with their research. The first step is to start exploring the data – from there you can determine how it will fit best into your current workflow and activities.
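As a very rough illustration of the "zoom in on the mentions" advice above, the sketch below (in Python) turns an Altmetric-style attention record into a one-line narrative summary suitable for a report. This is a hypothetical example, not Altmetric's own tooling: the field names (`score`, `cited_by_tweeters_count`, and so on) are assumptions loosely modelled on Altmetric's public API and should be checked against the current API documentation, and the sample record is invented.

```python
# Sketch: summarise an Altmetric-style attention record as a readable
# sentence for a bibliometric report. Field names are assumptions
# modelled on Altmetric's public API; the sample record is invented.

def summarise_attention(record):
    """Build a one-line narrative summary from an attention record."""
    # Map (assumed) API count fields to human-readable source labels.
    sources = [
        ("cited_by_policies_count", "policy documents"),
        ("cited_by_tweeters_count", "tweeters"),
        ("cited_by_fbwalls_count", "Facebook pages"),
        ("cited_by_wikipedia_count", "Wikipedia pages"),
        ("cited_by_feeds_count", "blogs"),
    ]
    # Keep only the sources that actually mentioned the output.
    parts = [
        f"{record[key]} {label}"
        for key, label in sources
        if record.get(key, 0) > 0
    ]
    score = record.get("score", 0)
    mentions = ", ".join(parts) if parts else "no tracked mentions yet"
    return f"Attention score {score}; mentioned by {mentions}."

# Invented sample record, loosely inspired by the Obama/JAMA example.
sample = {
    "score": 7539,
    "cited_by_policies_count": 1,
    "cited_by_tweeters_count": 42000,
    "cited_by_wikipedia_count": 3,
}

print(summarise_attention(sample))
```

A real workflow would fetch the record from the provider's API per DOI rather than hard-coding it; the point here is simply that the qualitative breakdown, not the headline score, is what makes the report meaningful.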

Natalia Madjarevic

@nataliafay

A useful tool for librarians: metrics knowledge in bite-sized pieces By Jenny Delasalle

Having worked in UK academic libraries for 15 years before becoming freelance, I saw the rise and rise of citation counting (although, as Geoffrey Bilder points out, it should rightly be called reference counting). Such counting, I learnt, was called "bibliometrics". The very name sounds like something that librarians should be interested in, if not expert at, and so I delved into what these metrics were and how they might help me and the users of academic libraries. It began with the need to select which journals to subscribe to, and it became a filter for readers to select which papers to read. Somewhere along the road, it became a measurement of individual researchers, and a component of university rankings: such metrics were gaining attention.

Then along came altmetrics, offering tantalising glimpses of something more than the numbers: real stories of impact that could be found through online tracking. Context was clearly key with these alternative metrics, and the doors were opened wide as to what could be measured and how.

It is no surprise that after working in subject support roles I became first an innovation officer, then institutional repository manager and then a research support manager: my knowledge of and interest in such metrics was useful in that context. Yet I was mostly self-taught: I learnt through playing with new tools and technologies, by attending training sessions from product providers and by reading the published literature. I'm still learning. The field of scholarly metrics moves on quickly as new papers are published, new tools are launched onto the market, university management and funders become more interested in the field, and scholars themselves respond.

It took a lot of time and effort for me to learn this way; that was appropriate for my career path, but it cannot be expected of all librarians. For example, subject or liaison librarians work with scholars directly, and those scholars might also be interested in metrics, especially those available on platforms that the library subscribes to. Yet these same librarians must also quickly become experts in changing open access practices, data management needs and other concerns of their scholarly population, whilst teaching information literacy to undergraduates and maintaining both the library's book collections in their subject area and their own knowledge of the disciplines that they support. They have a lot of areas of expertise to keep up to date, as well as a lot of work to do. And there are new, trainee librarians who have a lot to learn from our profession. How can we save their time?

I began collaborating with Library Connect because that's exactly what they seek to do: support busy librarians. Colleen DeLory, the editor of Library Connect, has her ear to the ground regarding librarians' needs, and she has some great ideas about what we could use. I started by presenting in a webinar, "Librarians and altmetrics: tools, tips and use cases", and I went on to do the research behind an infographic, "Librarians and Research Impact," about the role of a librarian in supporting research impact. Another webinar came along, "Research impact metrics for librarians: calculation & context", and then the very latest and, in my opinion, most useful output of my work with Elsevier is our poster on research metrics.

Quick Reference Cards for Research Impact Metrics

This beautifully illustrated set can be printed as a poster, which makes a great starting point for anyone new to such metrics, or indeed anyone seeking to untangle the very complex picture of metrics that they have been glimpsing for some years already! You could put it up in your library office or in corridors, and you can also reproduce it on your intranet – just include a link back to Library Connect as your source.

You can also print our set out as cards which would be really useful in training sessions. You could use them to form discussion groups by giving each participant a card and then asking people to form groups according to which card they have: is their metric one for authors, one for documents or one for journals? Some people will find that they belong to more than one group, of course! The groups could possibly then discuss the metrics that they have between them, sharing their wider knowledge about metrics as well as what is on the cards. Do the groups agree which metrics are suited to which purposes, as listed across the top of the poster? What else do they know or need to know about a metric? Beyond such a guided discussion, the cards could be sorted in order of suitability for a given purpose, perhaps sticking them onto a wall underneath a proposed purpose as a heading. The groups could even create their own “cards” for additional metrics to stick on the wall(s!), then the groups would visit each other’s listings after discussion… We’d love to hear about how you’re using the cards: do leave a comment for us over at Library Connect.

Of course our set is not comprehensive: there are lots of other metrics, but the ones chosen are perhaps those that librarians will most frequently come across. The aspects of the metrics presented on the poster/cards were also carefully chosen. We've suggested the kind of contexts in which a librarian might turn to each metric. We've carefully crafted definitions of metrics, and provided useful links to further information. And we've introduced the kind of groupings that each metric applies to, be it single papers, all of an author's output, or a serial publication. It was a truly collaborative output, with brainstorming of the initial idea, research from me, and then over to Colleen DeLory to coordinate the graphics and internal review by Elsevier metrics expert Lisa Colledge, back to me to check it over, then with Library Connect again for proofreading and even a preview for a focus group of librarians. It has been a thorough production, and I'm very proud to have been involved in something that I believe is truly useful.

@JennyDelasalle

Freelance Librarian, Instructor & Copywriter

Job: Scholarly Communications Librarian

Scholarly Communications Librarian, University College Dublin (UCD) Library (permanent)
Applicants are invited to apply for the permanent role of Scholarly Communications Librarian within UCD Library. The successful candidate will demonstrate an enthusiasm for the changing information landscape and the ongoing development of library services, including the implementation of new processes and services. This is a developing area and offers great scope for an individual to provide leadership in this fast-changing arena. There will be opportunities for skills development specific to this post, and also in related areas to support the overall work of the Research Services unit. See the job description for further details.

Salary Scale – €35,043 – €50,446 per annum (appointment will be made on scale and in accordance with Department of Finance guidelines)

Closing Date – 17:00 GMT on 15 July 2016

More details are available at: https://hrportal.ucd.ie/pls/corehrrecruit/erq_jobspec_version_4.display_form

Is signing DORA that responsible?

Tape measure by Marcy Leigh CC-BY-NC-SA

So, I’ve just returned from the excellent JISC/CNI conference on advances in digital scholarship where they had a panel session on metrics.  Once again, there were tutting noises from the front about the “disappointing” number of UK universities that hadn’t signed up to DORA.  Stephen Curry’s latest blog post urging universities to sign was highlighted and others even tweeted that the lack of UK university engagement with DORA was a “disgrace”.

 

Now, I come from one of those “disgraceful” universities that haven't signed. To be strictly accurate, we've given DORA serious consideration and decided not to sign. And at the root of our decision was not a feeling that DORA goes too far, but that it doesn't go far enough. Sure, DORA addresses a false reliance on the Journal Impact Factor, but it makes no mention of a whole host of other important principles around the responsible use of metrics, as outlined in the Leiden Principles. As I've written in another blog post, were the Leiden Principles available for signing, I wouldn't hesitate.

So, instead of signing DORA we have formed a working group of senior managers and academic leads to give proper consideration to the broad range of issues around the responsible use of metrics. We have used the measured and wide-ranging Leiden Principles as our guide. We have done our own analyses. We have thought deeply about the university's strategic aims and principles, and what impact the use of bibliometrics might have on those. We have generated a list of indicators that we think could be applied most fairly, and written detailed caveats about their use. It's been a lot of work. A lot more work than entering the words "Loughborough University" into a website and clicking send. And I know from working amongst a community of other bibliometricians around the UK that Loughborough is not alone in choosing this steep and thorny way over the primrose path of dalliance.

Thus, I think we need to be careful not to read too much into the absence of a particular university's name from the list of DORA signatories. Yes, it might be that some institutions have not given the responsible use of metrics a moment's thought, but others will have given it a lot more thought than those who have blithely signed DORA. At the end of the day we need to ask whether signing the DORA principles will give us a better outcome than developing our own locally relevant, properly debated, carefully implemented and monitored principles. And for us, it is undoubtedly the latter.

Having run a survey on the adoption of DORA last year, we're now keen to see what advances the UK university community has made along the road to developing statements around the responsible use of metrics. Whatever situation you find yourself in, we'd welcome your participation in a brief five-question survey: https://www.surveymonkey.co.uk/r/BSP6QH8

The results will be shared in a future blog post.

Thank you!

Elizabeth Gadd