My double life: playing and changing the scholarly communications game. By Lizzie Gadd

I love my job. I work as a “Research Policy Manager (Publications)” at Loughborough University, and I spend my time understanding and advising on how we can improve the quality and visibility of our research. However, the strategies for achieving this aren’t as straightforward as you might think. And increasingly I feel like I’m leading a double life, seeking both to play and to change the scholarly communication game.


‘Communication’ by Jackie Finn-Irwin CC-BY 2.0

 

What do I mean by this? Well, the game we’re in is one where publications mean prizes. If others rate them (e.g. in the REF) or cite them (as measured by the University League Tables), you win. To be a winner, we know you need to produce quality research (of course), collaborate internationally (which improves quality and visibility), and publish in those journals indexed by the tools that both expose your research to the world and, importantly, do the citation measuring for the aforesaid REF and University League Tables. And although there is a huge backlash against using journal metrics as an indicator of the quality of the underlying research, there is no doubt that getting a paper into a journal with extremely rigorous quality standards still means something to academics and their peers.

So the current game is inherently tied up with journal publications.  And there are two well-rehearsed reasons why this is not a good thing. The first is that journals are expensive – and getting more expensive. The second reason is that journals are slow at communicating research results.  Publication delays of years are not uncommon. (There are of course other objections to journals, not least the murky waters about how an end-user may re-use journal content, but I won’t go into these here.)

This is why we need to change the game. And the best option we have for changing the game is to keep producing quality research and collaborating internationally, but to also create new means of scholarly communication that are neither expensive, nor slow.

Some might argue that you can have it both ways: publish in a journal which has a liberal green open access policy. This allows you to provide immediate access to the research through the preprint, and access to the peer-reviewed research through the postprint. And to be honest, this is the compromise we currently go for. But this form of open access is showing no signs of reducing the cost of subscriptions. Not all journals have liberal green open access policies. And not all academics want to release their preprint until it has been accepted by a journal, in case the paper is rejected – which rather defeats the object.

Now there are many alternative models of scholarly communication that ARE inexpensive and speed up publication.  These include preprint archives or ‘diamond’ open access journals that charge neither the author to submit nor the reader to read.  However, the problem is that these are not picked up by the citation benchmarking tools.  This is either because they are not journals at all (preprint archives) or because they are new entries to the market so have yet to prove their significance in the field and be selected for inclusion.

So what does a Research Policy Manager (Publications) do? Well, it seems to me that I have two equally unsatisfactory options. The first is to continue to play the journals game in order to ensure the citedness of our research is captured by the key citation benchmarking tools, but to encourage OA as a means of improving the visibility and discoverability of our work. Whilst this isn’t going to speed up publication or reduce our costs, I think the lure of a high-quality journal may well be a driver of research quality – which is very important.

The second option is to dramatically change our focus on to new forms of scholarly communication that will speed up publication rates and reduce our costs, such as preprint archives and diamond OA journals.  And by so doing, we’d need to hope that the well-documented citation advantage for immediately open research will do its thing. And that when the research is considered by the REF, they really will just focus on the content as they promise, and not the reputation of the vehicle it is published in.  Always bearing in mind that any citations that the research does accrue will only be picked up by open tools such as Google Scholar and not the tools that supply the REF – or the league tables.

So to answer my own question – what does a Research Policy Manager advise in these circumstances? Personally, I try to live with one whilst lobbying for the other, and as far as possible seek to ameliorate any confusion faced by our academics. This is easier said than done – certainly when faced with later-career academics who can remember a time when research was optional and where you published was entirely your business. To now be faced with a barrage of advice around improving the quality, accessibility, visibility, and citedness of your work – bearing in mind that the routes to these are often in conflict with each other – is a constant source of agony for both them and me.

I recognise that we have to play the game. Our reputation depends on it.  But we also have to change the game and provide quicker and more affordable access to (re-usable) research results. At the risk of sounding over-dramatic, the future may depend on it.

 

Elizabeth Gadd

The Bibliomagician seeks honorary Blog Manager

The Bibliomagician blog seeks to provide comment and guidance for practitioners engaging with bibliometrics. We are now seeking an enthusiastic volunteer to take The Bibliomagician Blog to the next level! The post-holder would sit on the ‘light-touch’ Lis-Bibliometrics Committee and be responsible for:

  • Soliciting relevant content for the Blog
  • Reviewing contributions
  • Scheduling blog posts
  • Managing the resources pages
  • Reporting back to the Committee

This would be a great opportunity for someone seeking to make their mark on the world of bibliometrics, whilst developing their bibliometric knowledge, social media and editorial skills within a friendly and supportive environment. For more information or to express an interest in this role, please get in touch with Elizabeth Gadd, Chair of the Lis-Bibliometrics Forum.

UK universities respond to the Metric Tide’s call to do metrics responsibly

In September 2015 I ran a short survey to see how institutions were responding to the Metric Tide report’s call to take a responsible approach to metrics and to consider signing the San Francisco Declaration on Research Assessment (DORA). Only three of the survey respondents had signed DORA (although five were considering it), but nine respondents were thinking about developing their own set of principles for the responsible use of metrics. In the year that followed there was very little movement on signing DORA. However, support for one of the key elements of DORA – a backlash against the Journal Impact Factor – has grown, and with it further condemnation of HEIs that have not signed. Yet, as indicated in an earlier blog post, not every institution that has failed to sign DORA has failed to do anything. With an ear to the ground via the Lis-Bibliometrics and ARMA Metric SIG lists, I was aware of a growing movement towards the development of in-house Statements of Responsible Metrics. So, I put together a second survey to capture some of this activity.

Credit: Laineys Repertoire CC-BY

The second survey had slightly more respondents (26 versus the original 22), but the low response rate still indicates that this is a fledgling area. This was further confirmed by the fact that none of the respondents were yet at a stage where they could say they had completed their own set of published principles (although one said they had agreed some principles internally and did not intend to publish them). However, compared to a year ago, when just nine were thinking of developing their own principles, this year seven were actively developing them and a further five were thinking about it. Only two had considered developing their own principles and rejected the idea, compared to five who had considered signing DORA last year and rejected it.

Of the 13 who were developing their own set of principles (or had considered this internally), none said they were basing these on DORA. Instead, over half (8) said they were using the Leiden Manifesto as the basis, and two indicated they were basing their work on the principles of another university. One respondent said the Metric Tide Responsible Metrics Framework was informing their thinking. As last year, the development of principles was being guided by a range of staff, most of whom were from the Research Office, Senior Managers and the Library, although in different proportions.

So, the sector is stirring, and The Bibliomagician blog is starting to document all the Statements of Responsible Metrics it can find. Other moves are also afoot to help universities do metrics responsibly and well. The Association for Research Managers and Administrators (ARMA) is offering a course to its members on the Responsible Use of Metrics in May 2017, and the Lis-Bibliometrics Forum is planning an event on this theme in September 2017. The Lis-Bibliometrics Forum is also facilitating some Elsevier-sponsored work on developing a set of bibliometric competencies for practitioners, to ensure the staff supporting metrics in their institutions have the skills and knowledge they need. This should report in January 2017, and a workshop is planned at the UKSG conference in April. Of course, the biggest news in this area is the recent launch of the Responsible Metrics Forum by a group of research funders, sector bodies and infrastructure experts. We’re on our way! And 2017 should see an even greater set of resources available to practitioners wanting to establish responsible metrics principles and practices at their institution.

Job Opportunity: Copyright and Scholarly Communications Manager, University of Salford

We are currently advertising for a permanent full time Copyright and Scholarly Communications Manager at the University of Salford.

The University Library is looking to appoint a highly motivated individual to the position of Copyright and Scholarly Communications Manager. You will have responsibility for the management and development of our copyright service and the copyright licence for the University of Salford. You will ensure that the University complies with the law through appropriate copyright licensing agreements, and that the Library offers an excellent training, advice and referral service on copyright to members of the University in support of this. You will ensure that our research outputs are copyright compliant and that Salford researchers are managing the copyright of their own outputs to the maximum benefit of themselves, the University of Salford and the open access agenda the University is committed to.

You will work collaboratively to maximise Salford’s engagement with the opportunities, challenges and changes taking place in the scholarly communications environment, including open access, research information systems and repositories, research data management, digital scholarship and digital humanities. You will develop and deliver outreach programmes to build engagement and compliance with University of Salford and external funder policies and requirements. This role provides key support towards achieving Salford’s strategic aspirations for the next Research Excellence Framework.

You must have excellent communication skills, be confident in working with academic and professional services colleagues, and be able to manage your time and prioritise tasks effectively. You will be highly organised, flexible, innovative and evidence-based, with a passion for improving services and the student and staff experience at Salford. You will thrive in a dynamic academic environment, adapting to new roles and working practices as needs emerge, and contribute effectively and creatively to wider library and university planning and delivery. You will have an excellent understanding of copyright law, fair use, fair dealing, intellectual property rights and specific licensing arrangements, and a good understanding of the wider scholarly communications landscape.

Organisational Unit – The Library

Grade 7

Salary – £31,655 – £37,768

Contract type – Permanent

Hours – Full time

Closing date – 4 Sep 2016

Location – Main Salford Campus

 

For further specific details and to apply, please go to the University of Salford’s job pages.

Informal enquiries can be made to Helen McEvoy on (0161) 2952445 or email h.mcevoy@salford.ac.uk

Regards,

Helen McEvoy

 


Why should a bibliometrician engage with altmetrics? Guest Post by Natalia Madjarevic

Last month, Barack Obama published an article in the journal JAMA discussing progress to date with The Affordable Care Act – or Obamacare – and outlining recommendations for future policy makers. Obama’s article was picked up in the press and across social media immediately. We can see in the Altmetric Details Page that it was shared across a broad range of online attention sources such as mainstream media, Twitter, Facebook and Wikipedia, and commented on by several research blogs. We can also see from the stats provided by JAMA that the article, at time of writing, has been viewed over 1 million times and has an Altmetric Attention Score of 7539, but hasn’t yet received a single citation.

Providing instant feedback

Many altmetrics providers track attention to a research output as soon as it’s available online. This means institutions can then use altmetrics data to monitor research engagement right away, without the delay we often see in the citation feedback loop.

If President Obama were checking his Altmetric Details Page (which I hope he did!) he’d have known almost in real time exactly who was saying what about his article. In the same way, academic research from your institution is generating online activity – probably right now – and can provide extra insights to help enhance your bibliometric reporting.
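To make this concrete, here is a minimal sketch of how an institution might poll attention data for a single output. It is not an official integration: it assumes the free public Altmetric API (the v1 DOI endpoint at api.altmetric.com) and field names such as score, cited_by_tweeters_count and cited_by_msm_count as publicly documented at the time of writing, and the DOI shown is a placeholder, so check the current documentation before relying on any of this.

# Minimal sketch (assumptions as described above); requires the requests library.
import requests

ALTMETRIC_DOI_ENDPOINT = "https://api.altmetric.com/v1/doi/{doi}"

def fetch_attention(doi):
    """Return the Altmetric record for a DOI as a dict, or {} if none exists yet."""
    response = requests.get(ALTMETRIC_DOI_ENDPOINT.format(doi=doi), timeout=10)
    if response.status_code == 404:
        return {}  # no tracked attention for this output (yet)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    record = fetch_attention("10.1234/example-doi")  # placeholder DOI, for illustration only
    if record:
        print("Altmetric Attention Score:", round(record.get("score", 0)))
        print("Tweeters:", record.get("cited_by_tweeters_count", 0))
        print("News outlets:", record.get("cited_by_msm_count", 0))
    else:
        print("No online attention recorded for this DOI yet.")

Because attention is tracked as soon as an output appears online, a script like this could be run daily against newly deposited DOIs to feed early engagement signals into routine reporting.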


Altmetric, which has tracked mentions and shares of over 5.4m individual research outputs to date, sees 360 mentions per minute – a huge amount of online activity that can be monitored and reported on to help evidence additional signals of institutional research impact. That said, altmetrics are not designed to replace traditional measures such as citations and peer review, and it is valuable to report on a broad range of indicators: altmetrics are complementary to, rather than a replacement for, traditional bibliometrics.

Altmetrics reporting: context is key

A single number – “This output received 100 citations” or “This output has an Altmetric Attention Score of 100” – doesn’t really say that much. That’s why altmetrics tools often focus on pulling out the qualitative data, i.e. the underlying mentions an output has received. Saying “This output has an Altmetric Attention Score of 100, was referenced in a policy document, tweeted by a medical practitioner and shared on Facebook by a think tank” is much more meaningful than a single number, and it tells a much more compelling story about the influence and societal reach of your research. So when using altmetrics data, zoom in and take a look at the mentions. That’s where you’ll find the interesting stories about your research attention to include in your reporting.
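As a small illustration of that “zoom in” step, the sketch below turns per-source mention counts into the kind of sentence you might drop into a report. The field names are assumptions based on the same public API as the earlier sketch, and the counts are made up purely for illustration; adjust both to whatever your altmetrics provider actually returns.

# Hypothetical sketch: summarise where an output's attention came from.
SOURCE_LABELS = {
    "cited_by_policies_count": "policy documents",
    "cited_by_msm_count": "news outlets",
    "cited_by_tweeters_count": "Twitter accounts",
    "cited_by_fbwalls_count": "Facebook pages",
    "cited_by_feeds_count": "blogs",
    "cited_by_wikipedia_count": "Wikipedia pages",
}

def summarise_mentions(record):
    """Build a sentence such as 'Picked up by 1 policy documents, 120 Twitter accounts ...'."""
    parts = [
        f"{record[field]} {label}"
        for field, label in SOURCE_LABELS.items()
        if record.get(field, 0) > 0
    ]
    if not parts:
        return "No online mentions recorded yet."
    return "Picked up by " + ", ".join(parts) + "."

# Example with made-up counts, for illustration only.
print(summarise_mentions({
    "cited_by_policies_count": 1,
    "cited_by_tweeters_count": 120,
    "cited_by_feeds_count": 4,
}))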

How can you use altmetrics to extend your bibliometrics service?

Here are some ideas:

  • Include altmetrics data in your monthly bibliometric reports to demonstrate societal research engagement – pull out some qualitative highlights
  • Embed altmetrics in your bibliometrics training sessions and welcome emails to new faculty – we have lots of slides you can re-use here
  • Provide advice to researchers on how to promote themselves online and embed altmetrics data in their CV
  • Encourage responsible use of metrics as discussed in the Leiden Manifesto and The Metric Tide
  • Don’t use altmetrics as a predictor for citations! Use them instead to gain a more well-rounded, coherent insight into engagement and dissemination of your research

Altmetrics offer an opportunity for bibliometricians to extend existing services and provide researchers with more granular and informative data about engagement with their research. The first step is to start exploring the data – from there you can determine how it will fit best into your current workflows and activities.

Further reading

Natalia Madjarevic

@nataliafay

A useful tool for librarians: metrics knowledge in bite-sized pieces By Jenny Delasalle

Having worked in UK academic libraries for 15 years before becoming freelance, I saw the rise and rise of citation counting (although as Geoffrey Bilder points out, it should rightly be called reference counting). Such counting, I learnt, was called “bibliometrics”. The very name sounds like something that librarians should be interested in if not expert at, and so I delved into what they were and how they might help me and also the users of academic libraries. It began with the need to select which journals to subscribe to, and it became a filter for readers to select which papers to read. Somewhere along the road, it became a measurement of individual researchers, and a component of university rankings: such metrics were gaining attention.

Then along came altmetrics, offering tantalising glimpses of something more than the numbers: real stories of impact that could be found through online tracking. Context was clearly key with these alternative metrics, and the doors were opened wide as to what could be measured and how.

It is no surprise that after working in subject support roles I became first an innovation officer, then institutional repository manager and then a research support manager: my knowledge of and interest in such metrics was useful in that context. Yet I was mostly self-taught: I learnt through playing with new tools and technologies, by attending training sessions from product providers and by reading the published literature. I’m still learning. The field of scholarly metrics moves on quickly as new papers are published and new tools are launched onto the market, as university management and funders become more interested in the field, and as scholars themselves respond.

It took a lot of time and effort for me to learn this way, which was appropriate for my career path, but it cannot be expected of all librarians. For example, subject or liaison librarians work with scholars directly, and those scholars might also be interested in metrics, especially those available on platforms that the library subscribes to. Yet these same librarians must also quickly become experts in changing open access practices, data management needs and other concerns of their scholarly population, whilst teaching information literacy to undergraduates and maintaining both the library’s book collections in their subject area and their own knowledge of the disciplines that they support. They have a lot of areas of expertise to keep up to date, as well as a lot of work to do. And there are new, trainee librarians who have a lot to learn from our profession. How can we save their time?

I began collaborating with Library Connect because that’s exactly what they seek to do: support busy librarians. Colleen DeLory, the editor of Library Connect, has her ear to the ground regarding librarians’ needs and has some great ideas about what we could use. I started by presenting in a webinar, “Librarians and altmetrics: tools, tips and use cases”, and I went on to do the research behind an infographic, “Librarians and Research Impact”, about the role of a librarian in supporting research impact. Another webinar came along, “Research impact metrics for librarians: calculation & context”, and then the very latest and, in my opinion, most useful output of my work with Elsevier is our poster on research metrics.

Quick Reference Cards for Research Impact Metrics

This beautifully illustrated set can be printed as a poster, which makes a great starting point for anyone new to such metrics, or indeed anyone seeking to untangle the very complex picture of metrics that they have been glimpsing for some years already! You could put it up in your library office or in corridors, and you can also reproduce it on your intranet – just put a link back to Library Connect as your source.

You can also print our set out as cards, which would be really useful in training sessions. You could use them to form discussion groups by giving each participant a card and then asking people to form groups according to which card they have: is their metric one for authors, one for documents or one for journals? Some people will find that they belong to more than one group, of course! The groups could then discuss the metrics they have between them, sharing their wider knowledge about metrics as well as what is on the cards. Do the groups agree which metrics are suited to which purposes, as listed across the top of the poster? What else do they know or need to know about a metric? Beyond such a guided discussion, the cards could be sorted in order of suitability for a given purpose, perhaps by sticking them onto a wall underneath a proposed purpose as a heading. The groups could even create their own “cards” for additional metrics to stick on the wall(s!), and then visit each other’s listings after discussion… We’d love to hear about how you’re using the cards: do leave a comment for us over at Library Connect.

Of course our set is not comprehensive: there are lots of other metrics, but the ones chosen are perhaps those that librarians will most frequently come across. The aspects of the metrics presented on the poster/cards were also carefully chosen. We’ve suggested the kinds of context in which a librarian might turn to each metric. We’ve carefully crafted definitions of the metrics and provided useful links to further information. And we’ve introduced the kind of grouping that each metric applies to, be it a single paper, all of an author’s output or a serial publication. It was a truly collaborative output: brainstorming of the initial idea, research from me, then over to Colleen DeLory to coordinate the graphics and an internal review by Elsevier metrics expert Lisa Colledge, back to me to check it over, then to Library Connect again for proofreading, and even a preview for a focus group of librarians. It has been a thorough production and I’m very proud to have been involved in something that I believe is truly useful.

@JennyDelasalle

Freelance Librarian, Instructor & Copywriter

Job: Scholarly Communications Librarian

Scholarly Communications Librarian, University College Dublin (UCD) Library (permanent)
Applicants are invited to apply for the permanent role of Scholarly Communications Librarian within UCD Library. The successful candidate will demonstrate an enthusiasm for the changing information landscape and the ongoing development of library services, including the implementation of new processes and services. This is a developing area and offers great scope for an individual to provide leadership in this fast-changing arena. There will be opportunities for skills development specific to this post and also in related areas to support the overall work of the Research Services unit. See job description for further details.

Salary Scale – €35,043 – €50,446 per annum (appointment will be made on scale and in accordance with the Department of Finance guidelines)

Closing date – 17:00 GMT on 15 July 2016

More details are available at: https://hrportal.ucd.ie/pls/corehrrecruit/erq_jobspec_version_4.display_form