My double life: playing and changing the scholarly communications game. By Lizzie Gadd

I love my job.  I work as a “Research Policy Manager (Publications)” at Loughborough University, and I spend my time understanding and advising on how we can improve the quality and visibility of our research.  However, the strategies for achieving this aren’t as straightforward as you might think.  And increasingly I feel like I’m leading a double life, seeking both to play and to change the scholarly communication game.


‘Communication’ by Jackie Finn-Irwin CC-BY 2.0

 

What do I mean by this?  Well, the game we’re in is one where publications mean prizes. If others rate them (e.g. in the REF) or cite them (as measured by the University League Tables), you win. To be a winner, we know you need to produce quality research (of course), collaborate internationally (it improves quality and visibility), and publish in those journals that are indexed by the tools that expose your research to the world and, importantly, also do the citation measuring for the aforesaid REF and University League Tables.  And although there is a huge backlash against using journal metrics as an indicator of the quality of the underlying research, there is no doubt that getting a paper into a journal with extremely rigorous quality standards still means something to academics and their peers.

So the current game is inherently tied up with journal publications.  And there are two well-rehearsed reasons why this is not a good thing. The first is that journals are expensive – and getting more expensive. The second reason is that journals are slow at communicating research results.  Publication delays of years are not uncommon. (There are of course other objections to journals, not least the murky waters about how an end-user may re-use journal content, but I won’t go into these here.)

This is why we need to change the game. And the best option we have for changing the game is to keep producing quality research and collaborating internationally, but to also create new means of scholarly communication that are neither expensive, nor slow.

Some might argue that you can have it both ways: publish in a journal which has a liberal green open access policy.  This will allow you to provide immediate access to the research through the preprint, and access to the peer-reviewed research through the postprint.  And to be honest, this is the compromise we currently go for.  But this form of open access is showing no signs of reducing the cost of subscriptions.  And not all journals have liberal green open access policies. And not all academics want to release their preprint until it has been accepted by a journal, in case the paper is rejected – which rather defeats the object.

Now there are many alternative models of scholarly communication that ARE inexpensive and speed up publication.  These include preprint archives or ‘diamond’ open access journals that charge neither the author to submit nor the reader to read.  However, the problem is that these are not picked up by the citation benchmarking tools.  This is either because they are not journals at all (preprint archives) or because they are new entries to the market so have yet to prove their significance in the field and be selected for inclusion.

So what does a Research Policy Manager (Publications) do?  Well, it seems to me that I have two equally unsatisfactory options.  The first is to continue to play the journals game in order to ensure the citedness of our research is captured by the key citation benchmarking tools, but encourage OA as a means of improving the visibility and discoverability of our work.  Whilst this isn’t going to speed up publication or reduce our costs, I think the lure of a high-quality journal may well be a driver of research quality – which is very important.

The second option is to dramatically change our focus on to new forms of scholarly communication that will speed up publication rates and reduce our costs, such as preprint archives and diamond OA journals.  And by so doing, we’d need to hope that the well-documented citation advantage for immediately open research will do its thing. And that when the research is considered by the REF, they really will just focus on the content as they promise, and not the reputation of the vehicle it is published in.  Always bearing in mind that any citations that the research does accrue will only be picked up by open tools such as Google Scholar and not the tools that supply the REF – or the league tables.

So to answer my own question – what does a Research Policy Manager advise in these circumstances? Personally, I try to live with one whilst lobbying for the other, and as far as possible seek to ameliorate any confusion faced by our academics.  This is easier said than done – certainly when faced with later-career academics who can remember a time when research was optional and where you published was entirely your business.  To now be faced with a barrage of advice around improving the quality, accessibility, visibility, and citedness of your work, bearing in mind that the routes to these are often in conflict with each other, is a constant source of agony for both them and me.

I recognise that we have to play the game. Our reputation depends on it.  But we also have to change the game and provide quicker and more affordable access to (re-usable) research results. At the risk of sounding over-dramatic, the future may depend on it.

 

Elizabeth Gadd

Job Opportunity: University of Greenwich is looking for a Research Outputs Manager!

Greenwich Research and Enterprise

Location:  Greenwich
Salary:  £38,183 to £46,924 plus £4546 London weighting
Contract Type:  Open
Closing Date:  Monday 20 March 2017
Interview Date:  To be confirmed
Reference:  1346

Greenwich Research and Enterprise (GRE) is the University’s central office responsible for developing a supportive research culture and establishing links with industry and enterprise. GRE works across four service areas: research services, business development and enterprise services, commercial and IP services, and business support services.

The university is investing in expanding its research services, recognising that high-quality support is pivotal to its research environment, and is now recruiting a Research Outputs Manager to join the GRE Research Development Services team at Greenwich.

This role will lead the development of library services as they relate to research outputs and research data management in order to meet the needs of the University’s research community, external research funders, and the requirements of the Research Excellence Framework. In particular, this will involve overseeing the ongoing development of the Institutional Repository – GALA (Greenwich Academic Literature Archive) – ensuring its effective use for Open Access requirements, and the development and implementation of a Research Data Management Policy & Framework.

Please see the Job Description & Person Specification for further details and please apply using the online application form.

 

The Bibliomagician seeks honorary Blog Manager

The Bibliomagician blog seeks to provide comment and guidance for practitioners engaging with bibliometrics.  We are now seeking an enthusiastic volunteer to take The Bibliomagician Blog on to the next level!  The post-holder would sit on the ‘light-touch’ Lis-Bibliometrics Committee and be responsible for:

  • Soliciting relevant content for the Blog
  • Reviewing contributions
  • Scheduling blog posts
  • Managing the resources pages
  • Reporting back to the Committee

This would be a great opportunity for someone seeking to make their mark on the world of bibliometrics, whilst developing their bibliometric knowledge, social media and editorial skills within a friendly and supportive environment.  For more information or to express an interest in this role, please get in touch with Elizabeth Gadd, Chair of the Lis-Bibliometrics Forum.

UK universities respond to the Metric Tide’s call to do metrics responsibly

In September 2015 I ran a short survey to see how institutions were responding to the Metric Tide report’s call to take a responsible approach to metrics and to consider signing the San Francisco Declaration on Research Assessment (DORA). Only three of the survey respondents had signed DORA (although five were considering it), but nine respondents were thinking about developing their own set of principles for the responsible use of metrics. In the year that followed there was very little movement on signing DORA. However, support for one of the key elements of DORA – a backlash against the Journal Impact Factor – has grown, and with it further condemnation for HEIs who have not signed. However, as indicated in an earlier blog post, not every institution that has failed to sign DORA has failed to do anything. With an ear to the ground via the Lis-Bibliometrics and ARMA Metrics SIG lists, I was aware of a growing movement towards the development of in-house statements of responsible metrics. So, I put together a second survey to capture some of this activity.

Measuring tape. Credit: Laineys Repertoire CC-BY

The second survey had slightly more respondents (26 versus the original 22), but the low response rate still indicates that this is a fledgling area. This was further confirmed by the fact that none of the respondents were yet at a stage where they could say they had completed their own set of published principles (although one said they’d agreed some principles internally and were not intending to publish them). However, compared to a year ago, when just nine were thinking of developing their own principles, this year seven were at a stage where they were actively developing these and a further five were thinking about it. Only two had considered developing their own principles and rejected the idea, compared to five who had considered the idea of signing DORA and rejected it last year.

Of the 13 who were developing their own set of principles (or had considered this internally), no-one said they were basing these on DORA. Instead, over half (8) said they were using the Leiden Manifesto as the basis, and two indicated they were basing their work on the principles of another university. One respondent said the Metric Tide Responsible Metrics Framework was informing their thinking. Similar to last year, the development of principles was being guided by a range of staff, most of whom were from the Research Office, Senior Management and the Library, although in different proportions.

So, the sector is stirring, and The Bibliomagician blog is starting to document all the Statements of Responsible Metrics they can find. Other moves are also afoot to help universities to do metrics responsibly and well. The Association for Research Managers and Administrators (ARMA) is offering a course to its members on the Responsible Use of Metrics in May 2017, and the Lis-Bibliometrics Forum is planning an event on this theme in September 2017. The Lis-Bibliometrics Forum are also facilitating some Elsevier-sponsored work on developing a set of bibliometric competencies for use by practitioners to ensure the staff supporting metrics in their institutions have the skills and knowledge they need. This should report in January 2017 and a workshop is planned at the UKSG conference in April. Of course the biggest news in this area is the recent launch of the Responsible Metrics Forum by a group of research funders, sector bodies and infrastructure experts.  We’re on our way! And 2017 should see an even greater set of resources available to practitioners wanting to establish responsible metrics principles and practices at their institution.

Job Opportunity: Copyright and Scholarly Communications Manager, University of Salford

We are currently advertising for a permanent full time Copyright and Scholarly Communications Manager at the University of Salford.

The University Library is looking to appoint a highly motivated individual to the position of Copyright and Scholarly Communications Manager. You will have responsibility for the management and development of our copyright service and the copyright licence for the University of Salford.

You will ensure that the University complies with the law through appropriate copyright licensing agreements and that the Library offers an excellent training, advice and referral service on copyright to members of the University. You will ensure that our research outputs are copyright compliant and that Salford researchers are managing the copyright of their own outputs to the maximum benefit of themselves, the University of Salford and the open access agenda the University is committed to. You will work collaboratively to maximise Salford’s engagement with the opportunities, challenges and changes taking place in the scholarly communications environment, including open access, research information systems and repositories, research data management, digital scholarship and digital humanities. You will develop and deliver outreach programmes to build engagement and compliance with University of Salford and external funder policies and requirements. This role provides key support towards achieving Salford’s strategic aspirations for the next Research Excellence Framework.

You must have excellent communication skills, be confident in working with academic and professional services colleagues, and able to manage your time and prioritize tasks effectively. You will be highly organized, flexible, innovative and evidence-based, with a passion for improving services and the student and staff experience at Salford. You will thrive in a dynamic academic environment, adapting to new roles and working practices as needs emerge, and contribute effectively and creatively to wider library and university planning and delivery. You will have an excellent understanding of copyright law, fair use, fair dealing, intellectual property rights and specific licensing arrangements, and a good understanding of the wider scholarly communications landscape.

Organisational Unit – The Library

Grade 7

Salary – £31,655 – £37,768

Contract type – Permanent

Hours – Full time

Closing date – 4 Sep 2016

Location – Main Salford Campus

 

For further specific details and to apply, please go to the University of Salford’s job pages.

Informal enquiries can be made to Helen McEvoy on (0161) 2952445 or email h.mcevoy@salford.ac.uk

Regards,

Helen McEvoy

 

Helen McEvoy 

Academic Support Manager (Research)  |  The Library

Room 210, Clifford Whitworth Library, University of Salford, Salford  M5 4WT

t: +44 (0) 161 2952445

h.mcevoy@salford.ac.uk | www.salford.ac.uk

Is signing DORA that responsible?

Tape measure by Marcy Leigh CC-BY-NC-SA

So, I’ve just returned from the excellent JISC/CNI conference on advances in digital scholarship where they had a panel session on metrics.  Once again, there were tutting noises from the front about the “disappointing” number of UK universities that hadn’t signed up to DORA.  Stephen Curry’s latest blog post urging universities to sign was highlighted and others even tweeted that the lack of UK university engagement with DORA was a “disgrace”.

 

Now I come from one of those “disgraceful” universities who haven’t signed.  To be strictly accurate, we’ve given DORA serious consideration and decided not to sign.  And at the root of our decision was not a feeling that DORA goes too far, but that it doesn’t go far enough.  Sure, DORA addresses a false reliance on the Journal Impact Factor, but it makes no mention of a whole host of other important principles around the responsible use of metrics as outlined in the Leiden Principles.  As I’ve written in another blog post, were the Leiden Principles available for signing, I wouldn’t hesitate.

So, instead of signing DORA we have formed a working group of senior managers and academic leads to give proper consideration to the broad range of issues around the responsible use of metrics.  We have used the measured and wide-ranging Leiden Principles as our guide.  We have done our own analyses.  We have thought deeply about the university’s strategic aims and principles, and what impact the use of bibliometrics might have on those. We have generated a list of indicators that we think could be applied most fairly, and written detailed caveats about their use.  It’s been a lot of work.  A lot more work than entering the words “Loughborough University” into a website and clicking send.  And I know from working amongst a community of other bibliometricians around the UK that Loughborough is not alone in taking this steep and thorny way in favour of the primrose path of dalliance.

Thus, I think we need to be careful not to read too much into the absence of a particular university’s name on the list of DORA signatories.  Yes, it might be that some institutions have not given the responsible use of metrics a moment’s thought, but others will have given it a lot more thought than those who have blithely signed DORA.  At the end of the day we need to ask whether signing the DORA principles will give us a better outcome than developing our own locally relevant, properly debated, carefully implemented and monitored principles.  And for us, it is undoubtedly the latter.

Having run a survey on the adoption of DORA last year, we’re now keen to see what advances the UK university community have made along the road to developing statements around the responsible use of metrics.  Whatever situation you find yourself in we’d welcome your participation in a brief 5-question survey: https://www.surveymonkey.co.uk/r/BSP6QH8

The results will be shared in a future blog post.

Thank you!

Elizabeth Gadd

 

Event report: Theory & Practice of Bibliometric Analysis by Charles Oppenheim

On Wednesday, 15th June I had the pleasure of attending a Jisc-supported Thomson Reuters one-day event at Edinburgh University Library on the theory and practice of bibliometric analysis for evaluating research and planning research policy.  The morning was spent on the theory, and there were practical hands-on sessions in the afternoon, based on exercises that were given out to us.

As the event was run by the Jisc-supported Web of Science team, it was not surprising that all the examples and exercises were based on Web of Science, Journal Citation Reports and InCites.  However, this was definitely not a sales pitch for Thomson Reuters services.  The speakers were at pains to point out that the theory and the practice applied to any value-added service – in other words, that what we learned applied equally well to Scopus and SciVal.  There was clear implied criticism, though, of the free-of-charge, non-value-added Publish or Perish + Google Scholar approach.

The presentations were somewhat rushed, and assumed a reasonable amount of prior knowledge.  I got the impression that the audience, primarily University of Edinburgh staff but including a few outsiders like me, were OK with the level of knowledge assumed, but I felt that the presenters were trying to pack too much in.  In the practical sessions, the presenters wandered around the delegates, each of whom had a terminal connected to InCites in front of them, making sure they were progressing well.  That part worked very well. There was a Q & A session at the end, but I had to leave early to catch a train, so missed that.

So what did I learn?  That the amount of value added, in terms of both manual and automated correcting of errors and inconsistencies in source articles and cited references, is impressive, and no doubt accounts for a significant part of the cost of these services.  I also learned that InCites (and, I believe, SciVal) has an impressively large range of calculations it can do and a great choice in the way the results can be presented.  Indeed, I would argue that InCites has too much to offer – the user interface can be confusing at times and is sometimes a bit inconsistent.  The best part was the morning basics.  There were health warnings a-plenty: make sure the data you collect and analyse is what you need; don’t use single measures when you can obtain several; always normalise your results against the average for that subject area, for that country, for that time period.  Don’t depend on things like the Journal Impact Factor or the h-index in isolation to evaluate people or research.  Be aware of the limitations of the h-index.  It was not just that this was sensible advice; it was all the more impressive because these are the people who you might expect would over-sell these things.  The presenters didn’t actually apologise for ISI inventing the Journal Impact Factor, but they came close to it!

We were given handouts of the slides and the exercises, but it would have been nice to have been provided with some of the PowerPoint slides in a format that could be re-used by delegates when explaining bibliometrics to colleagues.  Some reduction in the introductory slides to allow more time for discussion would have been good.  Also, a few slides shown in the handouts were not shown on the screen, and a few new slides appeared on screen that were not in the handouts.  An attendance list would also have been helpful.  But these are minor quibbles.  Overall, this was a worthwhile day and the presenters should be congratulated for their hard work.

Charles Oppenheim