On Wednesday, 15th June I had the pleasure of attending a Jisc-supported Thomson Reuters one-day event at Edinburgh University Library on the theory and practice of bibliometric analysis for evaluating research and planning research policy. The morning was spent on the theory, and the afternoon was given over to practical hands-on sessions based on exercises handed out to us.
As the event was run by the Jisc-supported Web of Science team, it was not surprising that all the examples and exercises were based on Web of Science, Journal Citation Reports and InCites. However, this was definitely not a sales pitch for Thomson Reuters services. The speakers were at pains to point out that the theory and the practice applied to any value-added service – in other words, that what we learned applied equally well to Scopus and SciVal. There was clear implied criticism, though, of the free-of-charge, non-value-added Publish or Perish + Google Scholar approach.
The presentations were somewhat rushed and assumed a reasonable amount of prior knowledge. I got the impression that the audience – primarily University of Edinburgh staff, but including a few outsiders like me – was comfortable with the level assumed, but I felt the presenters were trying to pack too much in. In the practical sessions, the presenters circulated among the delegates, each of whom had a terminal connected to InCites in front of them, making sure everyone was progressing well. That part worked very well. There was a Q & A session at the end, but I had to leave early to catch a train, so missed it.
So what did I learn? That the amount of value added – in terms of both manual and automated correction of errors and inconsistencies in source articles and cited references – is impressive, and no doubt accounts for a significant part of the cost of these services. I also learned that InCites (and, I believe, SciVal) offers an impressively large range of calculations and a great choice in how the results can be presented. Indeed, I would argue that InCites offers too much – the user interface can be confusing at times and is sometimes a bit inconsistent. The best part was the morning session on the basics. There were health warnings aplenty: make sure the data you collect and analyse is what you need; don’t rely on a single measure when you can obtain several; always normalise your results against the average for that subject area, that country and that time period. Don’t depend on measures like the Journal Impact Factor or the h index in isolation to evaluate people or research, and be aware of the limitations of the h index. It was not just that this was sensible advice; it was all the more impressive coming from the very people you might expect to over-sell these things. The presenters didn’t actually apologise for ISI inventing the Journal Impact Factor, but they came close to it!
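To make the h index concrete for readers who haven’t met it: an author’s h index is the largest h such that they have at least h papers with at least h citations each. Here is a minimal sketch in Python (my own illustration, not anything presented at the event) – and it also illustrates one of the limitations flagged: the measure ignores the raw citation counts of the top papers, so a highly cited outlier barely moves it.

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h

# Five papers with these citation counts give h = 4:
# the top 4 papers each have at least 4 citations.
print(h_index([10, 8, 5, 4, 3]))   # -> 4

# Note the insensitivity to outliers: boosting the top paper
# from 10 to 1000 citations leaves the h index unchanged.
print(h_index([1000, 8, 5, 4, 3]))  # -> 4
```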
We were given handouts of the slides and the exercises, but it would have been nice to have been provided with some of the PowerPoint slides in a format delegates could re-use when explaining bibliometrics to colleagues. Trimming the introductory slides to allow more time for discussion would also have been good. And a few slides in the handouts were not shown on screen, while a few new slides appeared on screen that were not in the handouts. An attendance list would also have been helpful. But these are minor quibbles. Overall, this was a worthwhile day and the presenters should be congratulated for their hard work.