
Saturday, 2 July 2011

Research evaluation - is it our business?

On Wednesday I attended an event laid on by the JIBS user group (users of bibliographic software). The title was somewhat provocative, but the aim of the day was to look at librarians' role in research evaluation. There were some interesting presentations, ranging from an evaluation of Google Scholar as a source of bibliometric data to implementing research information systems like PURE or Symplectic.

The event was introduced by Fiona Bowtell (chair of JIBS) and the day was chaired by Kara Jones. Here are my notes on some of the presentations:

Andria McGrath - King's College London - Keynote

Andria made the point that research evaluation is a "brave new world", and I guess that this is an emerging specialism for librarians. In the past research support was likely to be done by subject librarians, but it was interesting to see quite a high number of specialist Research Support Librarians at this event. Andria went on to discuss the decreased importance that REF 2014 places on bibliometric data, and that impact (in terms of the REF) is a more general concept. Demonstrating some societal change is (quite rightly in my opinion) more important than citations, but there are areas where bibliometrics will inform panels. Andria described some initiatives that should make getting at bibliometric data easier - in particular CERIF (the Common European Research Information Format), a European standard data model for CRIS (current research information systems). This is an XML-based model that goes beyond metadata, allowing deeper description of the information needed to manage research. A CERIF4REF schema has been produced for EPrints that will allow easy porting of data in repositories for REF submissions.
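To give a feel for the kind of XML-based record these systems exchange, here's a minimal sketch in Python. The element names below are invented for illustration only - they do not follow the actual CERIF schema:

```python
import xml.etree.ElementTree as ET

# Build a toy research-information record. Element names are invented
# for illustration and do NOT follow the real CERIF schema.
record = ET.Element("researchOutput")
ET.SubElement(record, "title").text = "An Example Paper"
ET.SubElement(record, "year").text = "2011"
authors = ET.SubElement(record, "authors")
author = ET.SubElement(authors, "author")
ET.SubElement(author, "name").text = "A. Researcher"
ET.SubElement(author, "affiliation").text = "Example University"

# Serialise to an XML string, as a repository might when exporting
# records for a REF submission.
xml_string = ET.tostring(record, encoding="unicode")
print(xml_string)
```

The point of a shared model like this is that the same record can describe not just publication metadata but the links between outputs, people, projects and funding.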

Andria was the first of many speakers to mention that librarians are not always involved by their institutions and that we need to push ourselves forwards. She mentioned a RIN report "Value of libraries for research and researchers" that looks useful.

So CRIS, CERIF, new ways of using bibliometric citation APIs, discovering other systems (like grants systems) and bibliometrics products and principles are all new areas for librarians. Some institutions have engaged with this, and there was discussion in the room of an institution (Surrey?) that has employed a bibliometrician on the library staff. There is some evidence that academics are starting to get interested in bibliometrics (I've certainly seen some interest around generating h-index numbers!) and that some are using advanced techniques like Essential Science Indicators or InCites to normalise results.

Andria also listed some of the things people want from this area: they do want a Research Information System, they do want a dashboard or simple interface that has all the information they need for grant bids; they want to capture data easily (pulling in bid data from databases and only entering data ONCE!); and they want integrated bibliometrics. All of these are things that the library can help with!

Jeremy Upton - St Andrews - Librarians' involvement with CRIS

Jeremy made it clear from the outset that his involvement with CRIS has been positive and that it may not match everyone's experience. One of his first slides was about how CRIS represents the convergence of a number of areas that libraries support: OA; impact; research profile; and digital communication. An important point (I think) was that although Jeremy is positive about these things, developments came from a genuine desire in the academic community (rather than evangelising: it is based on dialogue and recognition of a need). St Andrews was one of the first institutions to use the publications database as the front end for getting data into the repository (talking to people over lunch and coffee, this now seems to be what a lot of folks are working towards). I think the main point Jeremy was making, though, was that partnerships are very important. Previous involvement with the research community at St Andrews (e.g. on digital theses) meant that they came to him when they realised they needed support implementing a CRIS, and as a result of the work, library staff have been transferred to the research office to support the REF (as they realise that librarians have skills that are needed). Along these lines, academic liaison staff are seen as an important part of the process of rolling out and supporting these systems with academics - one of the hooks for this is showing academics stats on how their material has been accessed via the system - the thought of increased citations is a powerful motivator.

There followed some brief demos:

Sue Cumberpatch - York - PURE

This is not quite live at York but has been released to academics, who are currently adding publications. Academic liaison librarians are currently supporting academics asked to put material on at very short notice. PURE includes publications, media outputs like radio programmes, students supervised, etc. It's been populated from the repository and Web of Science (they bought the InCites API to support this), and has also brought together many sources (e.g. academics' records) as York didn't have a publications database before this. Librarians are also supporting inputting data from reference management software (e.g. EndNote). Sue had a slide showing the whole system and identifying the areas where librarians have contributed and supported. A very useful slide! Unfortunately there are not many databases that can populate PURE (just arXiv, WoS and PubMed) and the social sciences are not well covered by these. PURE can import RIS files, although the import filters can be a bit tricky.

Andria McGrath - InCites demo


Andria demoed Thomson Reuters' InCites product, which extends what you can do with the citation data in WoS. You can create your own data sets within InCites. King's have set one up based on author affiliation in WoS but also a more bespoke one based on documents in their CRIS. This uses the UT number of the record in the CRIS to link to the citation data. They've found that the tool acts as a real incentive for people to get their work into the CRIS (because they can then instantly access the citation data). It's useful for the REF, as you can create subject area schemes that provide data for specific UoAs, but it's also useful in other ways. You can compare your outputs to other institutions, for example, which can be useful in future bids.

Nadine Edwards - Greenwich - e-resources and repository manager - survey on librarians and research.

Nadine presented the results of a JIBS survey. There were no responses from repository managers - which indicates that this is often tacked on to other roles (e.g. subject librarian). Responses showed that librarians had been consulted in less than 50% of CRIS acquisitions.

Half of the librarians surveyed support bibliometrics. The most popular form is 1:1 support with academics. The majority of people don't have InCites, which surprised me later in the day when it emerged that InCites can be used to normalise bibliometric data. Without a product like InCites this is very difficult, and as bibliometric data varies greatly between fields and even from year to year, raw counts are meaningless without the context that normalisation provides.
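To illustrate why normalisation matters: the usual approach is to divide a paper's citation count by the average for papers of the same field and year. Here's a minimal sketch - the baseline figures are invented numbers for illustration, not real field averages:

```python
# Field-normalised citation score: a paper's citations divided by the
# world average for papers of the same field and year. Baseline values
# here are invented for illustration only.
FIELD_YEAR_BASELINE = {
    ("oncology", 2008): 12.4,
    ("history", 2008): 1.1,
}

def normalised_score(citations, field, year):
    """Score > 1 means above the world average for that field and year."""
    baseline = FIELD_YEAR_BASELINE[(field, year)]
    return citations / baseline

# The same raw count of 10 citations is unremarkable in a high-citing
# field but exceptional in a low-citing one.
print(normalised_score(10, "oncology", 2008))
print(normalised_score(10, "history", 2008))
```

This is why a raw citation count, or an h-index quoted without its field, tells you very little on its own.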

Kate Bradbury - Cardiff - Research support

Kate listed some drivers for research evaluation. External: REF 2014; impact/value for money; performance/league tables; grant applications and awards. Internal: promotion/pay progression; evaluating initiatives; departmental performance; funding allocations. (I'd also add recruitment to this list.) Helping to find evidence of impact might involve searches for researchers; verifying outputs for the REF; liaison; training; and advocacy. As a skillset, it helps to have knowledge of the research landscape (e.g. keeping on top of the many HEFCE REF reports!). Kate made the point that it is hard to provide normalised bibliometrics without a product like InCites.

Afternoon session

Jenny Delasale - Warwick - Supporting bibliometrics


Jenny explained her organisational context and made an interesting point about how their support for the publications database had been communicated to academics i.e. that they were introducing a publications service to support academics with the university's publications database. Very clever marketing.

Jenny made the point that there are lots of routes to researchers and recommended the Web of Knowledge bibliometrics sessions through Mimas to develop skills in this area. There was a useful slide on how librarians can support the publication process. The bit I found really valuable about this presentation was Jenny's explanation of the h-index and in particular the assertion that context is everything! Basically, bibliometrics depend so much on context that you often can't provide a yes or no answer to academics' questions. The example given: is it better to have one paper cited 100 times or ten papers cited ten times each? It all depends on context.
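To make Jenny's example concrete, here's a minimal sketch of the standard h-index calculation (the largest h such that the author has h papers each cited at least h times):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(cites, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Jenny's example: one paper cited 100 times vs ten papers cited 10 times.
print(h_index([100]))      # h = 1
print(h_index([10] * 10))  # h = 10
```

Note how the two cases in the example come out at h = 1 and h = 10 respectively, despite representing the same total of 100 citations - which is exactly the "context is everything" point.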

Isobel Stark and Michael Whitton - Southampton - Google Scholar: Can it be used for bibliometrics?


This was an interesting talk. First was a discussion of the pros and cons of using Google Scholar for bibliometrics:

Pros:
Easy to use and free. Wide range of material (e.g. book chapters and conference proceedings), so it's especially useful for law, the humanities and the social sciences. Metrics tend to be higher because more material is indexed, but also because there is some duplication.
Cons:
Data can be poor quality and it is not transparent where it comes from; there's a lack of de-duplication; there are big gaps in subject coverage; the indexing (especially subject indexing) is naff, which makes it difficult to narrow to a particular author's work; and citation matching can be flaky (it relies on algorithms).

Isobel listed some services that work well with Scholar: Quadsearch; the Scholar H-Index Calculator (a Firefox add-on); Scholarometer; and Publish or Perish.

There were also some useful references to the literature on the subject:
Bar-Ilan (2008) - Israeli highly cited researchers - differences between WoS and Scopus.
Jacsó (2008) - problems with the data, issues around transparency.
Franceschet (2009) - computer science metrics are higher because computer scientists publish in conference proceedings.
Lee (2009) - neurosurgery - Google Scholar and Scopus very close.
Mingers (2010) - business and management - Google metrics higher.

Isobel made the point that you need to know the work well as Scholar doesn't deduplicate and doesn't have institutional affiliation.
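As a rough illustration of why the lack of de-duplication matters, here's a naive sketch that collapses records whose titles match after normalisation. Real duplicates in Scholar are messier than this (punctuation variants, truncation, bad metadata), so this is only a toy:

```python
import re

def normalise_title(title):
    """Lower-case and strip punctuation/whitespace for crude matching."""
    return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

def dedupe(records):
    """Collapse (title, citations) records that share a normalised title,
    keeping the highest citation count (summing would over-count)."""
    seen = {}
    for title, cites in records:
        key = normalise_title(title)
        if key not in seen or cites > seen[key][1]:
            seen[key] = (title, cites)
    return list(seen.values())

records = [
    ("The h-index: a review", 40),
    ("The H-Index: A Review.", 38),  # duplicate with variant punctuation
    ("Citation analysis in practice", 12),
]
print(dedupe(records))  # the duplicate pair collapses to one record
```

Without a step like this (and without affiliation data to tell authors apart), any metric computed straight from Scholar results will be inflated.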

Southampton bibliometrics guides are open access www.soton.ac.uk/library/research/bibliometrics - interestingly the Google Scholar guide has been accessed far more than others!

The discussion on whether to use Google Scholar was useful. I thought it was an interesting point that if someone asks for an h-index without specifying where it should come from, they probably don't understand what they are asking for - is it then fair game to give them the highest figure?

As an interesting aside, the Thomson Reuters rep in the room disclosed that they are releasing a book chapter citation database in November.

All in all an interesting day and I learned a lot - but my brain was a bit frazzled by the end :-)

Thursday, 24 June 2010

Some things I learnt today

ResearchGate looks like an interesting source. I stumbled across it after (I'm ashamed to say) doing a Google search for a thesis from the University of Utah. The result showed as a page in ResearchGate's database, which includes info from PubMed, arXiv, PubMed Central, IEEE, RePEc, CiteSeer, the NASA library and DOAJ. In this case it linked out to the full text in the University of Utah's institutional repository - which was nice. Aside from the database, it also allows researchers to network using social media features. They've got a tour here and registration is free.

Resourcing reading lists takes quite a lot of time and money. You probably won't get much thanks for doing it but will probably get quite a lot of complaints if it isn't done. Ho hum.

Sometimes the day just gets away with you. I've been meaning to have a look at the latest edition of Journal Citation Reports for ages but still didn't get around to it today.

This is a Click Beetle (I think). It's quite freaky when one flies down and hits you in the face!

Hootsuite just keeps on getting better, with Hootsuite 5 being released. Now with GeoSearch, drag-and-drop upload and enhanced Facebook features, among other things. Oh yeah - apparently Dolly Parton uses it too.

Monday, 15 February 2010

Sources for bibliometrics in the REF

Web of Knowledge cake

This is the second of a series of posts summarising what I'm reading about the use of bibliometrics in the Research Excellence Framework. The first post looked at how influential bibliometrics are likely to be in informing the REF in the health fields.

So, working on the assumption that citation analysis is going to at least influence expert review in the health field, what are the best sources to use to find journals to publish in? As part of the REF, a lot of research has been carried out to assess the pros and cons of the various citation databases on the market.


This document, Appraisal of Citation Data Sources: A report to HEFCE by the Centre for Science and Technology Studies, concludes that Scopus is a serious alternative to WoS in the science fields. There are gaps, though (for example Scopus does not index journals cover-to-cover like WoS, so there are some articles and reviews that might be indexed in WoS but are not available in Scopus). Swings and roundabouts, though, as Scopus is judged to be more accurate in some areas. One example is that it preserves author affiliations for all co-authors, whereas WoS only lists the institution of the first author.


There is a subject element to this as well:


"The comparison of WoS and Scopus coverage of the ‘best’ publications submitted to the 2001 RAE showed that Scopus coverage is especially better in the Subject Group Subjects allied to Health, as well as to a lesser extent also in Engineering & Computer Science and Health Sciences." (Moed and Visser 2008)

So Scopus can't be ignored - especially in the subject areas mentioned above. But I think maybe it's best not to get too hung up on the bibliometrics side of things. The latest REF guide from HEFCE clearly states that citation analysis will be used only in those subject areas for which citation data is considered to be a strong indicator of outputs, and that other methods will also be used (HEFCE 2009). It's highly likely that other indicators will be taken into account in the health sciences fields.

References:

HEFCE, 2009. The Research Excellence Framework: A brief guide to the proposals. Bristol: HEFCE. Available from: http://www.hefce.ac.uk/research/ref/resources/REFguide.pdf [Accessed 15 February 2010].

Moed, H.F. and Visser, M.S., 2008. Appraisal of Citation Data Sources: A report to HEFCE by the Centre for Science and Technology Studies, Leiden University. Bristol: HEFCE. Available from: http://www.hefce.ac.uk/pubs/rdreports/2008/rd17_08/rd17_08.doc [Accessed 15 February 2010].

Friday, 12 February 2010

Publishing for the REF?

Where Periodicals Go to Die



I'm increasingly being asked to support our staff and PhD students in the field of publishing. I guess from two angles:

1) What sources can we use to determine good publications to target - especially in the context of the REF?
2) Are there any good sources of guidance on writing for publication?

For the first, I'm going to have to develop my knowledge of the REF a bit. It's all very well being aware of the sources that folks can use (e.g. JCR, Web of Knowledge, Scopus) but these things need to be taken in the context of what the REF is going to be assessing. Are measures such as impact factors the 'be all and end all', or will the REF take other factors into consideration?

To help me with this I'm taking a look at this document:

HEFCE, 2009. Report on the pilot exercise to develop bibliometric indicators for the Research Excellence Framework. http://www.hefce.ac.uk/pubs/hefce/2009/09_39/#exec [Accessed 12 February 2010].

It outlines the pilot study to 'develop the bibliometrics element of the Research Excellence Framework'. I think it's worth emphasising that bibliometrics is just one element of the REF and other factors may need to be taken into account when choosing where to publish. One of the key points of the study is the conclusion that bibliometrics are not robust enough to replace expert review... but citation analysis may inform the review. Another finding is that the effectiveness of bibliometrics varies depending on your subject area (i.e. bibliometrics are going to be more effective in those subject areas that rely most heavily on journals for dissemination).

So how do you know how important bibliometrics are going to be in informing expert review in your subject area? Annex H of the document above is a study on the availability of citation data in different subject areas and offers some analysis of how important citations might be. For my area (Health and Social Care) it's mixed news!

‘Health sciences’ is a mixed area. Many staff are professionally engaged and used practitioner journals not well covered by commercial data. Others are in areas that provide excellent bibliometrics. This is therefore an area in which bibliometrics may be seen by some commentators to provide useful information but which in practice cannot be assumed to provide sufficient information without support from peer review. (HEFCE 2009: 154)

This suggests to me that expert reviews in health sciences will not rely solely on citations (as they recognise that many practice based researchers will be disseminating research in professional journals). I think the holy grail as far as choosing where to publish goes, will be to find journals that do well in JCR and on Scopus but are also well read by practitioners.

References

HEFCE, 2009. Report on the pilot exercise to develop bibliometric indicators for the Research Excellence Framework. Bristol: HEFCE. Available from: http://www.hefce.ac.uk/pubs/hefce/2009/09_39/#exec [Accessed 12 February 2010].