Wednesday, 28 March 2018

Calling Australia!

Members of the Information Resources team recently hosted an online course for librarians based in Australia. Led by Anthea Sutton, the FOLIO programme has been delivering web-based CPD courses to library and information professionals for over a decade.

Recently, FOLIOz (see what we did there?) has been partnering with ALIA, the Australian Library and Information Association, to offer bespoke training catering to the needs identified by its members.

For the latest course, on Evidence-Based Library and Information Practice (EBLIP for short), Anthea was joined by a small team including Andrew Booth, Helen Buckley Woods and Mark Clowes to design and deliver the course content (which included video lectures, readings and assessed coursework), as well as to facilitate the group discussion boards and host two live webinars (a particular challenge given the time difference between us in the UK and our participants "down under"). We were also delighted to welcome Professor Alison Brettle (from Salford University) to deliver a guest lecture on the future of EBLIP.

The course attracted participants from a range of sectors, including education and public libraries as well as from health - all keen to apply an evidence-based approach to solving problems and achieving best practice in the settings of their different services.

As one delegate commented: "This course is right on point as far as the skills I need to develop so our unit can reach its goals."


If you are interested in discussing how FOLIO could help with the training needs of your library/information team, please get in touch with us at folio@sheffield.ac.uk

Wednesday, 21 March 2018

What Can Altmetric.com Tell Us About Policy Citations of Research? An Analysis of Altmetric.com Data for Research Articles from the University of Sheffield

Andy Tattersall (ScHARR Information Resources) and Dr Chris Carroll (ScHARR Health Economics and Decision Science) have published a new paper in Frontiers in Research Metrics and Analytics. The paper examined published University of Sheffield research and what Altmetric.com data reveal about the impact of that research on national and international policy. The percentage of outputs with at least one policy mention compares favourably with previous studies, while huge variations were found between the time of publication and the time of the first policy citation. However, some problems with the quality of the data were identified, highlighting the need for careful scrutiny and corroboration.
 
Altmetrics offers all kinds of insights into how a piece of research has been communicated and cited. In 2014 Altmetric.com added policy document tracking to its sources of attention, offering another valuable insight into how research outputs are used post-publication. At the University of Sheffield we thought it would be useful to explore the Altmetric.com data for policy document citations to see what impact our work is having on national and international policy.

We analysed all published research from authors at the University of Sheffield indexed in the Altmetric.com database: a total of 96,550 research outputs, of which we were able to identify 1,463 pieces of published research cited between one and 13 times in policy. This represented 0.65% of our research outputs. Of these 1,463 artefacts, 21 were cited in five or more policy documents, with the vast majority – 1,185 outputs – having been cited just once. Our sample compared very well with previous studies by Haunschild and Bornmann, who looked at papers indexed in Web of Science and found 0.5% were cited in policy, and Bornmann, Haunschild and Marx, who found 1.2% of climate change research publications with at least one policy mention. From our sample we found 92 research articles cited in three or more policy documents. Of those 92, we found that medicine, dentistry, and health had the greatest policy impact, followed by social science and pure science.
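
For readers wanting to produce a similar summary for their own institution, here is a minimal sketch of how the headline figures (share of outputs with at least one policy mention, and the distribution of citation counts) might be computed. It assumes a hypothetical CSV export with one row per research output and a policy citation count per row; the file name and column names are illustrative only, not the schema of any real Altmetric.com export.

```python
# Minimal sketch: summarising policy citations from a hypothetical CSV export.
# The file name and column names ("output_id", "policy_citations") are
# assumptions for illustration, not a real Altmetric.com export format.
import pandas as pd

df = pd.read_csv("institutional_outputs.csv")  # hypothetical export

total_outputs = len(df)
cited_in_policy = df[df["policy_citations"] >= 1]

print(f"Outputs with at least one policy mention: {len(cited_in_policy)} "
      f"({len(cited_in_policy) / total_outputs:.2%} of {total_outputs})")

# Distribution of citation counts across the cited outputs.
print("Cited exactly once:       ", (df["policy_citations"] == 1).sum())
print("Cited three or more times:", (df["policy_citations"] >= 3).sum())
print("Cited five or more times: ", (df["policy_citations"] >= 5).sum())
```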

We also wanted to explore whether research published by the University of Sheffield had a limited time span between publication and policy citation. We looked at the time lag and found it ranged from just three months to 31 years. This highlighted a long tail of publications influencing policy, something we would have struggled to identify prior to Altmetric.com without manual trawling. The earliest piece of research from our sample to be cited in policy was published in 1979 and did not receive its first policy citation until 2010. We manually checked the records because many publications dated before 1979 turned out to have actually been published much later, often this century. This is likely due to misreported data in the institutional dataset giving a false date, highlighting the need to manually check such records for authenticity. The shortest time between research publication and policy citation was a mere three months: a paper published in November 2016 was first cited in National Institute for Health and Care Excellence (NICE) policy in January 2017.
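
The time-lag calculation itself is straightforward once the dates are in hand. The sketch below assumes the same hypothetical export also records a publication date and the date of the first policy citation; again, the column names are illustrative assumptions, and implausible values are flagged for the kind of manual check described above.

```python
# Minimal sketch: time lag between publication and first policy citation.
# Date column names are assumptions for illustration.
import pandas as pd

df = pd.read_csv("institutional_outputs.csv",
                 parse_dates=["publication_date", "first_policy_citation_date"])

cited = df.dropna(subset=["first_policy_citation_date"])
lag = cited["first_policy_citation_date"] - cited["publication_date"]

print("Shortest lag:", lag.min())
print("Longest lag: ", lag.max())
print("Median lag:  ", lag.median())

# A negative lag (citation dated before publication) suggests a misreported
# date in the institutional dataset and warrants a manual check.
print("Records needing manual review:", (lag < pd.Timedelta(0)).sum())
```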

The Altmetric.com reports are only as good as the data they analyse, and our research did uncover some errors. Looking at those 21 papers with five or more policy document citations, we found seven were not fit for inclusion. One such example was identified when we discovered research papers had been attributed to the University of Sheffield when the authors were not, in fact, affiliated to the university. As this data is sourced from our research publications system, we assume this was a mistake made by the author; this can happen when an author incorrectly accepts, as their own, papers suggested to them by the system. While this was almost certainly a genuine error, and may have been rectified later, the system had not yet updated to take account of such corrections. Another of these papers was mistakenly attributed to an author who had no direct involvement in the paper but who was part of a related wider research project. Another of the publications was excluded because it had not, in fact, been cited in the relevant policy document. One of the papers that was included belonged to an author who was not at Sheffield at the time of publication but has since joined the institution. This showed that Altmetric.com's regular updates were able to discover updated institutional information and realign authors with their current employer.

The two most cited papers came from our own department, the School of Health and Related Research (ScHARR), in the field of health economics. Only two of the 14 most cited publications were in a field other than health economics or pure economics; both were in environmental studies. In total, the 14 most cited research outputs were cited by 175 policy documents, but we identified 9% (16) of these as duplicates. Of those 175 citations, we found that 61% (107) were national, i.e. from the UK, and 39% (68) were international, i.e. from countries other than the UK or from international bodies such as the United Nations or World Health Organization.

Altmetric.com continues to add further policy sources to its database to trawl for citations. As a result, our sample of 1,463 research outputs should grow, not only as fresh policy citations accrue but also as older citations are identified through newly tracked policy sources. This work also highlights the importance of research outputs having unique identifiers so they can be tracked through altmetric platforms; it is certain that more of our research will be cited in policy, but if no unique identifier is attached, especially to older outputs, it is unlikely the Altmetric.com system will pick it up.
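
Where a DOI is available, an individual output can be checked against Altmetric.com's public v1 API to see whether a policy citation count has been picked up. The following is only a sketch: the endpoint is documented by Altmetric, but the "cited_by_policies_count" field name is our assumption about the response and should be verified against the current API documentation.

```python
# Minimal sketch: checking a single output's Altmetric record by DOI.
# The response field "cited_by_policies_count" is an assumption; consult the
# Altmetric API documentation for the authoritative schema.
import requests

def policy_citation_count(doi):
    """Return the policy citation count for a DOI, or None if not tracked."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}")
    if resp.status_code == 404:  # output not found in Altmetric.com
        return None
    resp.raise_for_status()
    return resp.json().get("cited_by_policies_count", 0)

# Example: the paper this post is based on.
print(policy_citation_count("10.3389/frma.2017.00009"))
```
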
Altmetric.com is a very useful indicator of interest in and influence of research within global policy. Yet there are clearly problems with the quality of the underlying data and how it is attributed within Altmetric.com. We found that one third of our sample of the 21 most cited research outputs had been erroneously attributed to an institution or author. Whether this is representative of the whole dataset, only further studies will tell. It is therefore essential that any future explorations of research outputs and policy document citations are double-checked and not taken at face value.

This blog post is based on the authors’ article, “What Can Altmetric.com Tell Us About Policy Citations of Research? An Analysis of Altmetric.com Data for Research Articles from the University of Sheffield”, published in Frontiers in Research Metrics and Analytics (DOI: 10.3389/frma.2017.00009).

The blog post was originally written for the LSE Impact Blog, where the original article appears, and is licensed under a Creative Commons Attribution 3.0 Unported License unless otherwise stated.
 