Wednesday, 22 August 2018

Libraries and free technology – Bargains to be found if you look around and avoid the pitfalls

Image of Andy Tattersall
Andy Tattersall
This post was originally written by Andy Tattersall ahead of his and fellow MmIT committee member Christina Harbour’s participation in the next #uklibchat

There is a line that you can never have too much of a good thing, and these days there are plenty of good things that librarians and information professionals can employ in their working environment. The great thing is that since we emerged from the world of Web 1.0 into Web 2.0, a lot of these newer tools are free and actually quite useful. The flipside is that a lot aren’t that good or just can’t be applied in a library setting; regardless of how hard you try to knock a square peg into a round hole, it won’t go (unless the square peg is smaller, of course).

Libraries are no different from any other kind of organisation: they have to use formally licensed software for the day-to-day running of their service. This does not always mean the leanest or most dynamic of packages serving your library, but it does mean you will get a good level of service support, and that is essential. The smaller, more niche tools have a part to play in this technology ecosystem - just like the microbes and bugs on Planet Earth - if we removed them the whole system would collapse. The larger technology companies often need the smaller companies to keep the environment from becoming stale and predictable. They can also eat them up from time to time, just like our bugs and other real-world creatures. Take for example how Mendeley - at the time an independent company - changed reference management dramatically for the better. The smaller technology companies are less likely to get bogged down by the bloated platforms run by large companies, which focus first and foremost on delivering a stable product for their users. Like I say, the stability of large platforms is essential, but the flexibility and dynamic nature of smaller technologies is often where the real action is.
The last ten years have seen a tremendous growth in new technologies that can be applied in a library setting. The financial cost of these tools, such as Canva, Twitter, Adobe Spark and Eventbrite, can be nothing at all. Yet with freedom can come a cost, as problems start to float to the surface, although not all of these problems are that worrisome. The old adage ‘If you are not paying for the product, you are the product’ certainly rings true with how some technologies will give you a free ride if you give them your data in return. There are also issues around what you do when you become hooked into a useful platform but want more from the premium add-ons, and the person holding the purse strings says no. How do you know whether the tool you are using will be here tomorrow - remember PageFlakes, Storify, Readability, Google Reader and Silk, anyone?

Another question for the typical library or information professional is which tools are best, how they can be applied, and which will work on their system - take for example a librarian in an NHS setting. The final and most crucial issue is the investment of time needed to master new tools, which can be problematic depending on the learning curve; but if you know how to use Microsoft Word you’ll probably master most lightweight tools in very little time. The sheer number of tools that can be used in the library sector is overwhelming, regardless of whether you are a public, NHS, business or academic librarian. One tool may solve a host of problems for one librarian but be as useful as a chocolate teapot for another. It is all about application, and one of the greatest things to see in technology uptake in the library is how one person can use a tool and then another can take that same tool and apply it in a totally unexpected way just as successfully. This is the wonderful thing about these technologies, whether it is Mentimeter for polling, Pocket for curating or Piktochart for posters: how you use it may be totally different from how someone else does.

Monday, 20 August 2018

Academic Writing: Less Pain, More Joy


As I am coming to the end of a part-time EdD, I am feeling the pain of bringing all my chapters up to standard, and creating a whole thesis, rather than a book of stand-alone pieces. The analogy of assembling a wheelbarrow rings true: first make sure all the parts are there, then tighten it up (Wolcott, 1990). Through this final bit, I’m going to have to draw on everything I’ve learnt about writing in order to get to the finish line. I thought I’d share some of my learning with you, in case you find it useful in your situation.

It seems to me that when sitting down to write there are three usual scenarios:

  1. You can’t settle, it’s boring, you don’t want to do any work because it’s the weekend and you’re tired, and everyone else in the universe is having fun. ("Maungy", as we say in Yorkshire)
  2. You have tried numerous ways of structuring something and ways to generate ideas, but you are getting nowhere. (Thwarted)
  3. Engagement, flow of ideas, productivity, happiness, the sun dappling on the wall as your half-drunk tea starts to go cold. (Joy)

Scenario one, where you are distracted, needs a strict approach, which will lead to Scenario three (Joy). How restrictive you need to be depends upon how bad your state is.  If it is acute, then it might be a scenario this bad:

You know that in order to start you need to find a reference and extract the ideas from it to get going, but you have been faffing about in guilty misery for ages. So, face it square on, bribing yourself that if you find the article, you can momentarily think about something else. Once you have found the article, and get into the ideas it should get better. Continue with tiny, tiny tasks and commensurate rewards.

Other Practical Ways to a Resolution

My colleague Andy Tattersall reminded me about using a table for writing my discussion chapter. It proved a life-line in finding a way to speed up the writing process, by splitting the job of writing the discussion into two processes (once I’d got over Scenario two). Other ways to tackle Scenario one are: to use the pomodoro technique, a writing retreat, or split the day/session into quadrants and allocate a task for each one, factoring in mini and larger breaks.

Scenario two, where you are working but getting nowhere - in my experience this can go on for ages (days or weeks) and is equally awful, but in a different way to Scenario one. There are lots of solutions to this one as well, and they come in two types of not-mind-blowing strategy.

Get structural examples

Look at other theses and see how the authors have structured the chapter you are working on. This can reveal: what sections are most commonly included, how long they give to each section, the quality of the writing, a laugh, lots of things. Compare a few and a picture starts to build. Another one is to read a few blogs / sections of methods textbooks / talk to other people writing the same kind of thing / speak to your supervisor. In all these approaches, ask specific questions, and seek specific information. Then leave it all to percolate, which brings me to the second part of how to respond to scenario two:

Walk away, leave it alone and go and do some exercise, go to church, play the saxophone or whatever floats your boat. It’s well known that if forced, tacit ideas retreat, but leaving them alone and re-acquainting yourself with your family and friends or getting absorbed in a film might just do the trick. Next time you write, Scenario two might have shifted, revealing a structure that is working; and once you have that, you know it, increasing your confidence and leading you onwards to Scenario three: writing joy and a less irascible you (for now).

Of course the challenge is to diagnose when it really is Scenario two (thwarted), and not just taking off because you are hurting with the pain of scenario one (Ms Maungy).

Picture of a cat screaming
Image by Mingo Hagan “Scream”
Attribution 2.0 Generic (CC BY 2.0)

Happy writing, and if all else fails, try writing about why you can’t write.

Monday, 30 July 2018

In the era of Brexit and fake news, scientists need to embrace social media

Andy Tattersall
Social media can be an intimidating place for academics as not all of them take to it like ducks to water. For many newcomers, a more appropriate analogy is a newborn giraffe - clumsy, awkward and vulnerable to predators.

After all, researchers are employed to win bids, publish research and get cited. Since most of this happens behind closed doors or within circles exclusive to the academic community, the open forum of social media can seem like a distraction from the real work.
However, for those willing to make the leap, research suggests that once academics surpass 1,000 followers on Twitter there is an appreciable increase in the diversity of the audiences they reach with their work.

Communicating with people outside of academia means reaching those who might directly benefit from the research. These individuals and groups can then help shape future research aims and give useful feedback to scientists. Still, many academics remain reluctant. There is no clear evidence that social media generates research impact that is beneficial to society, culture and the economy - or at least it is very hard to measure.
Some academics have even lost tenure as a result of their behaviour on Twitter, while others have tried to disguise their limited expertise by building a reputation for authority online. With mounting pressure on the time of academics, social media can seem like it isn’t worth the effort.

Despite this, research shows that there is growing curiosity among scholars to use social media in their work, but to sustain this interest there needs to be clearer evidence of the benefits. In the age of Brexit and fake news, social media is more important to academia than ever before.
Scientists: your social media platforms need you!

A virtual bridge with the EU after Brexit

Brexit has sown uncertainty in British universities among staff who are from the EU. In other sectors, such as the NHS, anxiety over the result has caused a fall in the number of trained nurses coming from the EU to work in the UK. British academics projected across social media could provide reassurance and support to international colleagues who have increasingly felt they are facing an uncertain future in the UK. Without more academics joining Twitter and other platforms, social media will continue to carry the voices of those who shout loudest. As a result, some of the biggest mouths deliver unwelcome messages to European colleagues who have built careers, homes and families in the UK.

No one truly knows what will happen in March 2019 when the UK formally leaves the EU, or if that will even happen. Social media presents a way of staying in touch with academics from across the Channel in any case, and allows people to stay abreast of new research, ideas and opportunities with European counterparts.

Academics will continue to communicate and collaborate on research after March 2019, but potentially not in the way that they have in the past. We do not know how Brexit will affect travel between the UK and the EU, but blogging and social media could promote openness in research that will bridge the divide left by tightening freedom of movement.

The fight against fake news

As crude a term as it is, fake news is a threat to the principles of rigorous investigation that academia embodies. In the United States, the suppression of experts and their data by the Trump administration highlights the risks of scientists remaining silent and not using social media channels to challenge misinformation.

In this “post-truth” world, we have often heard that people no longer wish to hear from experts. This shift was captured, again, by the Trump administration and their failure to appoint a scientific adviser to the White House.

Protesters challenge the suppression of climate change research by President Trump.

Of course, experts do get things wrong on occasion, but most people surely would rather a qualified pilot flew their plane than an amateur with opinions on aviation. Academics communicating their findings and ideas on social media platforms can attempt to redress the balance that has shifted towards ill-evidenced news on these sites.
Improving working relationships with journalists can also ensure that stories shared online have links to open access versions of the research, so that science news is more easily checked for accuracy and properly credited to the original scientists.
The current moment and media climate may appear unfriendly to academia, but that is all the more reason for researchers to seize the initiative and reset the debate on their terms.

Andy Tattersall, Information Specialist, University of Sheffield
This article was originally published on The Conversation. Read the original article.

Wednesday, 18 July 2018

Nothing lasts forever: questions to ask yourself when choosing a new tool or technology for research

Andy Tattersall
Academia has become increasingly reliant on third-party tools and technologies to carry out many of the processes throughout the research lifecycle. But there are genuine concerns about the sustainability of some of these tools and what the implications would be for users in the event they were discontinued. Andy Tattersall suggests a series of straightforward questions researchers should ask themselves before choosing a new technology for use in their research. Can you export your content? Is there an alternative? After all, there is no guarantee your favourite tool will still be around tomorrow.
Academia has not always been good at adopting new technologies to aid research and teaching. Even a tool as seemingly popular and simple to use as Twitter has been received with some anxiety and trepidation within the scholarly community. There are various reasons for the slow uptake of new technologies, something not exclusive to the academic community, as captured in Everett Rogers’ Diffusion of Innovations. Technology continually changes, and the pressure of keeping up with it can actually cause inertia, leading some to bury their heads in the sand rather than engage with the changing environment. There are genuine concerns about the sustainability of tools we rely on in the academic community, with no guarantee that popular tools like Google Scholar or Twitter will be with us this time next year.

Adopting technologies that eventually cease business

There are several examples of really useful tools that have been accepted by the academic community only to pull down the virtual shutters for good. It can be quite depressing to have invested time and energy in mastering a tool only for it to disappear offline. This may happen for a variety of reasons, such as a lack of investment (financial or development), slow uptake, or the founding individual moving on to a new venture. Those in academia want solid, factual reasons to utilise a new tool; if the one they currently use works fine, why switch to another they haven’t heard of? It can be like the problem of buying a new laptop: why purchase one now when you could buy one with double the processing power for the same price a year later? Sadly, that attitude means you end up not moving on at all. Academia is about finding answers to problems and learning from previous mistakes – surely the same should apply to the very tools we use to achieve better outcomes?
There are several issues around adopting technologies to carry out, communicate, and analyse research, issues further complicated by the duplication of platforms or providers’ expansions into new areas of business. Take Mendeley, for example, which started as a social network and reference management tool but has since expanded into a data-hosting and a funding-search service.

The sad demise of useful platforms

Google Reader, PageFlakes, Readability, Silk and Storify have all ceased business in recent years despite demand for their services. In some cases this can be problematic for users as they have invested great amounts of time in curating their own content, particularly so in the case of personalised dashboard PageFlakes or data visualisation site Silk. Thankfully, for most of the aforementioned tools there were suitable alternatives and useful sites like alternativeTo, which directs users to similar options. In some cases the provider itself even pointed towards an alternative, such as Readability which used its front page to direct users to Mercury Reader. Others such as Storify proved more problematic, with no immediate like-for-like tool obviously available and Wakelet seeming the best alternative.

Choosing the right tool for the job

For anyone working with academics to adopt new tools, or for those more proactive academics wishing to explore new ways of working, there are several questions you should ask before adopting a new technology. For the most part these are straightforward and it is important to remember you may only use some technologies once.
  • Is it intuitive to use?
  • Is there an alternative?
  • Can you export your content?
  • What are they doing with your data?
  • How often will you use the technology?
  • Do you know anyone using this tool already?
  • Has the technology been around for long?
  • Who created the technology and who owns it?
  • Are the developers on social media and how often do they post new updates?
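If you wanted to work through these questions systematically, they could even be turned into a rough scoring aid. Below is a minimal, hypothetical Python sketch - the subset of questions, the weights, and the threshold are purely illustrative, not a formal method:

```python
# A hypothetical scoring aid for a tool-adoption checklist.
# Weights are illustrative; "Can you export your content?" is weighted
# highest here because it guards against lock-in if the tool disappears.

CHECKLIST = {
    "Is it intuitive to use?": 2,
    "Is there an alternative?": 1,
    "Can you export your content?": 3,
    "Do you know anyone using this tool already?": 1,
    "Has the technology been around for long?": 2,
}

def adoption_score(answers: dict) -> float:
    """Return the fraction of the weighted checklist answered 'yes'."""
    total = sum(CHECKLIST.values())
    earned = sum(w for q, w in CHECKLIST.items() if answers.get(q))
    return earned / total

# Example assessment of a fictional tool
answers = {
    "Is it intuitive to use?": True,
    "Can you export your content?": True,
    "Has the technology been around for long?": False,
}
print(f"Adoption score: {adoption_score(answers):.0%}")
```

A score is no substitute for judgement, but comparing two candidate tools on the same rubric can make the trade-offs easier to discuss with colleagues.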

Nothing lasts forever

Academia is becoming increasingly reliant on technology, especially third-party tools, to carry out certain research processes. This has long been the case, with tools such as Dropbox or YouTube offering more functionality than in-house institutional platforms. With more tools comes greater diversity and potentially more problems. There is no guarantee we won’t see another crash like that of 2000, and this time academia would also feel its wrath. Many platforms, especially niche academic ones, are run by just a handful of staff or even students. They may have investors expecting a return on their capital, families with mouths to feed, or office bills to pay.
Another strand to this debate is the thorny subject of open-source versus profit-driven platforms within scholarly communications, as discussed in previous posts by Jefferson Pooley and Mark Hahnel. Some academics may prefer the open, community-driven nature of open-source technologies, believing these to be more aligned with core academic values. Yet rejecting all commercial platforms could mean cutting off your nose to spite your face, with open-source initiatives often hamstrung by technical and financial constraints that make them unsustainable.
Academia’s increasing reliance on these platforms to undertake a multitude of tasks – including carrying out, communicating, and measuring research and its impact – requires greater dialogue around sustainability. It is likely that popular third-party platforms used by the academic community such as Twitter, Facebook, Slideshare, Google Scholar, and YouTube will be here for some time. But what about the smaller niche tools that have been essential in changing and enhancing how academics carry out their work? One only has to look at Google Reader, PageFlakes, and the many others that are no longer in existence. Academia needs to be flexible and adaptable to the changes brought on by the shifting sands of technology, but it must also pay attention to the tools it loves the most but which might not be around tomorrow.
Originally published on the LSE Impact of Social Sciences Blog
This work is licensed under a Creative Commons Attribution 3.0 Unported License unless otherwise stated.

Wednesday, 6 June 2018

New research needs to be better reported and librarians can help with that

Andy Tattersall

I delivered a talk at the ‘This is not a Fake News Conference’ hosted by London South Bank University on June 5th. The conference was aimed at librarians and information professionals but also had members of the academic publishing sector, journalists and academics in attendance. My slides and abstract are below and you can read more about the conference here.

New research needs to be better reported and librarians can help with that

Scientific research is increasingly being given coverage and attention in the media. The problem is that the media often fail to acknowledge who actually carried out the research and to link to a publicly available version of that work or its institute. This can lead to misreporting (sometimes intentional) and biased news coverage, whilst academics, collaborators and institutions do not get the credit they deserve. As the REF and impact agenda become increasingly important, so does the accurate reporting and collection of such impact, through tools such as altmetrics and media monitoring. Without citing and linking back to the work, it becomes harder to track a story as it takes on a life of its own through social media and reposts. Linking to the research makes it harder to misreport or cherry-pick facts and stats, as interested parties are able to check the facts for themselves. At a time when we have been told ‘people have had enough of experts’ and world leaders are denouncing scientific fact, proper and accurate reporting of research has never mattered more. There are a few important things librarians can do to support the better reporting of research, by encouraging linking to open access versions and exploring how research is received through altmetrics. This talk will explore the issue and what can be done to tackle it.

Monday, 4 June 2018

Andy Tattersall and Mark Clowes write for the latest issue of MmIT Journal

Andy Tattersall
Mark Clowes
Andy Tattersall and Mark Clowes have written articles for the latest edition of the MmIT Journal. Andy Tattersall has penned a piece on using the superb software Adobe Spark to create short animated videos, whilst he has co-authored a piece with Mark on their work setting up a pop-up radio station to support the Sheffield-based charity Inspiration for Life and its yearly 24 Hour Inspire event.

Both articles can be read in the latest issue online. MmIT is the open access journal for the CILIP special interest group Multimedia Information Technology. You can find out more about MmIT here.

Thursday, 17 May 2018

Andy Tattersall to deliver Keynote at the Business Librarians Association Conference

Information Specialist Andy Tattersall is one of the three keynote speakers at this year's BLA Conference, taking place at Swansea University. The three-day conference runs from 27th-29th June with the theme 'Making Waves'. Andy will deliver his keynote on the 28th with a talk titled "Staying afloat in a sea of technological change".

The other two keynote speakers are Michael Draper, Associate Professor in Law, Swansea University, and Professor Sally Bradley, Academic Lead in Accreditation, Award and Recognition, HEA, and Professor, Sheffield Hallam University.
The conference web page and booking details can be viewed here

Thursday, 19 April 2018

Andy Tattersall's talk on Altmetrics at The British Psychological Society Research Day

Andy Tattersall was invited to give a talk about Altmetrics at The British Psychological Society Research Day held at the impressive Senate House Library in March. The recording of the talk and his participation in the final panel discussion can be viewed below. The slides are also at the end of this blog post.

Tuesday, 10 April 2018

Many a true word is spoken in jest, part two: more social media content that mocks, self-ridicules, and brings a smile to academia

Andy Tattersall
Two years ago, Andy Tattersall highlighted those Twitter accounts that offered some light relief from the often all-too-serious world of academia. This 2018 instalment includes an account “sadly” overlooked last time, as well as moving beyond the Twittersphere to share some of the best memes, videos, and more that provide sharp commentary on peer review, academic advisors, and altmetrics.

In April 2016 I wrote about the growing number of parody Twitter accounts that take the best and worst of academia and serve it up as a comedy dish. As the title suggests, many a true word is spoken in jest but we all know that just below the surface lie the real home truths of our industry. The problem, however, for many academics trying to be “witty”, is that they can fall flat on their face. I thought it would be good to visit some of the other tongue-in-cheek academic excursions that capture the weird and wonderful within academia.

When I wrote the first post it was solely focused on the Twitter community, and sadly neglected to include one of the scholarly Twitterati’s most vocal protagonists - @ScientistTrump. When my post went live I was flattered to receive a tweet from Donald Trump, PhD calling my piece “biased” as it had not included him - he even concluded his tweet with one of the real Donald Trump’s trademark sign-offs: “SAD”. Thankfully the Trump obsession with fake news was not yet in full flow, but I am sure the post and the LSE Impact Blog would have been labelled as such. Whoever is behind this great account - and it is the greatest scientific Twitter account - has expanded to a full website and a forthcoming web store. Not wanting to inflate that already fully blown narcissistic ego any more, but the tweets are that of a very stable genius and reflect the kind of communication that is typical of President Trump but with a scientific slant applied. For example, in December Donald Trump, PhD proudly reported:
His supporters will no doubt still be keen to see that wall built to ensure academic literature stays out of the public domain.

Given the daily communications coming out of the White House, it is not hard to satirise the 45th President of the United States. Putting an academic spin on The Donald is not so easy but Psychologist Matt Crawford made a good go of it with a fictional paper he published. The paper titled “A title for a really great piece of research, just the best, really” is full of classic Trump boasts, so much so that you will hear Donald’s voice inside your head as you read it.

Donald Trump’s tweets might make you feel outraged, but imagine how your social media stream would have looked with Hitler kicking and screaming across the web? Putting an academic slant on it, how would he have dealt with scientific peer review? Thankfully someone took that much-parodied scene in Hitler’s bunker from the film Downfall and re-subtitled it to show how Hitler would have responded to negative comments from the third reviewer. After a raging tirade, the Führer concedes that maybe he should just submit to one of those new “open access” journals.

Captioned images shared across the web, better known as memes, also offer much light-hearted humour that only those within academia will truly get. Some of the sharpest include the popular Boromir (Lord of the Rings) and Willy Wonka memes, alongside the tweets from Research Wahlberg and the “Hey girl. I like the library too.” blog.

A personal favourite comes from that most cosmic of sages, Yoda:

Some of the finest moments can be found by searching “academic memes” on Google Image Search or Pinterest.

Every institution has professors who are dapper in their fashion choices and those who look like they have crawled out of a hedge before heading into work. Prof or Hobo tests your ability to spot the professors from the tramps. I was made aware of the quiz by a professor in reference to one of his peers who proudly wears his dishevelled look as a badge of honour, actively trying his best to look like he lost a fight with a bear. The site features ten images and for each you simply have to choose whether the man in question is a professor or a hobo. Just remember that looks can be deceiving.

Whilst we are on the topic of chairs, there are also the kind you sit on to conduct your research. In case you wondered what happened to them after they were retired from duty, they appear on the Sad Chairs of Academia blog. Before they are dispatched to that great office in the sky, they are captured for one last time for this most surreal of blogs. I’m waiting for the best images to be compiled into a 2019 calendar.

Metrics and social media are never far away from academic discussion, and both are valuable tools in communicating and gauging interest in a piece of research. The two are combined perfectly to calculate the satirical Kardashian Index, where a scientist’s citations are compared to their followers on Twitter. Of course citations and Twitter followers are no true measure of a researcher’s worth, but a high Kardashian Index score could indicate popularity over productivity. We are eagerly awaiting the Kanye West Index.
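For the curious, the index does have an actual (tongue-in-cheek) formula in Neil Hall’s 2014 paper: expected followers are modelled as F(c) = 43.3 × C^0.32 for C total citations, and the K-index is a scientist’s actual follower count divided by that prediction. A quick Python sketch, assuming those figures from the paper:

```python
# Satirical Kardashian Index (Hall, 2014): how does a scientist's
# Twitter following compare with the following "expected" from
# their citation record?

def expected_followers(citations: int) -> float:
    """Followers predicted from total citations: F(c) = 43.3 * C^0.32."""
    return 43.3 * citations ** 0.32

def kardashian_index(actual_followers: int, citations: int) -> float:
    """K-index: actual followers divided by citation-predicted followers."""
    return actual_followers / expected_followers(citations)

# A researcher with 1,000 citations but 5,000 followers.
# The paper's joke cut-off labels anyone with K > 5 a "Science Kardashian".
k = kardashian_index(actual_followers=5000, citations=1000)
print(f"K-index: {k:.1f}")
```

Run the numbers the other way - a heavily cited researcher with a tiny following - and the K-index drops below one, which Hall suggested meant the scientist deserved more followers, not fewer citations.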

For most people publishing in the academic sphere, you will no doubt receive regular invites to write for predatory journals. Whilst this issue becomes increasingly problematic, there are a few things you can do to tackle these charlatans whilst also having a bit of fun. One idea is to use the tool Re:Scam, which is part of the New Zealand online safety website Netsafe. This tool bounces replies back to scam emailers and keeps them tied up with computer-generated emails. Whether this will work with those actually sending out the phishing messages will be hard to tell, but it’s certainly worth a try in case any are bots. If that fails you can do as I did (in my lunch break) after receiving several requests to publish in a dubious fisheries and agriculture journal. I sent them a PDF-formatted manuscript with the word “fish” repeated 6,000 times, with a few fishy references to Jacques Cousteau and Michael Fish thrown in too, in addition to a table of different fish. For some reason they did not accept. Nor did they ever contact me again. Funny that.

The article originally appeared on the LSE Impact of Social Sciences Blog

Wednesday, 28 March 2018

Calling Australia!

Members of the Information Resources team recently hosted an online course for librarians based in Australia.   Led by Anthea Sutton, the FOLIO programme has been delivering web-based CPD courses to library and information professionals for over a decade.

Recently, FOLIOz (see what we did there?) has been partnering with ALIA, the Australian Library and Information Association to offer bespoke training catering for the needs identified by its members.

For the latest course, on Evidence-Based Library and Information Practice (EBLIP for short), Anthea was joined by a small team including Andrew Booth, Helen Buckley Woods and Mark Clowes to design and deliver the course content (which included video lectures, readings and assessed coursework), as well as to facilitate the group discussion boards and host two live webinars (a particular challenge given the time difference between ourselves in the UK and our participants "down under"). We were also delighted to welcome Professor Alison Brettle (from Salford University) to deliver a guest lecture on the future of EBLIP.

The course attracted participants from a range of sectors, including education and public libraries as well as from health - all keen to apply an evidence-based approach to solving problems and achieving best practice in the settings of their different services.

As one delegate commented: "This course is right on point as far as the skills I need to develop so our unit can reach its goals."

If you are interested in discussing how FOLIO could help with the training needs of your library/information team, please get in touch with us at

Wednesday, 21 March 2018

What Can Altmetric.com Tell Us About Policy Citations of Research? An Analysis of Altmetric.com Data for Research Articles from the University of Sheffield

Andy Tattersall
Chris Carroll
Andy Tattersall (ScHARR Information Resources) and Dr Chris Carroll (ScHARR Health Economics and Decision Science) have published a new paper in Frontiers in Research Metrics and Analytics. The paper looked at published University of Sheffield research and what the data says about the impact of its research on national and international policy. The percentage of outputs with at least one policy mention compares favourably with previous studies, while huge variations were found between the time of publication and the time of the first policy citation. However, some problems with the quality of the data were identified, highlighting the need for careful scrutiny and corroboration.
Altmetrics offers all kinds of insights into how a piece of research has been communicated and cited. In 2014, Altmetric.com added policy document tracking to its sources of attention, offering another valuable insight into how research outputs are used post-publication. At the University of Sheffield we thought it would be useful to explore the Altmetric.com data for policy document citations to see what impact our work is having on national and international policy.

We analysed all published research from authors at the University of Sheffield indexed in the Altmetric.com database: a total of 96,550 research outputs, of which we were able to identify 1,463 pieces of published research cited between one and 13 times in policy. This represented 0.65% of our research outputs. Of these 1,463 artefacts, 21 were cited in five or more policy documents, with the vast majority – 1,185 documents – having been cited just once. Our sample compared very well with previous studies by Haunschild and Bornmann, who looked at papers indexed in Web of Science and found 0.5% were cited in policy, and Bornmann, Haunschild and Marx, who found 1.2% of climate change research publications had at least one policy mention. From our sample we found 92 research articles cited in three or more policy documents. Of those 92 we found medicine, dentistry and health had the greatest policy impact, followed by social science and pure science.
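As a rough sketch of the tallying described above, the share of outputs with at least one policy mention and the distribution of citation counts could be computed like this. The citation counts here are made up for illustration; they are not the real Altmetric.com data for the 96,550 Sheffield outputs.

```python
from collections import Counter

# Hypothetical per-output policy citation counts (toy data only).
policy_citations = [0, 0, 0, 1, 0, 2, 0, 0, 5, 1]

cited = [c for c in policy_citations if c > 0]
share = len(cited) / len(policy_citations) * 100
print(f"{share:.1f}% of outputs have at least one policy mention")

# How many outputs were cited once, twice, five times, and so on.
print(Counter(cited))
```

The same two summaries – overall share and the long tail of single citations – are the headline figures reported in the study.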

We also wanted to explore the time span between publication and policy citation for research published by the University of Sheffield. We looked at the time lag and found it ranged from just three months to 31 years. This highlighted a long tail of publications influencing policy, something we would have struggled to identify, prior to Altmetric.com, without manual trawling. The earliest piece of research from our sample to be cited in policy was published in 1979 and took until 2010 to receive its first policy citation. We manually checked the records because many publications dated pre-1979 turned out to have been published much later, often this century. This is likely due to misreported data in the institutional dataset giving a false date, highlighting the need to manually check such records for authenticity. The shortest time between research publication and policy citation was a mere three months: a paper published in November 2016 and first cited in National Institute for Health and Care Excellence (NICE) policy in January 2017.
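The time-lag calculation is simple date arithmetic once each output is paired with its first policy citation. A minimal sketch, using hypothetical records whose dates mirror the two extremes reported above (a roughly three-month lag and a 31-year lag):

```python
from datetime import date

# Hypothetical records pairing a publication date with the date of the
# first policy citation (titles and dates are illustrative only).
records = [
    {"title": "paper A", "published": date(2016, 11, 1),
     "first_policy": date(2017, 1, 1)},
    {"title": "paper B", "published": date(1979, 6, 1),
     "first_policy": date(2010, 6, 1)},
]

lags_in_years = []
for r in records:
    days = (r["first_policy"] - r["published"]).days
    lags_in_years.append(days / 365.25)
    print(f'{r["title"]}: {days / 365.25:.1f} years to first policy citation')
```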

The reports are only as good as the data they analyse, and our research did uncover some errors. Looking at those 21 papers with five or more policy document citations, we found seven were not fit for inclusion. One such example was identified when we discovered research papers had been attributed to the University of Sheffield when the authors were not, in fact, affiliated to the university. As this data is sourced from our research publications system, we assume this was a mistake made by the author; this can happen when authors incorrectly accept as their own papers suggested to them by the system. While this was almost certainly a genuine error, and may have been rectified later, the system had not yet updated to take account of such corrections. Another of these papers was mistakenly attributed to an author who had no direct involvement in the paper but who was part of a related wider research project. Another of the publications was excluded because it had not, in fact, been cited in the relevant policy document. One of the papers that was included belonged to an author not at Sheffield at the time of publication, but who has since joined the institution. This showed that Altmetric.com's regular updates were able to discover updated institutional information and realign authors with their current employer.

The two most cited papers came from our own department, the School of Health and Related Research (ScHARR), in the field of health economics. Only two of the 14 most cited publications were in a field other than health economics or pure economics, both of which were in environmental studies. In total, the 14 most cited research outputs were cited by 175 policy documents, but we identified 9% (16) of these as duplicates. Of those 175 citations we found that 61% (107) were national, i.e. from the UK, and 39% (68) were international, i.e. from countries other than the UK or from international bodies such as the United Nations or World Health Organization.

Altmetric.com continues to add further policy sources to its database to trawl for citations. As a result, it should follow that our sample of 1,463 research outputs will grow not only with fresh policy citations, but also as older research citations are identified through new policy sources of attention. This work also highlights the importance of research outputs having unique identifiers so they can be tracked through altmetric platforms; it is certain that more of our research will be cited in policy, but if no unique identifier is attached, especially to older outputs, it is unlikely the system will pick it up.

Altmetric.com is a very useful indicator of interest in, and influence of, research within global policy. Yet there are clearly problems with the quality of the data and how it is attributed. We found one third of our sample of the 21 most cited research outputs had been erroneously attributed to an institution or author. Whether this is representative of the whole dataset, only further studies will tell. It is therefore essential that any future explorations of research outputs and policy document citations be double-checked and not taken at face value.
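The de-duplication and national/international split described above amount to filtering citation records on a policy document identifier and then classing each unique record by citing country. A toy sketch, with entirely made-up records and field names:

```python
# Hypothetical policy citation records; duplicates are dropped on the
# policy document ID, then each unique citation is classed as national
# (UK) or international.
citations = [
    {"policy_id": "p1", "country": "GB"},
    {"policy_id": "p1", "country": "GB"},   # duplicate record
    {"policy_id": "p2", "country": "US"},
    {"policy_id": "p3", "country": "GB"},
]

seen, unique = set(), []
for c in citations:
    if c["policy_id"] not in seen:
        seen.add(c["policy_id"])
        unique.append(c)

national = sum(1 for c in unique if c["country"] == "GB")
print(f"national: {national}, international: {len(unique) - national}")
```

In the study this step removed 16 duplicates (9%) from the 175 policy citations of the 14 most cited outputs before the 61%/39% split was calculated.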

This blog post is based on the authors’ article, “What Can Altmetric.com Tell Us About Policy Citations of Research? An Analysis of Altmetric.com Data for Research Articles from the University of Sheffield”, published in Frontiers in Research Metrics and Analytics (DOI: 10.3389/frma.2017.00009).

The blog post was originally written for the LSE Impact Blog and is licensed under a Creative Commons Attribution 3.0 Unported License unless otherwise stated. The original article appears here 