Established in 1994, ScHARR's Information Resources team has become a key national player in providing information support to health technology assessment and health services research. The team is made up of professional, highly trained Information Specialists who work at the forefront of research, teaching, support and development. This is our blog, where we talk about the diverse work we do: #Teach #Research #Search #Support
Andy Tattersall has created a new series of short instructional videos to help you make the most of Mendeley. The series of 15 videos, called 'Mendeley Masterclass', helps users find their way through the various web, desktop and mobile versions. More videos will be added at a later date, but you can see a couple for yourself below. Mendeley is a very worthwhile and simple tool to use, and it really helps academics and students save time and stay organised whilst studying and conducting research. The videos are only a few minutes long, so with a cup of tea and half an hour you could master the software easily. The videos are on YouTube and will be added to the University of Sheffield iTunes U collection later this year.
The problem many detractors have with altmetrics as a concept is that it seems heavily focused on numbers that may or may not be meaningful. Andy Tattersall sees this as a legitimate concern but argues researchers should consider further what can be gained from these scores, or indeed, the lack of one. In a world increasingly governed by impact and the dissemination of your research, the straight flat zero indicates an opportunity and a possible need to communicate your work.
A lot has been written in the last couple of years about altmetrics and the scores that come with them, whether that be the Altmetric.com, ResearchGate or Kudos score, to name but a few. The tools focus on different areas, with Altmetric.com being one that tries to capture a broad range of data from scholarly and public communications. With that comes its own Altmetric.com score, which is weighted depending on which platform was used. For example, a Tweet is worth one point, a blog post five and a news article eight. As with so many of these metrics, including traditional ones like the impact factor, h-index and citation count, the bigger the number the better. With Altmetric.com that may be good but not wholly useful, as small numbers, especially zero, can tell us a lot too.
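The weighted-sum idea behind such a score can be sketched in a few lines of Python. This is purely illustrative: the real Altmetric.com algorithm is more sophisticated (it also considers factors such as who is mentioning the work and how widely), and only the three per-source weights named above come from the text.

```python
# Per-source weights mentioned in the text (illustrative only;
# Altmetric.com's real algorithm weighs additional factors).
SOURCE_WEIGHTS = {"tweet": 1, "blog": 5, "news": 8}

def attention_score(mentions):
    """Sum weighted mention counts; unknown sources count as zero."""
    return sum(SOURCE_WEIGHTS.get(source, 0) * count
               for source, count in mentions.items())

# A paper with 10 tweets, 2 blog posts and 1 news article:
print(attention_score({"tweet": 10, "blog": 2, "news": 1}))  # 28

# And the case the article dwells on: no mentions at all.
print(attention_score({}))  # 0
```

The second call makes the article's point concrete: a zero is not an error state but simply the absence of any recorded mention.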
The real value of altmetrics does not come from the score but from the fact that they measure previously ignored research outputs, such as individual papers and datasets. They also show us where these outputs are being communicated, or, in the case of an altmetric score of 0, not communicated. The score of zero is to some extent more important than 50, 100 or higher. It tells us that this research has not been shared, discussed, saved or covered in the media. In a world increasingly governed by impact, scholarly communication and the dissemination of your research, the straight flat zero indicates a possible need to communicate your work. Of course, detractors of such systems will point to examples such as the Kardashian Index and argue that science is not about popularity or the ability to communicate beyond a niche group; it is about the ability to complete rigorous, quality research, i.e. research that is captured in journals, repositories and data management systems and shared at conferences. Yet when so many systems for global scholarly communication exist, why not use them?
Image credit: Håkan Dahlström Photography Flickr CC BY
Given that many research papers are never cited, it should follow that they will never be Tweeted, shared, blogged or saved in Mendeley. Yet as the research world increasingly uses social media to communicate with the wider world, whether that be publishers, charities, funders or the general public, the ease with which academics can communicate their work is apparent. If a researcher's altmetric score is 0 it may seem depressing to think no one has shared or communicated their work, but it does offer them a starting place in this new world. Unlike citations, it is an instant feedback loop; whether you want to act upon it remains your choice.
Whilst critics may be wary of the gaming of altmetric scores, and rightly so, the number 0 tells us something potentially important: either no one knows your research exists and is yet to discover it, or, sadly, no one is interested in it. Obviously we cannot say this for sure, as we are only talking about active participants on the web, whether that be a discussion forum, blog, news site or social media. There are many academics not engaged on the social web who will one day, with the aid of a literature search or conference presentation, discover your work.
At least with an altmetric score of 0 you can only go up; no one can get a negative altmetric score. So this means investigating who to share your research with, and where. The problem detractors have with altmetrics is the concern that we are focusing just on the numbers. It is a legitimate concern: how many Tweets your paper gets is not an indication that it is a good piece of research. Yet that has always been the case; a high number of citations has not always indicated high-quality research. The chances are that it is a good piece of research, but we can take nothing for granted in academia. To put it into a sporting context, in cricket, having the highest batting average or scoring the most runs in a team has never been proof of being the best player. It does give us an insight that we are looking at one of the best of the bunch, but it is merely a useful indicator. This is what altmetrics are: indicators of communication and interest at varying levels. So the concern is that funders, managers and journals might start to pay too much attention to big numbers. This in turn might cause some to game the system to increase those numbers, but that was always the case pre-altmetrics: journal editors have been known to ask authors to cite papers from their own publications, and it's not unheard of for authors to self-cite.
Whilst some of this might sound like counter-arguments to altmetrics, it is not. We do need to have discussions about what we want from altmetrics. Many academics would be lying if they claimed they were not interested in where their research was being discussed on the web. The useful by-product of altmetrics is that we have a much better idea if our research is not being discussed at all. The score 0 may be as significant as 1,000 for some academics, as it tells them that no one is talking about their research.
The bigger problem is that it has become confusing now that several platforms generate their own metrics: Altmetric.com, ResearchGate and more public tools such as Klout all have their own scores. It could start to feel like the early days of eBay, before commercial companies set up profiles and generating a 100% feedback score became paramount. These days it is not so important; a big feedback score on eBay means nothing more than that the seller has sold lots of stuff, and one negative makes no difference. Whilst academics will become increasingly aware of the newer metrics, some may be shocked by the succession of zeros next to their outputs, especially when so many could be highly cited. The solution is to explain why this happens, should they wish to build on that score. The scores do not convey the quality of their work or standing, but for those wanting to reach out, or looking for feedback on how this is going, the number 0 is a sign that the only way is up.
This blog post originally appeared in the LSE Impact of Social Science Blog and is republished under a Creative Commons CC Attribution 3.0 Licence.
Books are like buses: you wait ages for one to come along and then two arrive at the same time, or something like that. This is the case in Information Resources, as Andy Tattersall and his colleague Anthea Sutton, alongside fellow ScHARR library and information guru Andrew Booth, have published books.
Anthea and Andrew's book, Systematic Approaches to a Successful Literature Review, is a second edition of their popular book for Sage and an essential read for anyone wanting to conduct a high-quality literature review. Fellow ScHARR colleague and previous member of Information Resources Diana Papaioannou also contributed to the title, which came out this month. Meanwhile, Andy has delivered an edited book for Facet that looks at altmetrics and their potential for research and libraries.
Systematic Approaches to a Successful Literature Review
Showing you how to take a structured and organised approach to a wide range of literature review types, this book helps you to choose which approach is right for your research. Packed with constructive tools, examples, case studies and hands-on exercises, the book covers the full range of literature review techniques.
New to this edition:
· Full re-organization takes you step-by-step through the process from beginning to end
· New chapter showing you how to choose the right method for your project
· Practical guidance on integrating qualitative and quantitative data
· New coverage of rapid reviews
· Comprehensive inclusion of literature review tools, including concept analysis, scoping and mapping
With an emphasis on the practical skills, this guide is essential for any student or researcher needing to get from first steps to a successful literature review.
Altmetrics: A practical guide for librarians, researchers and academics
Andy Tattersall has published an edited book for Facet Books on the topic of altmetrics. The book features a chapter from the ever-busy Andrew Booth and fellow Information Resources member Claire Beecroft. There are also contributions from Euan Adie at Altmetric.com, Ben Showers, who has published previously for Facet on the topic of bibliometrics, and a chapter from William Gunn at Mendeley.
The book also came out this month and hopes to bridge the gap between theory and practice, advising library and information professionals and academics on how they can best make use of altmetrics.
This book gives an overview of altmetrics, its tools and how to implement them successfully to boost and measure research outputs.
New methods of scholarly communication and dissemination of information are having a huge impact on how academics and researchers build profiles and share research. This groundbreaking and highly practical guide looks at the role that library and information professionals can play in facilitating these new ways of working and demonstrating impact and influence.
Altmetrics focuses on research artefact level metrics that are not exclusive to traditional journal papers but also extend to book chapters, posters and data sets, among other items. This book explains the theory behind altmetrics, including how it came about, why it can help academics and where it sits amongst current measurements of impact.
During the second week of June, the University of Sheffield supported two teams of staff to walk over 120 miles over six days along the Trans Pennine Trail to raise funds to support refugee academics and students here at the University of Sheffield. More details are available here.
From the IR Group, Sonia Rizzo and Louise Preston joined these teams on the 'One Day Challenge', an 18-mile walk from North Sheffield, via the outskirts of Barnsley and Rotherham, through Meadowhall and back to Sheffield. Neither of us had ever done anything of this magnitude before, and we were feeling quite nervous but excited about the challenge ahead.
We set off early on the Friday morning from the picturesque surroundings of Tankersley Premier Inn! We were very keen to make good progress on the walk and found ourselves part of the group at the front. This meant that we were required to navigate the sometimes hard-to-find signposts of the Trans Pennine Trail as well as keeping a steady pace. Even though the sun didn't shine for us, the rain held off, and it was great to meet and share time with other University of Sheffield colleagues.
Louise with a herd of deer behind her, in Wentworth Park (Mile 6)
We were fuelled by lovely homemade fruit cake and a strong flask of coffee, but as the walk got tougher, our thoughts turned to those colleagues who had already walked over 100 miles and, more importantly, those we were raising money for. We finally finished the walk after 5 hours and 40 minutes, and despite being told that it wasn't a race, it was nice to feel that we had really challenged ourselves.
The walk was a fantastic experience for a really deserving cause, made more resonant by the death of Jo Cox MP the day before the walk. We carried her memory with us and we were both utterly delighted to raise just short of £500 before we started on the Friday morning. Our total now stands at nearly £600 and if you would still like to sponsor us, our page is here.
Hello, my name is Joanna Hewson. I recently spent two weeks at ScHARR on my Year 10 work experience, working five days with RDS (Research Design Service) in the Innovation Centre and the other five days with IR (Information Resources) in Regent Court. Over the course of the two weeks, I have gained so many new skills and have got a real insight into working life.
In RDS, I learned about how the NIHR (National Institute for Health Research) can help with public involvement and funding in people's research, and found out what it is like to do different jobs in the Research Design Service, such as what it is like to work in DTS (Design, Trials and Statistics) and what the CTRU (Clinical Trials Research Unit) does. I also helped set up for an event and afterwards collated the information into a spreadsheet. I also produced a spreadsheet, table and list of organised dates for the Volidays scheme. This was interesting, and I feel I benefitted from it by learning about jobs that I wouldn't have known existed without this placement.
In IR, I found out about the different jobs they do as well, such as working on ScHARRHUD (http://www.scharrhud.org/) or doing infographics in the library. I also learned about e-learning and created some worksheets for online students. As well as this, I found out about the libraries on the campus and got a tour of a few. I also searched different databases to collect information on a topic for research; I feel this is where I learned a new skill in being able to use databases such as MEDLINE and Web of Science. I've really enjoyed the fact that people have given me work to do that may be new and sometimes challenging for me, and not just small, easy jobs.
I've had a really great time on this placement, and everyone I have met or spoken to has been kind, welcoming, helpful and patient with me if I didn't understand something. It has been an extremely valuable experience and has given me an idea of what I may do when I leave school.
Andy Tattersall has an edited book coming out in June on the topic of altmetrics. Altmetrics: A practical guide for librarians, researchers and academics is published by Facet Books. As part of the book launch, Andy has created a short video explaining altmetrics, in addition to writing a blog post for CILIP, which you can read in full below. The book can be pre-ordered and purchased from various outlets: Facet, Waterstones and Amazon.
Altmetrics: What they are and why they should matter to the library and information community
Altmetrics is probably a term that many readers of this blog will have heard of without being quite sure what it means or what impact it could have on their role. The simple answer is that altmetrics stands for alternative metrics.
When we say alternative, we mean alternative to the traditional metrics used in research and by libraries, such as citations and journal impact factors. Altmetrics are by no means a replacement for traditional metrics; rather, they draw out more pertinent information tied to a piece of academic work. A new way of thinking about altmetrics is to refer to them as alternative indicators.
Scholarly communication is instrumental to altmetrics
There is also a focus on scholarly communication, as altmetrics are closely tied to established social media and networks. Scholarly communication is instrumental to altmetrics and much of what they set out to measure. The tools involved include Twitter, LinkedIn and blogs, as well as others such as Mendeley and Slideshare.
The main protagonists of the altmetrics movement include ImpactStory, which was set up by Jason Priem, who coined the term 'altmetrics'. They are joined by Figshare, Altmetric.com, Mendeley, PLOS and Kudos, amongst others. These were mostly established by young researchers who were concerned that research was being measured on the grounds of just a few metrics: metrics that gave an unbalanced view of research and did not take into account the technologies that many academics were using to share and discuss their work.
Altmetrics is not just about bean counting, though obviously the more attention a paper gets, whether that be citations or Tweets, the more interesting it may be to a wider audience of academics, students or the wider world. More Tweets do not necessarily mean a paper is better quality than those that are Tweeted less, but the same applies to traditional metrics: more citations do not always mean a great piece of research, and can occasionally highlight the opposite.
Altmetrics provide an insight into things we have not measured before
What altmetrics set out to do is provide an insight into things we have not measured before, such as social media interaction, media attention, global reach and the potential to spot hot topics and future pieces of highly cited work. In addition, altmetrics allow content to be tracked and measured that in the past had been wholly ignored, such as datasets, grey literature, reports, blog posts and other content of potential value.
The current system recognises a slim channel of academic content in a world that is constantly diversifying at a much faster pace than ever. The academic publishing model has struggled to catch up with the modern world of Web 2.0 and social media, and therefore academic communication has been stunted. Tools such as Twitter, blogs and Slideshare have allowed researchers to get their content onto the Web instantly, often before they have released it via the formal channels of conferences and publications.
Tools such as ImpactStory, Figshare and Altmetric.com look at the various types of scholarly content and communication and provide metrics to help fund holders, publishers, librarians, researchers and other aligned professionals get a clearer picture of the impact of their work.
Fund holders can see where their funded research is being discussed and shared, as can researchers, who may discover their research is not being talked about, which at least gives them reason to act on that. Publishers can see, in addition to existing citations, how else their papers are being discussed and shared. Library and information professionals have an important part to play in all of this.
What is the role of the library and information professional?
There are certain roles in the library and information profession that have plenty to gain from becoming involved with altmetrics. Firstly, those who deal with journal subscriptions and hosting content in repositories can gain new insight into which journals and papers are being shared and discussed. This becomes increasingly important when making yearly subscription choices while journal and book funds are being constantly squeezed. Obviously this is not a solution or get-out clause for librarians deciding which subscriptions to cancel, as you should not always pick the most popular journals at the expense of minority, niche journal collections, but altmetrics do offer a new set of indicators when making those tough budgetary decisions.
LIS professionals are often technically proficient, and for those who deliver outreach services and support for academics and students there is much they can do to help explain the new forms of scholarly communication and measurement. Many library and information staff are expert users of social media and tools such as Slideshare, Mendeley and blogs. Library and information professionals also often occupy a neutral role, so they can make informed decisions on the best way to help staff discover and communicate research. These skills are slowly starting to spread within the academic community, and LIS professionals are in an ideal position to capitalise on altmetrics.
Certainly how academic outputs are measured in the future is anyone’s guess. We could move away from metrics to something that focuses on case studies, or move more towards open public peer review of research. Certainly the impact factor and citation indexes are with us for the foreseeable future. It’s likely we will see an amalgamation of systems with some regarded as more uniform and formal than others.
As each month passes we see another set of tools appear on the Web promising to help researchers share, communicate and discover research, so we could be at risk of information overload and decision fatigue when it comes to choosing the right tools for the job. The reality is that we are unlikely to discover a magic silver-bullet solution for how we measure scholarly work. All of the options offer something, and if they can be designed and coerced to work together better, scholarly communication and measurement could reach a plateau of productivity.
Yet this requires an awful lot more engagement from the academic community, one that is already under pressure from various angles to deliver research and extract from it examples of impact. Nevertheless, altmetrics clearly look like they are here to stay for the mid-term at the very least and are gaining acceptance in some parts of the research and publishing sphere.
For now I suggest you investigate Figshare, ImpactStory, Mendeley and Altmetric.com, to name but a few, in addition to signing up for an Altmetric.com librarian account and installing their web bookmarklet.
To summarise, if we were to draw a Venn diagram with social media in one bubble and metrics in another, we would clearly see librarians in the overlapping area alongside altmetrics. It's really down to whether you want a share of that space.
In the latest issue of the Health Information and Libraries Journal, Andy Tattersall writes the editorial on big data and why library and information professionals should take notice of it. Big data is a much-used term these days, yet its definition varies depending on who you talk to. Dan Ariely, in a Facebook status update, crudely but accurately compared it to teenage sex: "Everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it." In addition, who owns big data, or, more importantly, whose role is it to oversee and look after these large datasets, increasingly hosted on publicly accessible websites? Certainly there is much scope for librarians to get involved in big data, as it falls under the remit of research data management, a role often carried out in the library or associated departments. The abstract of the article is below; subscribers to the journal can read the full editorial, or at some point find the preprint full text via the White Rose Repository.
Big data, like MOOCs, altmetrics and open access, is a term that has been commonplace in the library community for some time yet, despite its prevalence, many in the library and information sector remain unsure of the relationship between big data and their roles. This editorial explores what big data could mean for the day-to-day practice of health library and information workers, presenting examples of big data in action, considering the ethics of accessing big data sets and the potential for new roles for library and information workers.