Monday, 31 October 2016

ScHARR Information Resources Group at ISPOR European Congress

Image credit: Viennascape by Nic Piégsa. This work is licensed under a CC BY 2.0 license.

This week, ScHARR Information Resources Group will be represented at the ISPOR 19th Annual European Congress by Suzy Paisley (Director of Innovation and Knowledge Transfer) and Anthea Sutton (Information Resources Group Manager).

Along with other members of the HEDS team, Suzy and Anthea are contributing to the busy conference programme with a poster presentation each. Suzy's poster is on the topic of "Identifying Early Biomarkers of Acute Myocardial Infarction in the Biomedical Literature: A Comparison of Text Mining and Manual Sifting Techniques", while Anthea will be representing the Systematic Review Toolbox, in collaboration with Chris Marshall from YHEC. Both posters can be viewed between 08:45 and 14:15 on Monday 31st October, with an author discussion hour at 13:15; do come along and find out more about this work.

For the remainder of the congress, Suzy and Anthea can be found, along with the rest of the HEDS expert team, at exhibition stand no. 20/21. This is a great opportunity to find out how you can work with ScHARR to support key strategic developments in your organisation, collaborate with us on research, and participate in our diverse, world-class learning and teaching programmes.

Full details of ScHARR HEDS congress activity can be found here. We'll be tweeting from @ScHARR_IKT during the congress, so please follow us there.

Friday, 28 October 2016

LSE Book Review of Altmetrics: A Practical Guide for Librarians, Researchers and Academics



Andy Tattersall
Altmetrics: A Practical Guide for Librarians, Researchers and Academics, edited by Andy Tattersall, provides an overview of altmetrics and new methods of scholarly communication and how they can be applied successfully to provide evidence of scholarly contribution and improve how research is disseminated. The book, which draws on the expertise of leading figures in the field, strongly encourages library and information science (LIS) professionals to get involved with altmetrics to meet the evolving needs of the research community, finds Nathalie Cornée.

Altmetrics: A Practical Guide for Librarians, Researchers and Academics. Andy Tattersall (ed.). Facet Publishing. 2016.



Back in 2010, a new field of scholarly communication research was burgeoning: altmetrics. Altmetrics (initially standing for alternative metrics) are part of the broader range of scholarly metrics, such as the impact factor, citation counts or the h-index. They primarily aim to provide an indication of online and social media attention to any research output (as opposed to established metrics, which focus mainly on peer-reviewed publications) by capturing its social influence. In doing so, they aim to improve our understanding of how information about research propagates, how it is used and how scholars are engaging with these new forms of scholarly communication.

Today, altmetrics are no longer regarded as alternative, but rather as complementary to traditional metrics. Many advocate their use as ‘early indicators’ of article usefulness. Indeed, research that used to take months or years to reach readers can now find them almost instantly via blogs, Wikipedia, social media networks, etc. Activities that used to be hidden, such as reading or downloading a paper, are now visible and therefore traceable (see Ben Showers, Chapter Four). Many stakeholders within academia are looking for new ways to measure how outputs are consumed online before they even start accruing citations, which takes years in most disciplines.
In Altmetrics, the authors begin by explaining where altmetrics sit within the research landscape; the importance of research evaluation for scholarship and employment decisions, benchmarking purposes, funding opportunities and so on; and the notion of prestige or influence that is deeply rooted within academia.

Chapter Three, ‘Metrics of the Trade – Where Have We Come From?’ by Andrew Booth, provides a comprehensive review of the established metrics, and is a must-read for anyone less familiar with the broader world of scholarly metrics. By explaining their goal as well as their actual use in assessing individuals, groups or journal performance, Booth opens up the context in which altmetrics started to flourish and the gap that they have been trying to fill.
Throughout the book, Andy Tattersall insists on the cultural shift that academia has witnessed over recent decades, namely since the development of the Internet and its related technologies (including MOOCs, Big Data and Open Access). Even though scholars were initially relatively quick to adopt some of the new means of communication that the digital world had to offer, such as email, Tattersall reminds us that the vast majority of scholars tend to be rather apprehensive about using some of the new means of scholarly communication: firstly because of the deluge of technologies and platforms now available to them, and secondly because they rightly question their validity.
Image credit: dirkcuys, CC BY-SA 2.0
While the first half of Altmetrics focuses mainly on setting the scene for new scholarly communications, the second half tends to emphasise the vital role that library and information professionals can play in helping staff discover and communicate research and ultimately reinforce their outreach activities within their own institutions.
LIS professionals are clearly the primary target audience of this book, even though academics, publishers, funders and other stakeholders in the research evaluation process could apply its recommendations to some extent. That said, throughout the book the authors stress how well-suited librarians are to supporting researchers. Indeed, librarians have developed a key presence within the research cycle as experts in managing academic content, whether through collections, subscriptions or institutional repositories. They are also highly regarded within the academic community for their advice on copyright issues and their support with information discovery and literacy, and have more recently become very proficient in facilitating open access publishing.

Chapter Ten, ‘The Connected Academic’, particularly struck a chord with me, as Tattersall relays some of the major and very legitimate questions scholars tend to have about altmetrics, including ‘is this system good quality?’, ‘is this system stable?’ and ‘why use this technology, could it just be a fad?’ (141). If LIS professionals can answer such questions from academics (or research administrators) not with purely technical answers but with responses tailored to each individual case, they will strengthen their relationships and role within their organisation. I would have liked this chapter to go even further and provide success stories of LIS professionals doing just that, as every scholar will have different reasons for developing their online presence.

As the subtitle of Altmetrics stresses, it aims to be ‘practical’. Chapter Eight, ‘Resources and Tools’, written by Tattersall, provides a short introduction to 41 resources, including the major altmetrics tools as well as many social media platforms, some with an academic focus and others more mainstream. The list was useful in itself, as some of the resources were new to me, but the real difficulty we face as LIS professionals is convincing our academics of how valuable these tools can be and advising them which ones to select and invest time in. Here again, concrete examples of scholars who have developed strategies and workflows that effectively combine these outreach activities of sharing, connecting and measuring would have been beneficial. Tattersall does, however, include some helpful tips and tricks, such as identifying a ‘twin’ to demonstrate the value of disseminating research and engaging online (150). The author defines a ‘twin’ as a scholar's highly respected peer, based in another organisation, who has been successfully active on social media platforms.

The field of altmetrics has grown exponentially, to the point that altmetrics are now considered part of the basket of metrics recommended by the Leiden Manifesto and the Metric Tide report, both of which advocate the use of responsible metrics. The field has also attracted a great deal of groundbreaking research into the opportunities and challenges altmetrics bring in terms of their meaning and validity. Altmetrics: A Practical Guide for Librarians, Researchers and Academics is very welcome, as it is one of very few textbooks revisiting the theory behind the growth of altmetrics, providing a comprehensive snapshot of what they look like today and demonstrating their value when applied in a meaningful manner. All in all, this is a worthwhile read, especially for any LIS professional interested in improving their understanding of altmetrics.




Nathalie Cornée is LSE Library's Research Information Analyst. In her role, Nathalie focuses on providing support and training in all aspects of bibliometrics and citation analysis to researchers, administrative and research support staff, helping them to understand how metrics are calculated and how they can be used to maximise the visibility and exposure of their research findings.
Note: This review gives the views of the author, and not the position of the LSE Review of Books blog, or of the London School of Economics. 

Creative Commons Attribution-NonCommercial-NoDerivs 2.0 UK: England & Wales 

Thursday, 27 October 2016

An Introduction to using Social Media to Communicate Research - 1-day course: Thursday, 24th November 2016





Andy Tattersall
Claire Beecroft
Andy Tattersall and Claire Beecroft are running a one-day workshop in Sheffield: An Introduction to using Social Media to Communicate Research. The workshop aims to give academics and aligned professionals a comprehensive guide to social media and how it can be applied in scholarly communications.

The treadmill of academia is a relentless one: propose, research, write, present and then hopefully publish, before starting all over again, all in the hope that the research is recognised as being of good quality, worthy and valuable. There's one problem, though: journals are not geared up for the modern online world of instant sharing and communication. The workshop will cover tools and ways of communicating research such as Twitter, YouTube, ResearchGate, Slideshare, blogging, infographics, animation and many others. The good news is that they are mostly free and can work together to help research reach a wider audience. That audience is not just academic peers, but publishers, editors, fund holders and the general public.

Course Overview
The aim of the workshop is to offer an introduction to the many tools you can use to help you communicate research and work smarter. The purpose of the day is to help attendees come away with a variety of tools and artefacts they can use to communicate and share their work. We will teach you the basics of social media in an academic setting and demystify some of the barriers that may have put you off using these tools in your work.

We will show you how to make the most of these technologies and how to find alternative ways of discussing and communicating research. Attention will be paid to the various ethical and practical issues of working more on the web, from copyright and Creative Commons to making more use of your mobile device, and from safety and security to how you conduct yourself online and netiquette.

Who will benefit from this course?
This short course will benefit a wide range of people, including (but not limited to):


  • Researchers,
  • Masters and PhD students,
  • Research Support Staff and Managers,
  • Library and Information Professionals,
  • Communications and Marketing Professionals.

Date and Times
1-day course:  Thursday, 24th November 2016
Start:  9:30 am
Finish: 4:30 pm

Fees

£400 - Standard Rate for confirmed bookings

Booking and Payment


Provisional bookings are now being accepted. Please email scharr-scu@sheffield.ac.uk to reserve your place. You will then be contacted when the course has gone live on the Online Store, where all our bookings are processed.

All our bookings are processed via our Online Store. Payment is by Credit/Debit Card or PayPal. If you are a UK organisation and would prefer to be invoiced, then please select this option on our Online Store and ensure that all invoice details are provided (contact email address, full address, purchase order number) and also forward a copy of the Purchase Order to scharr-scu@sheffield.ac.uk.
The last booking date for this course is midnight on Sunday, 13th November 2016.
If you have any queries regarding our booking process then please do not hesitate to contact us.

Meals and Accommodation

The course fee includes tuition, lunch and refreshments throughout the day, and all course materials provided on USB. NB: accommodation is NOT included.
If you have any particular dietary or access requirements, please inform the Short Course Unit at the time of booking.

Venue


Halifax Hall Hotel & Conference Centre

Endcliffe Vale Road, Sheffield, S10 3ER.
www.halifaxhall.co.uk

Contact

For further information, please do not hesitate to contact the Short Course Unit via email at scharr-scu@sheffield.ac.uk or call +44 (0)114 222 2968.








Wednesday, 12 October 2016

The number behind the number: suggesting a truer measure of academic impact

A close colleague and friend of Information Resources, Dr Chris Carroll, has just published a paper on the limitations of citation counts, in particular the practice of counting a citing paper only once, however many times it cites the work within its text. Chris wrote about his paper on the excellent LSE Impact Blog and we thought it would be of interest to visitors of the Information Resources Blog. We have republished it under their Creative Commons licence. Dr Carroll is a member of HEDS and worked previously as an information specialist as part of our team.
The limitations of simple ‘citation count’ figures are well-known. Chris Carroll argues that the impact of an academic research paper might be better measured by counting the number of times it is cited within citing publications rather than by simply measuring if it has been cited or not. Three or more citations of the key paper arguably represent a rather different level of impact than a single citation. By looking for this easily generated number, every researcher can quickly gain a greater insight into the impact (or not) of their published work.
The academic research and policy agenda increasingly seeks to measure and use ‘impact’ as a means of determining the value of different items of published research. However, there is much debate about how best to define and quantify impact, and any assessment must take into account influence beyond the limited bounds of academia, in areas such as public policy. Within academia, it is generally accepted that the number of times a paper is cited, the so-called ‘citation count’ or ‘citation score’, offers the most easily measured guide to its impact. The underlying assumption is that the cited work has influenced the citing work in some way, but this metric is also viewed as practically and conceptually limited. The criticism is loud and frequent: such citation scores do not tell us how a piece of research has actually been used in practice, only that it is known and cited.
Image credit: beyond measure by frankieleon. This work is licensed under a CC BY 2.0 license.
Is there a better solution?
So, the obvious answer is to see if there is an easy way to measure how a paper has been used by its citing publications, rather than simply recording whether it is cited (the standard ‘citation score’). I did this by using one of my own frequently-cited papers as a case study and applying a basic metric: impact of the case study paper was considered to be ‘high’ if it was cited three or more times within the citing publication; ‘moderate’ if cited twice; and ‘low’ if cited only once.
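As a rough illustration of this classification rule, here is a minimal Python sketch. The publication names and within-publication citation counts are invented purely for illustration; they are not data from the paper.

```python
from collections import Counter

def impact_level(within_publication_citations: int) -> str:
    """Classify the key paper's impact on one citing publication,
    using the thresholds described above: >=3 high, 2 moderate, 1 low."""
    if within_publication_citations >= 3:
        return "high"
    if within_publication_citations == 2:
        return "moderate"
    return "low"

# Hypothetical data: how many times each citing publication cites the key paper.
citing_publications = {
    "Citing study A": 1,
    "Citing study B": 3,
    "Citing dissertation C": 7,
    "Citing study D": 2,
}

# Summarise the 'citation profile' across all citing publications.
profile = Counter(impact_level(n) for n in citing_publications.values())
total = sum(profile.values())
for level in ("low", "moderate", "high"):
    share = 100 * profile[level] / total
    print(f"{level}: {profile[level]} publication(s), {share:.0f}%")
```

Run against a real list of citing publications (for example, one exported from a citation database), the same loop would produce the kind of distribution reported below.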
The case study paper had 393 unique citations by November 2015: 59% (230/393) of publications cited the paper only once, suggesting its impact on these publications was low; 17% (65/393) cited it twice, suggesting moderate impact; but 25% (98/393) cited it three or more times, suggesting that this paper was having a genuine influence on those studies. The citation frequency within this ‘high impact’ group of publications ranged from three to as many as 14 in peer-reviewed studies and 17 in academic dissertations (with their longer word count). Primary research studies published in peer-reviewed journals were the principal publication type across all levels of impact.
I also noted where these single or multiple citations appeared within these publications. Single citations tended to appear only in the introduction or background sections of papers. These instances appeared to be simple citations ‘in passing’, the necessary ‘nod’ to existing literature at the start of a publication. However, when there were three or more citations, they tended to appear across two or more sections of a paper, especially the methods and discussion sections, which suggests some real influence on the justification or design of a study, or the interpretation of its results. These findings on the location of the citations confirmed that the number of times a paper was cited within a publication really was a good indicator of that paper's impact.
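The same idea can be extended to record where the citations occur. The short sketch below, again with invented section labels and data, flags citing publications that mention the key paper across several sections as candidates for genuine influence, as opposed to a single citation in passing.

```python
# Hypothetical data: one entry per citation occurrence, labelled by the
# section of the citing publication in which it appears.
citation_locations = {
    "Citing study A": ["introduction"],
    "Citing study B": ["introduction", "methods", "discussion"],
    "Citing dissertation C": ["background", "methods", "methods", "results", "discussion"],
}

for publication, sections in citation_locations.items():
    spread = len(set(sections))  # number of distinct sections citing the key paper
    note = "cited across several sections" if spread >= 2 else "citation in passing"
    print(f"{publication}: {len(sections)} citation(s) in {spread} section(s) -> {note}")
```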
Pros and cons
A metric based on within-publication citation frequency is unambiguous and thus capable of accuracy. The assessment of where the citations appear in the citing publications also provided contextual information on the nature of the citation. In this case, it confirmed the viability of within-publication citation frequency as an impact metric. The approach is transparent and data can be verified and updated. The proposed metric, the ‘citation profile’, is not a measure of research quality, nor does it seek to measure other forms of impact or to address issues such as negative findings producing relatively lower citation scores. It merely seeks to contribute to the debate about how citation metrics might be used to give a more contextually robust picture of a paper’s academic impact.
Of course, even this ‘citation profile’ is not without its problems, but it is arguably less of a “damned lie” or “statistic” than some other metrics: only a more in-depth analysis of each citing publication can more accurately gauge a paper's influence on another piece of research or a policy document. Yes, there are also questions concerning this metric's generalisability: as with other bibliometrics, publications in different disciplines are also likely to have different citation profiles. However, these issues can be easily addressed by future research, given the simplicity of the metric.
If academics or funders want to improve their understanding of how research is used, then this easy-to-generate metric can add depth and value to the basic, much-maligned, and ‘damned’ ‘citation score’.
This blog post is based on the author’s article, ‘Measuring academic research impact: creating a citation profile using the conceptual framework for implementation fidelity as a case study’, published in Scientometrics (DOI: 10.1007/s11192-016-2085-0).
Note: This article gives the views of the author(s), and not the position of the LSE Impact Blog, nor of the London School of Economics. Please review our comments policy if you have any concerns on posting a comment below.
About the author
Chris Carroll is a Reader in Systematic Review and Evidence Synthesis at the University of Sheffield. His role principally involves synthesising published data to help inform policymaking, particularly in the fields of medicine and health, as well as the development of methods to conduct such work.

Tuesday, 4 October 2016

Librarian on the run!

Claire Beecroft
Continuing ScHARR Library's support for the Sheffield charity Inspiration for Life, Claire Beecroft (that's me) and Angie Rees from the ScHARR Library team will be running the 'challenging' (for me at least) TenTenTen 10k in Endcliffe Park and Whiteley Woods on Sunday 9th October. This is a trail race with four nasty hills and lots of opportunities to fall over in the mud. Yay!

Inspiration for Life is the charity set up to honour and celebrate the memory of Tim Richardson, a University of Sheffield colleague, who passed away from cancer on 5 February 2013. More recently, following the passing of another colleague, Victoria Henshaw, also from cancer, the charity has dedicated itself to supporting the memory and work of both these special people. They aim to promote lifelong learning and encourage public understanding of science through publications, lectures and other events, and they also directly support local cancer charities.

I have done a fair few races over the years, though nothing beyond 5k for a while, and you can hunt me down at Sheffield Hallam Parkrun most Saturday mornings. Here's me at last week's (just to prove I CAN run):


Image by Dougalpics

You can sponsor the team I'm running for at: https://mydonate.bt.com/events/timsteam2016

Posted by Claire.