Thursday, 16 February 2017

Gone In 60 Seconds

Mark Clowes
Mark Clowes braves the tough crowd that is the HEDS Section Meeting to give a one-minute presentation about his work.

One of the stranger things about working in an academic school rather than a traditional library is the departmental meeting, where you can find yourself sitting alongside people whose jobs have very little in common with your own.

The ScHARR Information Resources team sit within the Health Economics and Decision Science section of ScHARR, surrounded by systematic reviewers, economists, modellers and statisticians. To help these different professional groups understand each other better, section meetings begin with quickfire one-minute presentations from each group, known as "Gone In 60 Seconds".

As a Nicolas Cage fan (I even watch his really bad films, and God knows there are plenty) - and, not least, because it was my turn - I agreed to take part. But what would be "gone" in those 60 seconds?  My career prospects?  My credibility with colleagues (if I ever had any)?  Would I be challenged for hesitation, repetition or deviation?

I enjoy giving presentations and don't usually get too nervous; but the audience for this one (and the timing, with the expiry date of my contract approaching) made me particularly keen to impress. 

I decided to give a first airing to a topic on which I hope to present at one or two conferences in the summer: using a text-mining and data-visualisation app (VOS Viewer) to deal with the large number of references retrieved by a systematic review. I chose this topic to demonstrate to colleagues that IR staff are continuously experimenting with new technology and ways of working, and because it would have relevance to all the different groups in the room, since it has the potential to influence the scope of future review projects. An added bonus was that I could display some pretty images of the "heat maps" produced by VOS Viewer on the screen, which would take the audience's eyes off me.

The short format required more preparation than usual - generally I don't like to work from a script, preferring to maintain a conversational tone and improvise around bullet points - but my initial attempts to do so on this topic ran significantly over time.

In the end, I realised I was going to have to write out what I wanted to say in full - initially using free writing with pen and paper, then gradually refining and paring it down until I could beat the kitchen timer countdown (this was one of those tasks I could only have done working at home - colleagues would have thought I'd lost the plot if they'd seen me walking around reciting the same presentation over and over again).

I didn't want it to be a dry, technical presentation (in any case, there wasn't enough time to explain in depth how the software worked), so instead I came at it from the angle of "why is this useful?" - i.e. for dealing with the common problem of retrieving too many references to sift in the traditional way, yet references potentially too important to ignore.

On the day, I think it went pretty well - people seemed engaged with what I was saying, although a slight technical hitch with my slides meant that I didn't quite manage my closing sentence before I (5...) was (4....) ruthlessly (3...) cut (2...) off (1...)


Monday, 6 February 2017

The Systematic Review Toolbox

Anthea Sutton
Last week I was invited to demonstrate the Systematic Review Toolbox (SR Toolbox) at our in-house Systematic Reviews Issues and Updates Symposium (SYRIUS) at ScHARR (School of Health and Related Research) at The University of Sheffield. The symposium provides an opportunity for researchers to get together and share updates of systematic review methodological work being undertaken in ScHARR, so it was a great opportunity to promote the SR Toolbox and the resources it contains. The SR Toolbox slot on the symposium programme consisted of a short presentation introducing the toolbox with a potted history of its creation and development. This was followed by a live demonstration, which included some tips on using the toolbox. Here are a few of those tips:
1.  You can use “Quick Search” to search for more than just the tool name
You can use “Quick Search” to search for tools by name, but you can also use it to search the titles and descriptions of tools. For example, if you’re interested in finding tools for critical appraisal, type “critical appraisal” into the “Quick Search” box and you will find tools that mention the phrase in either the title or description of the tool. However, be aware this is exactly what it says it is, a “Quick Search”, so if you want a comprehensive list of all the critical appraisal tools in the toolbox, be sure to use “Advanced Search”.

2.  In “Advanced Search” selecting more than one feature will search for the features using “AND”
When using “Advanced Search”, it is important to note that if you select more than one feature, the toolbox uses the Boolean Operator “AND” and will return tools that meet all the features you selected.

3.  If you want to browse all the tools in the toolbox…
For Software Tools, tick the “Any” box underneath where it says “Check ‘Any’ if not concerned about any specific features”.
For “Other Tools”, check all 4 “Find me” boxes.

4.  The toolbox provides references to tool-related journal articles where available
The tool records in the toolbox link to/reference journal articles where they are available. This might be an article about the development of the tool or a review of using the tool by a systematic reviewer who’s tried it out. If you know of any articles relating to tools in the toolbox, please get in touch and we will update the tool record accordingly.

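The “AND” behaviour described in tip 2 can be pictured as a simple filter: a tool is returned only if it offers every feature you selected. A minimal sketch of that logic (the tool names and features below are invented for illustration; the real toolbox is a web application, not this code):

```python
# Hypothetical catalogue: each tool record carries a set of features.
tools = [
    {"name": "ToolA", "features": {"screening", "deduplication"}},
    {"name": "ToolB", "features": {"screening"}},
    {"name": "ToolC", "features": {"screening", "deduplication", "meta-analysis"}},
]

def advanced_search(tools, selected):
    """Return the names of tools offering ALL selected features (Boolean AND)."""
    wanted = set(selected)
    return [t["name"] for t in tools if wanted <= t["features"]]

# Selecting two features narrows the results to tools that have both.
print(advanced_search(tools, ["screening", "deduplication"]))  # ['ToolA', 'ToolC']
```

Selecting more features can therefore only shrink the result list, which is why a long feature selection may return nothing at all.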
I concluded the session by discussing the “community-driven” aspect of the toolbox.  Systematic reviewers and tool developers are encouraged to submit tools to the toolbox via the “Add a New Tool” feature.  The remit of the toolbox is to catalogue both software and other types of tools/supporting mechanisms (such as checklists, guidelines and reporting standards).  So if you discover a new tool that meets these criteria, please share it with your systematic review peers and colleagues by submitting it to the toolbox, which will help to continue the development of this really useful resource.

This post originally appeared on the Systematic Reviews Toolbox website and has been reproduced with permission.

Friday, 3 February 2017

Disentangling the academic web: what might have been learnt from Discogs and IMDB

Andy Tattersall
In recent years there has been huge, rapid growth in the number of online platforms and tools made available to academics carrying out their research activities. However, for many, such choice can lead to decision fatigue or uncertainty as to what is most appropriate. Andy Tattersall reflects on the success of Discogs and IMDB and considers what problems a similar site dedicated to academic research might help to solve; from version control and unique identifiers to multiple, diverse research outputs and improved interactions with data.
Academia can always learn a lot from the rest of the world when it comes to working with the web. The project 101 Innovations in Scholarly Communications is a superb case study, highlighting the rapid growth in academic and associated web platforms. As a result there is an increasing problem for academics when they come to choose a platform or tool for carrying out their work on the web. Choice is good, but too much can lead to decision fatigue and anxiety over having to adapt to more and more new tools and make decisions as to their value. In the last decade various organisations, academics and start-ups have noticed gaps in the market and created tools and websites to help organise and communicate the work of academics. This is now arguably having the negative effect of researchers not knowing where to invest their time and energy in communicating, sharing and hosting their work, as no one can use every platform available. Even when many of them are linked together, issues remain around their maintenance and use.
In hindsight, academia could have learned from two successes of the internet era. Discogs and the Internet Movie Database (IMDB) are two of the most popular websites on the planet. Each is authoritative and seen as the ‘go to’ platform for millions of users interested in music and film respectively. IMDB is ranked at #40 and Discogs at #799 in Alexa, a global internet ranking index of websites. IMDB was one of the original internet sites, launched in 1990, with Discogs arriving a decade later in 2000. Whilst there are other similar websites, there are few that even come close to their user numbers and the huge amount of specialised content they host. By contrast, academia has tried desperately to place large swathes of information under umbrellas of knowledge, but it all feels a bit too much like herding cats.
Image credit: Tangled Weave by Gabriel. This work is licensed under a CC BY 2.0 license.
Academia has always made use of the web to have discussions, host research and institutional websites, but has failed to control the number of newer platforms that promise to be an essential tool for academics. Over the last decade – and notably in the last five years – hundreds of tools that aim to enhance a researcher’s workflow, visibility and networks have been created. Many of these do indeed offer a service: Figshare hosts research outputs; Mendeley manages references; and Altmetric.com tracks attention. They are all superb and offer something befitting academia in the 21st century. The problem for many academics is that they struggle to engage with these tools due to the overwhelming number to choose from. If you want to manage references, do you use Endnote, Mendeley, Zotero, RefMe, ReadCube or Paperpile? If you wish to extend your research network, do you sign up for ResearchGate, Google Scholar, Academia.edu, Piirus, LinkedIn or even Facebook? This is before we tap into the more niche academic social networks. Then there is the problem of visibility; how do you make sure your fellow academics, the media, fund holders or even members of the public can find you and your work? ORCiD obviously solves some of this, but it can be seen as a chore and another profile that needs configuring and connecting.
As research in the 21st century continues on its current trajectory towards openness and impact, and as scholarly communications develop, there will no doubt be yet more tools and platforms to deal with all that content and communication. If we think about making data accessible and reusable, post-publication open peer review, as well as making other research outputs available online, we may see a more tangled web than ever before.

What Discogs could teach us

As with so many of the post-Web 2.0 academic interactive platforms, content is driven by the users - in this case academics and supporting professionals. Of course, a large number of formal research platforms have remained as they were, hosted by institutions, research bodies, funders and publishers. Yet more and more research outputs are being deposited elsewhere, such as GitHub (which has a comparable internet ranking to IMDB), Figshare, Slideshare, ResearchGate and Google Drive, to give just a few examples.

How can we compare the research world with Discogs?

In my mind, Discogs is not too dissimilar to the research world and all of its outputs. Listed below are some of the similarities between them. Those who have used Discogs will hopefully make the connection more quickly than those who have not.
A comparison of academia and Discogs 
IMDB and Discogs can be searched in various ways, all of which allow a person to drill deeper into an area of the database or move around serendipitously using the hyperlinks. So with Discogs you may know the title of a song but not the artist, or you may know what label it was released on. You may also be keen to track down a particular version of a release based on geographical, chronological or label data. The search functions of Discogs may not be as complex as those of a research database such as Medline, but for the typical Discogs user this is not essential.
Image of Discogs webpage
What are the big problems a Discogs or IMDB-type site could solve?
Version control
With growing interest in academic publishing platforms that capture the various stages of publishing research, there is a problem of ensuring those searching for that research find the version they really want. We have the final, peer reviewed, accepted and formatted version; the report the paper may have contributed to; the pre-print; the early draft; the research proposal; and the embryonic research idea. Research platforms such as ROI aim to capture as much of this research process as possible.
Unique identity
ORCiD is a great tool for aligning the work of one person to their true identity (especially so for early career researchers or academics who change their name mid-career, for example). You do not have to have a common surname like Smith, Brown, Taylor or Jones to be mistaken for another researcher; academics with less common names have this problem too. If a researcher publishes using their middle initial and then without it, multiple identities can be created in some databases, and tying them all together is not always straightforward and can be time-consuming. In Discogs, an artist or band is listed with all name variations collected under the most commonly used title. ORCiD allows this too, but sadly the problem is already very extensive.
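The Discogs approach of collecting every name variation under one most-commonly-used title can be pictured as a simple lookup table. A minimal sketch (the names below are invented for illustration; real services such as ORCiD resolve identities through registered identifiers, not a string table like this):

```python
# Hypothetical variant table: every known form of a name maps to one
# canonical identity, Discogs-style.
variants = {
    "J. A. Smith": "Jane A. Smith",
    "Jane Smith": "Jane A. Smith",
    "Smith, J.": "Jane A. Smith",
}

def canonical(name):
    # Unrecognised names fall back to themselves until a variant is registered.
    return variants.get(name, name)

# Three differently-credited papers resolve to a single author identity.
bylines = ["J. A. Smith", "Jane Smith", "Smith, J."]
print({canonical(n) for n in bylines})  # {'Jane A. Smith'}
```

The weakness the post describes is visible even in this toy version: the table only helps once the variants have been registered, and for many researchers the duplicate identities already exist.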
Additional research outputs
The mainstay of academic output is the journal paper but that is not the case for some areas of research. There are artistic performances, computer code, software, patents, datasets, posters, conference proceedings, and books, among others. Some stand alone, whilst there are increasing numbers of satellite outputs tied to one piece of research. For example, in Discogs we might think of the LP album as the definitive item and of the single, EP or digital release as outputs resulting from that. For research this may be the report or journal paper with attached outputs including a poster, dataset and conference presentation.
Interaction with the research data
Both Discogs and IMDB allow users to interact with their huge databases of information. Users can set up accounts, add music and films to their personal collections, leave reviews and contribute knowledge. To flip that into an academic context, that might mean users saving research artefacts to a reference management package, leaving open peer review comments and contributing their own insights and useful resources.
Such a platform would not operate in isolation, as there would still be a need for other connected web presences to exist: social media, such as Twitter, to communicate to wider audiences; publication platforms to host all of the published research; tools to manage references and track scholarly attention. Other tools would also be needed to help conduct the research, analyse and present results and data, create infographics, take lab notes, collaborate on documents and create presentations. Then there is the issue of who would oversee such a huge database, manage it and ensure it is kept largely up to date. Of course, with something similar to Discogs and IMDB anyone could enter and update the content, with proper accreditation, an audit trail and moderation. Such a platform would have been accessible to funders, charities and the public, with certain limitations on access to certain content. Hindsight is a wonderful thing, and given how IMDB and Discogs have grown into such well-known and well-used platforms, it is a shame that the same did not happen in academia to help create such a central hub of knowledge and activity.
Note: This article gives the views of the author, and not the position of the LSE Impact Blog, nor of the London School of Economics. Please review our comments policy if you have any concerns on posting a comment below.
About the author
Andy Tattersall is an Information Specialist at the School of Health and Related Research (ScHARR) and writes, teaches and gives talks about digital academia, technology, scholarly communications, open research, web and information science, apps, altmetrics and social media. In particular, how these are applied to research, teaching, learning, knowledge management and collaboration. Andy is a member of the University of Sheffield’s Teaching Senate and a Senior Fellow of the Higher Education Academy. He was the person who sparked interest in running the first MOOCs at his institution in 2013. Andy is also Secretary for the Chartered Institute of Library and Information Professionals – Multi Media and Information Technology Committee. He has edited a book on Altmetrics for Facet Publishing which is aimed at researchers and librarians. He tweets @Andy_Tattersall and his ORCID ID is 0000-0002-2842-9576.
Creative Commons Licence
This work was originally published on the LSE Impact Blog and is licensed under a Creative Commons Attribution 3.0 Unported License unless otherwise stated.