
Wednesday, 18 July 2018

Nothing lasts forever: questions to ask yourself when choosing a new tool or technology for research

Andy Tattersall
Academia has become increasingly reliant on third-party tools and technologies to carry out many of the processes throughout the research lifecycle. But there are genuine concerns about the sustainability of some of these tools and what the implications would be for users in the event they were discontinued. Andy Tattersall suggests a series of straightforward questions researchers should ask themselves before choosing a new technology for use in their research. Can you export your content? Is there an alternative? After all, there is no guarantee your favourite tool will still be around tomorrow.
Academia has not always been good at adopting new technologies to aid research and teaching. Even a tool as seemingly popular and simple to use as Twitter has been received with some anxiety and trepidation within the scholarly community. There are various reasons for the slow uptake of new technologies, something not exclusive to the academic community, as captured in Everett Rogers’ Diffusion of Innovations. Technology continually changes, and the pressure of keeping up with it can cause inertia, leading some to bury their heads in the sand rather than engage with the changing environment. There are genuine concerns about the sustainability of tools we rely on in the academic community, with no guarantee that popular tools like Google Scholar or Twitter will be with us this time next year.

Adopting technologies that eventually cease business

There are several examples of really useful tools that were accepted by the academic community only to pull down the virtual shutters for good. It can be quite depressing to have invested time and energy in mastering a tool only for it to disappear offline. This may happen for a variety of reasons, such as a lack of investment (financial or development), slow uptake, or the founding individual moving on to a new venture. Those in academia want solid, factual reasons to utilise a new tool; if the one they currently use works fine, why switch to another they haven’t heard of? It can be like the problem of buying a new laptop: why purchase one now when you could buy one with double the processing power for the same price a year later? Sadly that attitude means you end up not moving on at all. Academia is about finding answers to problems and learning from previous mistakes – surely the same should apply to the very tools we use to achieve better outcomes?
There are several issues around adopting technologies to carry out, communicate, and analyse research, issues further complicated by the duplication of platforms or providers’ expansions into new areas of business. Take Mendeley, for example, which started as a social network and reference management tool but has since expanded into data-hosting and funding-search services.

The sad demise of useful platforms

Google Reader, PageFlakes, Readability, Silk and Storify have all ceased business in recent years despite demand for their services. In some cases this can be problematic for users as they have invested great amounts of time in curating their own content, particularly so in the case of personalised dashboard PageFlakes or data visualisation site Silk. Thankfully, for most of the aforementioned tools there were suitable alternatives and useful sites like alternativeTo, which directs users to similar options. In some cases the provider itself even pointed towards an alternative, such as Readability which used its front page to direct users to Mercury Reader. Others such as Storify proved more problematic, with no immediate like-for-like tool obviously available and Wakelet seeming the best alternative.

Choosing the right tool for the job

For anyone working with academics to adopt new tools, or for those more proactive academics wishing to explore new ways of working, there are several questions you should ask before adopting a new technology. For the most part these are straightforward and it is important to remember you may only use some technologies once.
  • Is it intuitive to use?
  • Is there an alternative?
  • Can you export your content?
  • What are they doing with your data?
  • How often will you use the technology?
  • Do you know anyone using this tool already?
  • Has the technology been around for long?
  • Who created the technology and who owns it?
  • Are the developers on social media and how often do they post new updates?

Nothing lasts forever

Academia is becoming increasingly reliant on technology, especially third-party tools, to carry out certain research processes. This has long been the case, with tools such as Dropbox or YouTube offering more functionality than in-house institutional platforms. With more tools comes greater diversity and potentially more problems. There is no guarantee we won’t see another dot.com crash like that of 2000, and this time academia would also feel its wrath. Many platforms, especially niche academic ones, are run by just a handful of staff or even students. They may have investors expecting a return on their capital, families with mouths to feed, or office bills to pay.
Another strand to this debate is the thorny subject of open-source versus profit-driven platforms within scholarly communications, as discussed in previous posts by Jefferson Pooley and Mark Hahnel. Some academics may prefer the open, community-driven nature of open-source technologies, believing these to be more aligned with core academic values. Yet rejecting all commercial platforms could mean cutting off your nose to spite your face, with open-source initiatives often hamstrung by technical and financial constraints that make them unsustainable.
Academia’s increasing reliance on these platforms to undertake a multitude of tasks – including carrying out, communicating, and measuring research and its impact – requires greater dialogue around sustainability. It is likely that popular third-party platforms used by the academic community such as Twitter, Facebook, Slideshare, Google Scholar, and YouTube will be here for some time. But what about the smaller niche tools that have been essential in changing and enhancing how academics carry out their work? One only has to look at Google Reader, PageFlakes, and the many others that are no longer in existence. Academia needs to be flexible and adaptable to the changes brought on by the shifting sands of technology, but it also needs to pay attention to the tools it loves the most but which might not be around tomorrow.
Originally published on the LSE Impact of Social Sciences Blog
This work is licensed under a Creative Commons Attribution 3.0 Unported License unless otherwise stated.

Monday, 8 January 2018

New research must be better reported – the future of society depends on it

Understanding how and why things happen can help people make sense of the world. Pexels
Andy Tattersall, University of Sheffield


Newspaper articles, TV appearances and radio slots are increasingly important ways for academics to communicate their research to wider audiences, whether that be the latest health research findings or discoveries from the deepest, darkest parts of the universe.
In this way, the internet can also help to facilitate these channels of communication – as well as discussions between academics, funders and publishers, and citizen scientists and the general public.
Yet all too often research-led stories start with “researchers have found”, with little mention of their names, institution and who funded their work. The problem is that reporting new research in this way fails to break down the stereotypical image of an ivory tower. For all readers know, these “researchers” might as well be wearing white lab coats with the word “boffin” on their name badges.

Rolling news

News is now a 24-hour operation. Rolling coverage of stories means journalists have their work cut out in maintaining this cycle. But that is no excuse for missing out important pieces of information that underpin a story.
Take for example a story relating to health research that has wide ranging societal impact. Supporting evidence, links and named academics help a story’s authenticity and credibility. And at a time when “fake news” is an increasingly sticky problem it becomes essential to link to the actual research and therefore the facts.



Accurate reporting, it’s not rocket science. Pexels

This is important, because research goes through a peer review process where experts in the same field of research critically assess the work before it can be published. This is similar to news stories that are edited to ensure they are of good quality – although this process takes far less time.

Accurate reporting

In academia there has been a huge move to make research openly available and therefore accessible for the whole of society. While research institutions are making great strides in public engagement and the wider understanding of science, media organisations still remain instrumental in that process.
And while it’s been claimed that the public are tired of experts, the impact they have on society – from building skyscrapers to keeping us alive – is undoubtedly fundamental to our existence.



Science and technology have changed the way we work, communicate, and view the world. Shutterstock

But poor or incomplete reporting undermines respect for experts by misrepresenting the research, especially by trivialising or sensationalising it. So while academics from various disciplines are often willing to talk to the media – either as an author or from an independent expert viewpoint – misreporting of research and particularly data (whether intentional or unintentional) has a negative effect.
Academics are then vilified as having something to hide or accused of making up their research, while members of the public are exposed to unnecessary anxiety and stress by inappropriate headlines and cherry-picked statistics that are reported in a biased way.

The public good

Of course, not everyone will want to check the citations and research outputs – and not everyone has the critical skills to assess a piece of specialised academic writing. Yet there are lots of people who, given the opportunity, would be interested in reading more about a research topic.
Media coverage opens up a democratic debate, allows people to explore the works of an accomplished researcher and helps the public understanding of science. And in this way, fair and accurate reporting of research encourages academics to be willing to work with the media more regularly and build good working relationships.
Not only that, but the proper and accurate communication of science is beneficial to the whole of society – from the government to its citizens. So in the age of “fake news” it is more important than ever to make sure that what’s being published is the truth, the whole truth and nothing but the truth.

Andy Tattersall, Information Specialist, University of Sheffield

This article was originally published on The Conversation. Read the original article.

Thursday, 14 September 2017

Cite Hacks - A new video series to support scholarly communications and digital academia, and (hopefully) gain a few extra citations

Andy Tattersall
Over the last couple of years I have created three series of videos to help researchers and academics make more out of technology and the web to support their work. The first series was Research Hacks, which appeared in 2015; Learn Hacks followed shortly after, and then last year App Hacks was launched. You might notice a bit of a theme here, but the purpose of these videos is to offer quick and simple suggestions for the progressive academic to work differently. They are part instructional and part inspirational, and focus on a myriad of technologies, tools, websites and opportunities. The videos are usually shorter than three minutes long and serve as an introduction to such topics and how I can help others take advantage of them.



Cite Hacks
Cite Hacks are about what academics can do to improve their chances of getting cited. More than that, the videos are about making your research easier to discover and exploiting fresh opportunities within digital academia. There is conflicting evidence as to the many ways you can improve citations, but these videos offer opportunities to explore much more. If you don't try then you won't know. The exercise of blogging, making data and research open, using social media and using better keywords and titles are all part of where academia is heading.



Cite Hacks Playlist

Wednesday, 29 March 2017

Following the success of the learning technologist, is it time for a research equivalent?

Andy Tattersall
With so many scholarly communications tools and technologies now available, how do academics decide which are most appropriate for their research? Andy Tattersall suggests it might be time for a research equivalent of the learning technologist, a role that has helped drive innovations in teaching underpinned by technologies. The research technologist would be embedded within the university department, make recommendations on appropriate online tools, provide technical assistance and also offer guidance on accompanying issues of ethics or compliance. With the right ongoing support, academics can improve the communication, dissemination and impact of their research.
The research cycle is changing rapidly and a lot of that change is due to the proliferation of technologies and websites that support the research process. Many of the most useful tools have been captured by Jeroen Bosman and Bianca Kramer in their excellent 101 Innovations in Scholarly Communications. Whilst this work is a great help to those aware of it, the reality is a majority of academics are either unaware of or unwilling to engage with the myriad tools and technologies at their disposal (beyond social networking sites such as Twitter, Facebook, ResearchGate, etc.). There are several reasons for this: workload and deadline pressures; fear of technology; ethical implications around their use and their application, especially when it comes to third-party software; or too much choice.
The usefulness of these tools has been recognised by major publishers, who have made certain strategic investments in order to create their own research cycle workflows. So if the likes of Elsevier are looking to use these tools to change the research ecosystem, this should be of great interest to anyone who publishes with them, right? But with so many tools available, how do academics navigate their way through them? How do they make the connection between technology and useful application? And who helps them chart these scary, unpredictable waters?
Image credit: A Multitasking Busy Guy by uberof202 ff. This work is licensed under a CC BY-SA 2.0 license.
Lecturers and teachers have their pedagogy; what do researchers have?
If we look at applications of technology and social media in teaching, we can see more clearly how things have been implemented. Post-2004, with the advent of Web 2.0, there was an increased uptake of technology in the teaching community. The advent of virtual learning environments aided this, with the ability to employ discussion forums, blogs, video and, more recently, social media. Of course research has also taken advantage of these tools, but the difference with teaching is that it was often led and facilitated by the learning technologist. This group of centralised, university-educated professionals helps drive teaching innovations that are underpinned by technology – the clue is in their job title. The technology itself does not drive the teaching innovation but can help initiate and improve on it. By championing technologies with teaching staff, technologists have helped refresh higher education, making it more fit for the 21st century. They have helped shape learning and teaching through approaches such as blended and flipped classes, video and screen capture, fresh forms of assessment, use of mobiles, and social media. In many cases the innovation is led by the lecturer but, as with research, in most cases it requires a good degree of guidance to get them there.
The research technologist
Whether we call it a research technologist or digital academic specialist, this role would not be too different from its learning technologist counterpart. It would support research and its dissemination in the use of video, animation, infographics, social media, online discussion, mobile device use, and social networks, to name just a few technologies. The learning technologist applies pedagogical reasoning for their technology choices, and the research equivalent would need to assess the same considerations. Not only that but good communication skills, information literacy, and an understanding of data protection, ethics, and what constitutes a good technology – and how it can be applied to a specific research setting in a sustainable and timely manner – are all essential. For example, the use of video to disseminate research around speech therapy would potentially be more useful than an infographic. In the same way, an infographic published in a blog post might be a better way of conveying the results of a public health project.
The reason why in-house support could benefit the practice and dissemination of research is that researchers are very pressured for time, and often don’t know what they need regarding research technologies and especially dissemination. Secondly, when they do know what they want, they often need it “as soon as possible”. These two problems are more solvable within the department, especially as researchers often don’t know where to go for specific help. The research technologist would be a designated, focused role, embedded within the department. They’d be a signpost to new ways of working, problem solving and, most importantly, be able to consider all issues of ethics and/or compliance when passing on advice. They’d become the “go-to” person for anyone wanting to use technology as part of their research.
More than just using technology
The issue of employing more technology in your research comes with various challenges. For example, with research that is sensitive, controversial or otherwise likely to attract negative attention, using social media does come with many issues. Instructing researchers to use Twitter to communicate their research is all well and good until they receive negative comments, especially abusive and threatening ones. Something like Twitter requires a technical explanation (e.g. how to use the block function or employ a dashboard like Tweetdeck) but also advice around negative comments, how, if and when to respond, when to block, and, in some cases, when to report to the platform, your institution or the authorities. Another example might be the copyright issues around ResearchGate or YouTube. Unless time is spent helping researchers understand how to use these tools and what the accompanying major issues are, those researchers will remain reluctant to use them at all. Additionally, the more those who use them have bad experiences, often through no fault of their own, the more likely others will see good reason to navigate around such opportunities. One bad experience on social media could put a researcher off using it for good. With the right ongoing support, these technologies can, in an impact-driven environment, help communicate and disseminate your research to wider audiences.
The role I am fortunate to have, information specialist, is akin to a learning technologist but I work more closely with researchers these days. My role was established a decade ago to look at how technologies can be leveraged to support my department. That extended to research and teaching staff, students and our own academic library. In that time I put my department on the path to their first MOOCs in 2013, edited a book on altmetrics, and championed Google Apps, as well as the use of video and social media on campus. Whilst I have seen the creation of new roles around learning technology, marketing and impact, there remain areas of support that fall between the cracks. This is where I pick up much of my work, supporting research and teaching colleagues around the use of video, infographics, social media and the many less attractive associated issues, like copyright, security, ethics, and the negative impact on productivity. I work closely with the centralised departments, which benefits all parties involved, and carry out some teaching and marking, and write the occasional paper. In effect I am a hybrid model that is, hopefully, better able to understand the needs of all involved, including the centralised departments that work so hard to support researchers.
Teaching has always required librarians, IT technicians, and marketing experts; the learning technologist does not replace these roles, but complements them. The establishment of learning technologists within departments has helped bring teaching forward to take advantage of new technologies. For the same to happen within research, institutions need to consider the learning technologist and explore whether there is value in developing an in-house research equivalent, a kind of “Swiss Army knife” professional who can exploit the burgeoning number of opportunities afforded by the many new technologies out there.
Originally published in the LSE Impact Blog and republished under a Creative Commons Attribution 3.0 Unported License

Friday, 3 February 2017

Disentangling the academic web: what might have been learnt from Discogs and IMDB

Andy Tattersall
In recent years there has been huge, rapid growth in the number of online platforms and tools made available to academics carrying out their research activities. However, for many, such choice can lead to decision fatigue or uncertainty as to what is most appropriate. Andy Tattersall reflects on the success of Discogs and IMDB and considers what problems a similar site dedicated to academic research might help to solve; from version control and unique identifiers to multiple, diverse research outputs and improved interactions with data.
Academia can always learn a lot from the rest of the world when it comes to working with the web. The project 101 Innovations in Scholarly Communications is a superb case study, highlighting the rapid growth in academic and associated web platforms. As a result there is an increasing problem for academics when they come to choose their platform or tool for carrying out their work on the web. Choice is good, but too much can lead to decision fatigue and anxiety over having to adapt to more and more new tools and make decisions as to their value. In the last decade various organisations, academics and start-ups have noticed gaps in the market and created tools and websites to help organise and communicate the work of academics. This is now arguably having the negative effect of researchers not knowing where to invest their time and energy in communicating, sharing and hosting their work, as no one can use every platform available. Even by linking many of them there are still issues around their maintenance and use.
In hindsight, academia could have learned from two successes of the internet era. Discogs and the Internet Movie Database (IMDB) are two of the most popular websites on the planet. Each is authoritative and seen as the ‘go to’ platforms for millions of users interested in music and film respectively. IMDB is ranked at #40 and Discogs at #799 in Alexa, a global internet ranking index of websites. IMDB was one of the original internet sites launched in 1990, with Discogs arriving a decade later in 2000. Whilst there are other similar websites, there are few that even come close to their user numbers and the huge amount of specialised content they host. By contrast, academia has tried desperately to place large swathes of information under umbrellas of knowledge, but it all feels a bit too much like herding cats.
Image credit: Tangled Weave by Gabriel. This work is licensed under a CC BY 2.0 license.
Academia has always made use of the web to have discussions, host research and institutional websites, but has failed to control the number of newer platforms that promise to be an essential tool for academics. Over the last decade – and notably in the last five years – hundreds of tools that aim to enhance a researcher’s workflow, visibility and networks have been created. Many of these do indeed offer a service: Figshare hosts research outputs; Mendeley manages references; and Altmetric.com tracks attention. They are all superb and offer something befitting academia in the 21st century. The problem for many academics is that they struggle to engage with these tools due to the overwhelming number to choose from. If you want to manage references, do you use EndNote, Mendeley, Zotero, RefMe, ReadCube or Paperpile? If you wish to extend your research network do you sign up for ResearchGate, Google Scholar, Academia.edu, Piirus, LinkedIn or even Facebook? This is before we tap into the more niche academic social networks. Then there is the problem of visibility; how do you make sure your fellow academics, the media, fund holders or even members of the public can find you and your work? ORCiD obviously solves some of this, but it can be seen as a chore and another profile that needs configuring and connecting.
As research in the 21st century continues on its current trajectory towards openness and impact, and as scholarly communications develop, there will no doubt be yet more tools and platforms to deal with all that content and communication. If we think about making data accessible and reusable, post-publication open peer review, as well as making other research outputs available online, we may see a more tangled web than ever before.

What Discogs could teach us

As with so many of the post-Web 2.0 academic interactive platforms, content is driven by the users, those being academics and supporting professionals. Of course, a large number of formal research platforms have remained as they were, hosted by institutions, research bodies, funders and publishers. Yet more and more research outputs are being deposited elsewhere, such as GitHub (which has a comparable internet ranking to IMDB), Figshare, Slideshare, ResearchGate and Google Drive, to give just a few examples.

How can we compare the research world with Discogs?

In my mind Discogs is not too dissimilar to the research world and all of its outputs. Listed below are some of the similarities between them. Those who have used Discogs will hopefully make the connection quicker than those who have not.
A comparison of academia and Discogs 
IMDB and Discogs can be searched in various ways, all of which allow a person to drill deeper into an area of the database or move around serendipitously using the hyperlinks. So with Discogs you may know the title of a song but not the artist, or you may know what label it was released on. You may also be keen to track down a particular version of a release based on geographical, chronological or label data. The search functions of Discogs may not be as complex as those of a research database such as Medline, but for the typical Discogs user this is not essential.
What are the big problems a Discogs or IMDB-type site could solve?
Version control
With growing interest in academic publishing platforms that capture the various stages of publishing research, there is a problem of ensuring those searching for that research find the version they really want. We have the final, peer reviewed, accepted and formatted version; the report the paper may have contributed to; the pre-print; the early draft; the research proposal; and the embryonic research idea. Research platforms such as RIO aim to capture as much of this research process as possible.
Unique identity
ORCiD is a great tool for aligning the work of one person to their true identity (especially so for early career researchers or academics who change their name mid-career, for example). You do not have to have a common surname like Smith, Brown, Taylor or Jones to be mistaken for another researcher; academics with less common names also have this problem. If a researcher publishes using their middle initial and then without, it can create multiple identities in some databases, and tying them all together is not always straightforward and can be time consuming. In Discogs, an artist or band is listed with all name variations collected under the most commonly used title. ORCiD allows this, but sadly the problem is already very extensive.
Additional research outputs
The mainstay of academic output is the journal paper but that is not the case for some areas of research. There are artistic performances, computer code, software, patents, datasets, posters, conference proceedings, and books, among others. Some stand alone, whilst there are increasing numbers of satellite outputs tied to one piece of research. For example, in Discogs we might think of the LP album as the definitive item and of the single, EP or digital release as outputs resulting from that. For research this may be the report or journal paper with attached outputs including a poster, dataset and conference presentation.
Interaction with the research data
Both Discogs and IMDB allow users to interact with their huge databases of information. Users can set up accounts, add music and films to their personal collections, leave reviews and contribute knowledge. To flip that into an academic context, that might mean users saving research artefacts to a reference management package, leaving open peer review comments and contributing their own insights and useful resources.
Such a platform would not operate in isolation, as there would still be a need for other connected web presences to exist: social media, such as Twitter, to communicate to wider audiences; publication platforms to host all of the published research; tools to manage references and track scholarly attention. Other tools would also be needed to help conduct the research, analyse and present results and data, create infographics, take lab notes, collaborate on documents and create presentations. Then there is the issue of who would oversee such a huge database, manage it and ensure it is kept largely up to date. Of course, with something similar to Discogs and IMDB, anyone could enter and update the content, with proper accreditation, audit trail and moderation. Such a platform would be accessible to funders, charities and the public, with certain limitations on access to certain content. Hindsight is a wonderful thing and, given how IMDB and Discogs have grown into such well-known and used platforms, it is a shame that the same did not happen in academia to help create such a central hub of knowledge and activity.
Note: This article gives the views of the author, and not the position of the LSE Impact Blog, nor of the London School of Economics. Please review our comments policy if you have any concerns on posting a comment below.
About the author
Andy Tattersall is an Information Specialist at the School of Health and Related Research (ScHARR) and writes, teaches and gives talks about digital academia, technology, scholarly communications, open research, web and information science, apps, altmetrics and social media. In particular, how these are applied to research, teaching, learning, knowledge management and collaboration. Andy is a member of the University of Sheffield’s Teaching Senate and a Senior Fellow of the Higher Education Academy. He was the person who sparked interest in running the first MOOCs at his institution in 2013. Andy is also Secretary for the Chartered Institute of Library and Information Professionals – Multi Media and Information Technology Committee. He has edited a book on Altmetrics for Facet Publishing which is aimed at researchers and librarians. He tweets @Andy_Tattersall and his ORCID ID is 0000-0002-2842-9576.
Creative Commons Licence
This work was originally published on the LSE Impact Blog and is licensed under a Creative Commons Attribution 3.0 Unported License unless otherwise stated.


Monday, 28 November 2016

Working with the media can be beneficial but linking to and citing your research should be compulsory

Andy Tattersall
It’s great when academic research is covered by the media but too often this coverage fails to link back to or properly cite the research itself. It’s time academics insisted on this and Andy Tattersall outlines the benefits of doing so. As well as pointing more people to your work, the use of identifiers allows you to track this attention and scrutinise where and how your research has been used. At a time when academic work is vulnerable to misreporting, such a simple step can help ensure the public are able to view original research for themselves.
 
Academics are increasingly being sold the benefits of working with the media as an effective way of gaining impact and presenting their work to a wider audience. Yet all too often media coverage of research has no direct link to the research it is referring to. The general public are used to seeing news stories that say ‘researchers have found’ or ‘researchers from the university of’, yet these reports are often lacking when it comes to linking to or citing the actual research. Academics dealing with the media should make a point of insisting on linking to their original research outputs where applicable, as there are several benefits to doing so. Given that Oxford Dictionaries just named ‘post-truth’ as their word of 2016, we need to do everything we can to ensure fact retains its importance in the reporting of research.


Allow the public to see for themselves what the researchers found

How research is framed in the media can be very important as not all research is reported accurately. Giving links so that readers can fact-check is almost effortless if the corresponding academic insists on this at the point of writing the story. Of course this depends on how accessible the research is, but there should be a link to the open access version or at the very least the abstract of the research. Certain national newspapers are very good at cherry-picking parts from a piece of research to provide an attention-grabbing headline. This can be extremely problematic in the reporting of health news, and websites such as the NHS’ Behind the Headlines address the misreporting of health news stories. The problem is that most people reading the news are not aware of such resources, but adding the original link to the research in the hypertext, or as a reference at the end of the paper copy, gives readers direct access to the published work. Of course that does not mean they will read the original work, but it does open up the possibility. It also saves interested parties from trying to track down the original paper, the title of which is rarely reported in full, so what is lost by adding the links to the research? Remember, it is much harder for a journalist to misreport your work if you insist on linking to what you actually wrote.

Newspaper Stand by Yukiko Matsuoka.  CC BY 2.0 license.
Track mentions of your research

Tools such as Altmetric.com, Kudos and ImpactStory use unique identifiers to track the attention a piece of published research receives. So when someone publishes a peer-reviewed research article it receives a digital object identifier (DOI), or it could be a PubMed ID, ISBN, or other such identifier. If a piece of research is covered in the media and there is no link to the research via these identifiers it can miss out on being picked up by altmetric tools. The researchers may know about this coverage, and perhaps their institution’s media team might too, but what about departmental peers, managers, colleagues in the research office or library? What about the funders? All of these are interested parties and coverage in the media, whether this is a specialist research blog or an international publication, is worthy of attention, especially when we are trying to capture that elusive ‘impact’.
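To make this concrete, here is a minimal sketch of how a researcher (or their library or research office) might look up the attention a single DOI has attracted, using what I understand to be Altmetric's free public details endpoint. The endpoint path and response field names are my reading of the public API and should be checked against the current documentation; the DOI shown is just a placeholder.

```python
# Minimal sketch (not the author's method): look up the attention an article
# has received via its DOI, using what appears to be Altmetric's free public
# details endpoint. Field names are assumptions - check the live API docs.
import requests


def altmetric_summary(doi):
    """Return a small summary of attention counts for a DOI, or None if untracked."""
    resp = requests.get("https://api.altmetric.com/v1/doi/%s" % doi, timeout=10)
    if resp.status_code == 404:
        return None  # Altmetric has no record of this identifier
    resp.raise_for_status()
    data = resp.json()
    return {
        "score": data.get("score"),                        # composite attention score
        "news": data.get("cited_by_msm_count", 0),         # mainstream media mentions
        "blogs": data.get("cited_by_feeds_count", 0),      # blog mentions
        "tweets": data.get("cited_by_tweeters_count", 0),  # unique tweeters
        "details": data.get("details_url"),                # human-readable detail page
    }


if __name__ == "__main__":
    print(altmetric_summary("10.1000/example-doi"))  # placeholder DOI for illustration
```

The point is simply that the identifier is the hook: without a DOI (or PubMed ID, ISBN, or similar) appearing in the news coverage, there is nothing for trackers like this to latch onto.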


Follow the long tail of your scholarly communications

If you are a researcher working with the media to help disseminate your findings, then it is reasonable to presume you would be interested in how that research is being covered. With many online media platforms, whether blogs or news sites, it is common for an article to be republished elsewhere. If your work is covered on one media platform it might be picked up and published on another, and that second platform may carry more influence than the first. The problem is this: how do you know this has happened if there is no way of tracking back? Of course you might find your work covered on the web by carrying out a search, but that is hardly scientific. By insisting on linked DOIs or similar recognised identifiers you should be able to discover where your news coverage has been republished using tools like Altmetric.com. In addition it allows you to discover how third-party websites may have interpreted your research. You may not be interested in whether your research has been covered in the media, but I guarantee you would be if it was widely misreported.


Question the journalist’s motives

We cannot expect everyone who reads about published research in the media to fully understand what it might mean. That is why the media writes in such a way as to break down the scholarly communication into easier-to-read lay summaries. Yet researchers have to understand that if you work with the media it may report your research in a way that you do not totally agree with. Journalists may focus on one part of your research in particular, they may even be critical of it, and how they form the story may depend on their platform’s agenda, editor or owner. This problem is exacerbated by social media; the general population can now publicly comment on news stories and so potentially perpetuate the bias introduced by inaccuracies in the original news story. The tone and angle applied by a journalist to a news story can potentially be addressed if links to the original research and lay summary are added to the news article.

If a journalist or news site is unwilling to link to your published research then you have to ask the question: why? Are they looking to put their own slant on your work and, if so, are they in a position of expertise to do this? The chances are that most have not thought about adding links or references to your work – they may not appreciate that you, your organisation or funding body might be interested in tracking it for impact. (Of course this leads to other questions around whether you should be talking about your research in the first place, but that is a conversation between you, your manager and funder.) The only way to address this is to ensure that all communications about your research with journalists, bloggers and media organisations come with the caveat that they track back to your published work and that this work has a unique, recognised identifier.


What can researchers do?

Any academic knows that to use another’s work in their own outputs they must cite it in the body text and add a reference pointing readers to this supporting work. Students are taught this as being part and parcel of the process of conducting research. So it should follow that anyone dealing with the media should insist that their work is correctly cited and linked back to once online. Not only does this linking help interested members of the general population find the research for themselves, it also helps peers, research groups and bodies, as well as other journalists and people working in the media.

You may not always be able to control how your research is reported in the media and how the general public talk about it, but you can do more to ensure readers get better access to the actual research. In addition you can do more to ensure that media coverage is picked up by altmetric platforms that will help build a picture of where your research is being discussed. Working with the media is a very valuable and rewarding opportunity to disseminate your research to wider audiences. By adding the checks and balances with links and references, you ensure you get to see the long tail of conversation that takes place afterwards – a conversation that you will also be able to engage with and possibly benefit from.


Note: This article gives the views of the author, and not the position of the LSE Impact Blog, nor of the London School of Economics. Please review our comments policy if you have any concerns on posting a comment below.

Creative Commons Licence
This work is licensed under a Creative Commons Attribution 3.0 Unported License unless otherwise stated.

Wednesday, 10 August 2016

Mendeley Masterclasses - Videos to help you master your references

Andy Tattersall
Andy Tattersall has created a new series of short instructional videos to help you make the most out of using Mendeley. The series of 15 videos is called 'Mendeley Masterclass' and helps users find their way through the various web, desktop and mobile versions. More videos will be added at a later date, but you can see a couple for yourself below. Mendeley is a very worthwhile and simple tool to use and really helps academics and students save time and be organised whilst studying and conducting research. The videos are only a few minutes long, so with a cup of tea and half an hour you could master the software easily. The videos are on YouTube and will be added to the University of Sheffield iTunes U collection later this year.



The Mendeley Masterclass playlist 


Monday, 18 January 2016

Top 10 Articles Featuring Information Resources in 2015 According to Altmetric.com




Andy Tattersall
2015 was a busy year for HEDS researchers. Andy Tattersall looks at the top 10 publications featuring Information Resources staff, according to their Altmetric score.


Altmetrics are alternative indicators of scholarly reach; Altmetric.com creates an altmetric score based on Tweets, Mendeley saves, blog posts, media coverage and Facebook shares, among other indicators. 122 articles were included in the data – HEDS publications mentioned in the last year, though not all published in 2015. According to the altmetric data, IR-related publications were covered in 10 blogs, 22 policy documents and 11 Wikipedia entries, and were the subject of 1253 Tweets and 27 Facebook shares. The data was gathered between 18th January 2015 and 18th January 2016.

Below is the table of the top 10, with Andrew Booth taking top spot with a paper he co-authored in PLOS. The paper was Tweeted 290 times, saved to Mendeley 49 times and to CiteULike 4 times, and blogged about once. Not all publications covered in the complete list were published in 2015, but they were still communicated and shared in 2015, thus showing the long tail of our research.
The full list of research included in the 2015 export can be viewed here:


IR Altmetrics Chart 2015

Top 10 Articles According to Altmetric.com Featuring IR staff



We can also see from the data the publications in which IR staff published the most, with HTA Reports taking top spot. As for journals, Health Information & Libraries Journal published 10 articles.

Top Publications for IR and supported research


Finally, we can see the reach of our research on Twitter: 51% of Tweets came from outside of the UK, with 163 North American Tweeters sharing our research, making up 20% of all IR-related research Tweets in 2015.

Tweets by location for IR-related research
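As a rough illustration of how figures like these can be produced, the sketch below tallies tweets by country from a hypothetical CSV export of mentions (one row per tweet, with a "country" column). The file name and column heading are invented for the example; a real Altmetric Explorer export will look different.

```python
# Minimal sketch with invented column names: summarise the share of tweets by
# country from a mentions export, as a percentage of all tweets in the file.
import csv
from collections import Counter


def tweet_share_by_country(path):
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row.get("country") or "Unknown"] += 1  # missing values grouped as Unknown
    total = sum(counts.values())
    return {country: round(100.0 * n / total, 1) for country, n in counts.most_common()}


if __name__ == "__main__":
    print(tweet_share_by_country("ir_tweets_2015.csv"))  # hypothetical export file
```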








Thursday, 3 September 2015

For what it’s worth – the open peer review landscape

Andy Tattersall
The topic of open peer review has been gaining traction for some time. It has been a slow burner in the academic community for many years and, through the increasing use of social media in research, it continues to polarise academic communities. With open research and open access attracting increasing support, it is inevitable that open peer review will follow that trajectory. How quickly, and with what resistance, is not yet known. Andy Tattersall has just published a paper on open peer review, looking at the main protagonists, in a special issue of Online Information Review focusing on open access.
CC BY 2.0 Tim Morgan


Abstract


The aim of this paper is twofold: firstly, to discuss the current and future issues around post-publication open peer review; secondly, to highlight some of the main protagonists and platforms that encourage open peer review, both pre- and post-publication.

The first part of the paper aims to discuss the facilitators and barriers that will enable or prevent academics engaging with the new and established platforms of scholarly communication and review. These issues are covered with the intention of proposing further dialogue within the academic community that ultimately addresses researchers' concerns, whilst continuing to nurture a progressive approach to scholarly communication and review. The paper then looks at the prominent open post-publication platforms and tools and discusses whether this will become a standard model in the future.

The paper identifies several problems, not exclusive to open peer review, that could inhibit academics from being open with their reviews and comments on others' research, whilst also identifying opportunities to be had by embracing a new era of academic openness.




The paper summarises key platforms and arguments for open peer review and will be of interest to researchers in different disciplines as well as the wider academic community wanting to know more about scholarly communications and measurement.
You can find the paper here.
http://www.emeraldinsight.com/toc/oir/39/5

Thursday, 19 March 2015

4 Questions Researchers Need To Ask Before Using The Web To Communicate Their Research

The idea that all researchers, from early career researchers through to professors, are engaging with the Web is pretty much a fallacy. Sure, they may use the web to search for papers, read conference proceedings and respond to funding calls; but beyond email and calendars that’s really about it for the majority. Everett Rogers’ ‘Diffusion of Innovations’ graph showed that innovators and early adopters make up just a small percentage of those who take up a new idea or technology. So it’s probably quite right that, after a decade of the web moving from a read-only version to a read-write one, we would be where we are right now with regards to scholarly communication and measurement.
Much has already been written on how researchers can use the various social, Web 2.0 and altmetric tools to communicate, collaborate and measure their work. Even so, many academics are still not sure about opening the shutters to a wider audience that transcends their institution. For any researcher contemplating using the many tools out there designed to facilitate open research, there are a few questions you should ask yourself.
Will I respond to comments?
The web today has given anyone connected to it a voice, and that can be anyone in a variety of guises. Thankfully most communication of research appears to be open and mature. That’s not to say there are not unsavoury characters in academia, but trolling happens less in these communities than on, say, 4Chan or YouTube. The way to think of it is that online communities are like real-world ones: there are some neighbourhoods you wouldn’t venture into after dark.
So posting your work online could result in someone commenting on it. These comments might be agreeable or not; the decision you have to make is whether you want to respond to them. The answer often depends on the comment, as someone may have misquoted or misunderstood your post. They may not agree with it for a good reason and could in turn reveal new evidence for your work. It could even lead to a collaboration later on. However, the main thing to remember is not to take it personally, and if it feels personal then look to contact that person outside of a public forum, via email or by telephone. The reason for this is that very few people come out of an online spat looking good. The chances are that this kind of communication is unlikely to happen, although if you are an academic who likes to court controversy, read on.

Am I likely to get into trouble?
University researchers and lecturers have come unstuck using social media as part of their work. Contracts have been cancelled and academics told off for inappropriate comments and language on social media sites such as Facebook and Twitter. It can feel daunting to start using these tools in the knowledge that you could get into hot water over your comments. It’s quite common to see academics use the disclaimer ‘views are my own’ on a profile that links back to their organisation. In the eyes of your institution you are communicating to the world as a representative of them, so that kind of one-liner is unlikely to be a legal defence or to prevent, at best, a telling-off when you Tweet something objectionable about them or their students.
It’s important to remember that your professional and personal profiles can cross over using these tools. There is nothing wrong with talking about your views and interests interspersed with your research; it adds an informal and human touch. There is also the option of keeping the two worlds separate: Facebook for home life, ResearchGate for work. The main thing is to think before you Tweet or write that blog post. There is nothing wrong with stirring debate and proposing left-field ideas; after all, that’s how a lot of research starts. Just remember that, like driving, you shouldn’t drink and Tweet… well, don’t drink too much. What goes on the web stays on the web, and what you think is a private comment is easily shareable by your contacts. Just remember to ‘Tweet Responsibly’.

Do you have the time?
This is a common question, especially as all this communicating and measuring is just extra stuff to do, isn’t it? The reality is that many of the tools emerging around scholarly communication take a lot less time than you think. Take this blog post, for example: I had the idea to write it at a conference and managed to write it in about 50 minutes. Given it’s a topic in my area of interest and expertise, and that I think it could help some people, I regarded it as a good use of that time. Social networking tools can be very time consuming if you let them get unwieldy and disorganised. So using tools like TweetDeck or Hootsuite for your Twitter account, and tools such as Figshare and Altmetric.com to host and measure your outputs, can reduce the labour involved in such tasks. By engaging with these tools you start to see other benefits, such as who is reading and sharing your research. It also creates a personal learning network, which potentially leads to collaborations and the discovery of fresh ideas and research. If you are a researcher very much tied to a set way of working and feel that these tools will disrupt your flow, there are things you can do. As with any distraction such as email, you can set parts of your day aside to engage with these technologies. Also, you can engage with the scholarly community using your smartphone or tablet if you have one, which allows you to work whilst in transit, at a conference or in between meetings. These technologies do not have to rule your work; if used correctly they can enhance it.
Am I sure I can mention my work?
There are times when a researcher cannot mention their work and this can often depend on who is funding it. It may be that the work is funded by a private company or industry and sharing it could be a breach of a research contract. It may be time sensitive or have an embargo on it; there is also the issue of it being incomplete. All of these concerns are quite legitimate, and it is always a good idea to check first if you are unsure. Nevertheless, there is always something in your field of research you can discuss and share across the Web. You may have datasets and publications that are free to be shared and discussed. Even by starting a blog to discuss your topics of interest you are putting a communication out to your research community. It might be just to discuss new research you have discovered or to promote previous work that is now available open access. Not everyone can just plough ahead and do this, and some researchers may seek guidance from a senior academic. It is helpful to see what other colleagues and departments in your institution are doing with regards to scholarly communication. If there are any seminars or training events, try to get along to them as you are sure to find new ways of sharing your research as well as new contacts.
The main thing to remember is that working the way you have always done may bring positive results, but they will be delivered the same way, via publication and conference. It reminds me of a quote from the brilliant Australian spin bowler Shane Warne, when asked about the England bowler Monty Panesar's record. Warne replied that Monty was a one-dimensional player with no variation, saying: “Monty Panesar hasn’t played 33 Tests, he’s played 1 Test 33 times”. By entertaining some of these tools you could discover new ways of working and communicating your research, and snaffle a few extra wickets in the process.

This blog post was originally published on the Digital Science Blog
http://www.digital-science.com/blog/guest/4-questions-researchers-need-to-ask-before-using-the-web-to-communicate-their-research/