Showing posts with label HEDs. Show all posts

Thursday, 7 March 2019

HEDS Up (and out)

You may notice that this blog has been quiet for a while, and that's because the IR team has decided that from now on we'll be posting on the HEDS blog instead.

HEDS (Health Economics and Decision Science) is the largest section of ScHARR, and the one in which the Information Resources team is located (though we continue to work with the other sections of the School as well).

We're still the same team, interested in the same things: information retrieval, research impact, review methodology and, of course, libraries - but from now on we'll be blogging about them in the same place as the rest of our HEDS colleagues.

For a taste of the HEDS blog, why not check out this recent post by IR's Anthea Sutton on search strategies for identifying systematic review tools?

Thanks for all your interest in the ScHARR Information Resources blog since its inception way back in 2007(!) - the archive will remain here for the foreseeable future should you wish to access any older posts, but for the latest news from the team we hope you'll continue to follow us in our new home.


Thursday, 16 February 2017

Gone In 60 Seconds

Mark Clowes
Mark Clowes braves the tough crowd that is the HEDS Section Meeting to give a one-minute presentation about his work.

One of the stranger things about working in an academic school rather than a traditional library is the departmental meeting, where you can find yourself sitting alongside people whose jobs have very little in common with your own.

The ScHARR Information Resources team sit within the Health Economics and Decision Science section of ScHARR, surrounded by systematic reviewers, economists, modellers and statisticians. To help these different professional groups understand each other better, section meetings begin with quickfire one-minute presentations from each group, known as "Gone In 60 Seconds".

As a Nicolas Cage fan (I even watch his really bad films, and God knows there are plenty) - and, not least, because it was my turn - I agreed to take part. But what would be "gone" in those 60 seconds?  My career prospects?  My credibility with colleagues (if I ever had any)?  Would I be challenged for hesitation, repetition or deviation?

I enjoy giving presentations and don't usually get too nervous; but the audience for this one (and the timing, with the expiry date of my contract approaching) made me particularly keen to impress. 

I decided to give a first airing to a topic I hope to present at one or two conferences in the summer: using a text-mining and data visualisation app (VOS Viewer) to deal with the large number of references retrieved by a systematic review. I chose this topic to demonstrate to colleagues that IR staff are continuously experimenting with new technology and ways of working, and because - since it has the potential to influence the scope of future review projects - it would have relevance to all the different groups in the room. An added bonus was that I could display some pretty images of the "heat maps" produced by VOS Viewer on the screen, which would take the audience's eyes off me.

The short format required more preparation than usual - generally I don't like to work from a script, preferring to maintain a conversational tone and improvise around bullet points - but my initial attempts to do so on this topic ran significantly over time.

In the end, I realised I was going to have to write out what I wanted to say in full - initially using free writing with pen and paper, then gradually refining and paring it down until I could beat the kitchen timer countdown (this was one of those tasks I could only have done working at home - colleagues would think I had lost the plot walking around reciting the same presentation over and over again).

I didn't want it to be a dry, technical presentation (in any case, there wasn't enough time to explain in depth how the software worked), so instead I came at it from the angle of "why is this useful?" - i.e. for dealing with the common problem of a set of references too large to sift in the traditional way, but potentially too important to ignore.

On the day, I think it went pretty well - people seemed engaged with what I was saying, although a slight technical hitch with my slides meant that I didn't quite manage my closing sentence before I (5...) was (4....) ruthlessly (3...) cut (2...) off (1...)


Monday, 31 October 2016

ScHARR Information Resources Group at ISPOR European Congress

Image credit: Viennascape by Nic Piégsa. This work is licensed under a CC BY 2.0 license.

This week, ScHARR Information Resources Group will be represented at the ISPOR 19th European Annual Congress by Suzy Paisley (Director of Innovation and Knowledge Transfer) and Anthea Sutton (Information Resources Group Manager).

Along with other members of the HEDS team, Suzy and Anthea are contributing to the busy conference programme with a poster presentation each. Suzy's poster is on "Identifying Early Biomarkers of Acute Myocardial Infarction in the Biomedical Literature: A Comparison of Text Mining and Manual Sifting Techniques", while Anthea will be representing the Systematic Review Toolbox, in collaboration with Chris Marshall from YHEC. Both posters can be viewed between 08:45 and 14:15 on Monday 31st October, with an author discussion hour at 13:15 - do come along and find out about this work.

For the remainder of the congress, Suzy and Anthea can be found, along with the rest of the HEDS expert team, at exhibition stand no. 20/21. This is a great opportunity to find out about working with ScHARR to support key strategic developments in your organisation, collaborating with us on research, and participating in our diverse, world-class learning and teaching programmes.

The full ScHARR HEDS congress activity can be found here. We'll be tweeting from @ScHARR_IKT during the congress, so please follow us there.

Wednesday, 12 October 2016

The number behind the number: suggesting a truer measure of academic impact

A close colleague and friend of Information Resources, Dr Chris Carroll, has just published a paper on the limitations of citations, in particular the single counting of them within a paper's reference list. Chris wrote about his paper on the excellent LSE Impact Blog and we thought it would be of interest to visitors of the Information Resources Blog, so we have republished it under their Creative Commons licence. Dr Carroll is a member of HEDS and previously worked as an information specialist as part of our team.
The limitations of simple ‘citation count’ figures are well-known. Chris Carroll argues that the impact of an academic research paper might be better measured by counting the number of times it is cited within citing publications rather than by simply measuring if it has been cited or not. Three or more citations of the key paper arguably represent a rather different level of impact than a single citation. By looking for this easily generated number, every researcher can quickly gain a greater insight into the impact (or not) of their published work.
The academic research and policy agenda increasingly seeks to measure and use ‘impact’ as a means of determining the value of different items of published research. There is much debate, however, about how best to define and quantify impact, and any assessment must take into account influence beyond the limited bounds of academia, in areas such as public policy. Within academia, though, it is generally accepted that the number of times a paper is cited, the so-called ‘citation count’ or ‘citation score’, offers the most easily measured guide to its impact. The underlying assumption is that the cited work has influenced the citing work in some way, but this metric is also viewed as practically and conceptually limited. The criticism is loud and frequent: such citation scores do not tell us how a piece of research has actually been used in practice, only that it is known and cited.
Image credit: beyond measure by frankieleon. This work is licensed under a CC BY 2.0 license.
Is there a better solution?
So, the obvious answer is to see if there is an easy way to measure how a paper has been used by its citing publications, rather than simply recording whether it is cited (the standard ‘citation score’). I did this by using one of my own frequently-cited papers as a case study and applying a basic metric: impact of the case study paper was considered to be ‘high’ if it was cited three or more times within the citing publication; ‘moderate’ if cited twice; and ‘low’ if cited only once.
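The thresholds above are simple enough to express in a few lines of code. As a sketch only - the thresholds come from the post, but the function name and sample counts below are illustrative, not taken from the paper:

```python
def citation_profile(within_publication_citations):
    """Classify the impact of a paper on one citing publication by how
    many times it is cited within that publication (the metric proposed
    in the post: 3+ = high, 2 = moderate, 1 = low)."""
    if within_publication_citations >= 3:
        return "high"
    elif within_publication_citations == 2:
        return "moderate"
    return "low"

# Hypothetical within-publication citation counts for six citing papers
counts = [1, 1, 3, 2, 5, 1]

# Tally how many citing publications fall into each impact band
profile = {}
for c in counts:
    label = citation_profile(c)
    profile[label] = profile.get(label, 0) + 1

print(profile)  # {'low': 3, 'high': 2, 'moderate': 1}
```

Applied to a real citation export, the same tally gives the 'citation profile' of a paper in one pass over its citing publications.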
The case study paper had 393 unique citations by November 2015: 59% (230/393) of publications cited the paper only once, suggesting its impact on these publications was low; 17% (65/393) cited it twice, suggesting moderate impact; but 25% (98/393) cited it three or more times, suggesting that this paper was having a genuine influence on those studies. The citation frequency within this ‘high impact’ group of publications ranged from three to as many as 14 in peer-reviewed studies and 17 in academic dissertations (with their longer word count). Primary research studies published in peer-reviewed journals were the principal publication type across all levels of impact.
I also noted where these single or multiple citations appeared within these publications. Single citations tended to appear only in the introduction or background sections of papers. These instances appeared to be simple citations ‘in passing’, the necessary ‘nod’ to existing literature at the start of a publication. However, when there were three or more citations, they tended to appear across two or more sections of a paper, especially in the methods and discussions, which suggests some real influence on the justification or design of a study, or the interpretation of its results. These findings on the location of the citations confirmed that the number of times a paper was cited within a publication really was a good indicator of that paper’s impact.
Pros and cons
A metric based on within-publication citation frequency is unambiguous and thus capable of accuracy. The assessment of where the citations appear in the citing publications also provided contextual information on the nature of the citation. In this case, it confirmed the viability of within-publication citation frequency as an impact metric. The approach is transparent and data can be verified and updated. The proposed metric, the ‘citation profile’, is not a measure of research quality, nor does it seek to measure other forms of impact or to address issues such as negative findings producing relatively lower citation scores. It merely seeks to contribute to the debate about how citation metrics might be used to give a more contextually robust picture of a paper’s academic impact.
Of course, even this ‘citation profile’ is not without its problems, but it is arguably less of a “damned lie” or “statistic” than some other metrics: only a more in-depth analysis of each citing publication can more accurately gauge a paper’s influence on another piece of research or a policy document. Yes, there are also questions concerning this metric’s generalisability: as with other bibliometrics, publications in different disciplines are also likely to have different citation profiles. However, these issues can be easily addressed by future research, given the simplicity of the metric.
If academics or funders want to improve their understanding of how research is used, then this easy-to-generate metric can add depth and value to the basic, much-maligned, and ‘damned’ ‘citation score’.
This blog post is based on the author’s article, ‘Measuring academic research impact: creating a citation profile using the conceptual framework for implementation fidelity as a case study’, published in Scientometrics (DOI: 10.1007/s11192-016-2085-0).
Note: This article gives the views of the author(s), and not the position of the LSE Impact Blog, nor of the London School of Economics.
About the author
Chris Carroll is a Reader in Systematic Review and Evidence Synthesis at the University of Sheffield. His role principally involves synthesising published data to help inform policymaking, particularly in the fields of medicine and health, as well as the development of methods to conduct such work.

Friday, 5 November 2010

HEDS Get Blogging!


Posted by Andy

For those unaware, the ScHARR Library is nicely nestled within a much larger organism by the name of Health Economics and Decision Science (HEDS). With the aid of my good self, Professor Simon Dixon has launched HEDS' first venture into the realm of blogging, with a site that has regular interesting posts about what we do - none of which extends beyond 200 words. So no long reams of text, it's all straight to the point, and considering how the average attention span has shrunk to a.........you've stopped reading this haven't you?

More about HEDS
Health Economics and Decision Science (HEDS) is part of the School of Health and Related Research (ScHARR) at the University of Sheffield. It comprises around 80 research staff with expertise in economics, mathematical modelling, systematic reviewing, Bayesian statistics and information sciences. HEDS is built around research funding from several national sources and consultancy for a range of NHS and pharmaceutical organisations. It runs one Masters degree and its staff teach modules on several others. In the 2008 UK Research Assessment Exercise, ScHARR was ranked number 1 for research power.


Thursday, 7 January 2010

The World's Biggest Snow Man




Photo and posted by Andy

OK, maybe not the world's biggest, but certainly the biggest in central Sheffield. This overweight chappie (I presume it's a man; they usually are) was created by half a dozen HEDS researchers who decided to show off their creative side in their lunch break.
ScHARR's latest recruit will no doubt be with us for the next few days whilst this cold snap continues, or until the porters remove him from the building.


Tuesday, 12 May 2009

Using Web2.0 to Aid Research and Collaboration

Photo by blogefl
Posted by Andy

The next HEDS lunchtime seminar is on Wednesday 13th May, 12:30-1:30, in Lecture Rooms 1 & 2. Andy Tattersall will be presenting on "Using Web2.0 to Aid Research and Collaboration." The seminar is part of the library open day, which is open to all ScHARR staff and PhD students.

What is Web2.0?


Wednesday, 7 January 2009

HEDS Seminar Programme Spring/Summer 2009


Photo by span
Posted by Andy

The HEDS Seminar Programme for Spring and Summer 2009 has been released and is listed below. The details are:

Thursday 12 February 2009, 12.30 - 1.30pm
Sean Sullivan, University of Washington
Title TBA
Lecture rooms 1 and 2, Regent Court

Thursday 19 March 2009, 12.30 - 1.30pm
Michael Pidd, Lancaster University
DGHPSim: smart simulation approaches for investigating healthcare policy as well as developing better practice
Lecture rooms 1 and 2, Regent Court

Thursday 23 April 2009, 12.30 - 1.30pm
Mila Petrova, University of Warwick
Searching electronic databases for publications on health-related values
Lecture rooms 1 and 2, Regent Court

Thursday 21 May 2009, 12.30 - 1.30pm
Dr Ruth Garside, PenTAG, Universities of Exeter and Plymouth
Title TBA
Lecture rooms 1 and 2, Regent Court

Thursday 18 June 2009, 12.30 - 1.30pm
Andrew Jones, University of York
Title TBA (joint HEDS and Economics seminar)
Lecture rooms 1 and 2, Regent Court

Thursday 16 July 2009, 12.30 - 1.30pm
Amanda Burls, University of Oxford
The ethics of prioritisation in HTA
Lecture rooms 1 and 2, Regent Court



For further information please contact Donna Rowen D.Rowen@sheffield.ac.uk or John Brazier J.E.Brazier@sheffield.ac.uk

Monday, 3 November 2008

Information Resources Join HEDS

Photo by roland
Posted by Andy


As part of a University-wide reorganisation, Information Resources (that's us) has merged into our fellow ScHARR section, HEDS (Health Economics and Decision Science). You'll still find us doing the same work in the usual places, as well as a few new ones over the next few months. These are exciting times for us as we take the good ship IR and head off to fresh waters with a whole new bunch of wonderful people. Rumours that there was a transfer fee involved have not yet been clarified, although I'm still hoping for a sponsorship deal with CILIP.



For those of you unaware of who Information Resources and HEDS are, here's the lowdown.


Information Resources

Since 1994, Information Resources has established itself as a key national player in providing information support to health technology assessment and health services research. This has been complemented by our regional role in delivering information skills training to NHS staff and health librarians through the RDSU Information Service @ ScHARR Library.


HEDS

The purpose of HEDS is to promote excellence in national and international health care resource allocation decisions, through applied and theoretical research funded by the public and private sector; and supporting the effective implementation of the results of such research through education, training and management interventions. With a broad portfolio of specialities, HEDS makes major contributions in many areas, including the valuation of health, the analysis of health policy, welfare and equity, technology appraisal, trial-based economic evaluation, and econometrics.

Monday, 15 September 2008

HEDS seminar programme for Autumn 2008

Photo by hiddedevries
Posted by Andy

Thursday 9th October, 12.30-2pm
Benjamin Craig (University of South Florida) and Jan J. V. Busschbach (Erasmus MC and Viersprong Institute for Studies on Personality Disorders)
Removing Bias from United Kingdom Values of EQ-5D States
Lecture rooms 1 & 2, Regent Court

Thursday 20th November, 12.30-2pm
Graham Mowatt, University of Aberdeen
Indirect and mixed treatment comparisons in NICE Interventional Procedures Reviews
Lecture rooms 1 & 2, Regent Court

Thursday 4th December, 12.30-2pm
Richard Edlin, University of Leeds
Title TBA
Lecture rooms 1 & 2, Regent Court

Thursday 11th December, 12.30-2pm
Raymond Pawson, University of Leeds
Reducing plague by drowning witches: On the importance of understanding and evaluating the mechanisms that generate behavioural change in public health interventions
Room F41, Hicks Building

For further information please contact Donna Rowen D.Rowen@sheffield.ac.uk or John Brazier J.E.Brazier@sheffield.ac.uk

Friday, 11 July 2008

HEDS Seminars

Photo by Charlie Dave
Posted by Andy

The remaining HEDS seminars for this semester:


Joint HEDS and Economics seminar
Speaker: Nigel Rice, University of York
Title: Does health care spending improve health outcomes? Evidence from English Programme Budgeting data
Date: Thursday 17th July, 12.30-1.30pm
Venue: Lecture rooms 1 and 2, Regent Court

Speaker: Ignacio Abasolo, Universidad de la Laguna, Tenerife
Title: To be confirmed
Date: Thursday 31st July, 12.30-1.30pm
Venue: Lecture rooms 1 and 2, Regent Court


Tea, coffee and biscuits will be provided.

Thursday, 13 September 2007

Health Economics and Decision Science - Seminar Programme Autumn 2007


photo by jobonipc

You can tell when it's the end of the summer when the football season starts, party political conferences appear and the Health Economics and Decision Science Autumn Seminar Programme returns.

We have some great speakers and encourage you to attend.

Karl Claxton, University of York
PPRS is dead: long live value based pricing!
Thursday 20th September, 12.30-1.30pm
Lecture theatres 1 and 2, Regent Court

Hugh Gravelle, University of York (joint seminar with Economics)
Wednesday 17th October, 4.00-5.30pm
Room 118, Management Building

Joanna Coast, University of Birmingham
Assessing capability in health care: philosophy & methodology
Thursday 15th November, 12.30-1.30pm
Lecture theatres 1 and 2, Regent Court

For further information please contact Donna Rowen d.rowen@sheffield.ac.uk or John Brazier j.e.brazier@sheffield.ac.uk.