Thursday, 8 December 2016

Identifying evidence for economic models - notes from a workshop

Mark Clowes
Mark Clowes recently attended one of ScHARR's short courses, a one-day workshop entitled "Identification and review of evidence to inform cost-effectiveness models".

I've only been working in health technology assessment for 18 months, but the big difference from previous roles I've held is that I'm often working alongside colleagues from very different professional backgrounds. I hoped this course would help me to understand their work better and see where my contribution fits in.

The participants came from a wide variety of backgrounds: technology assessment centres, universities, pharmaceutical companies and private consultancies.

Suzy Paisley explained how during her time at ScHARR she had progressed from searching for clearly focussed systematic review questions of clinical effectiveness to the infinitely more complicated universe of economic models. Models typically involve a wide range of different parameters in an attempt to reflect the complexities of real life, and identifying evidence for them is therefore much less "black and white". Rather than being informed by a single comprehensive search strategy designed to find all the evidence, a model is likely to draw on many different types of evidence from different perspectives; and while a systematic approach is still required, it would be impossible for an information specialist to find (or a modeller to use) ALL the evidence. Instead, transparent judgements should be made about what is included or excluded; there is no perfect model, but a "good" model will be explicit about the choices, decisions and sources of evidence which have informed it.

Paul Tappenden gave us an introduction to modelling, beginning by quoting George Box's famous maxim that "all models are wrong... but some are useful". He argued that any model should begin with a conceptual stage, at which decisions are made about the disease logic model (the "natural history" of the disease, taking into account factors such as the likelihood of progression and different risk groups); the service pathway model (the patient's journey through different stages of treatment, which may be subject to geographical variation); and the design-oriented model (what type of model will best address the decision problem? This may also be influenced by the availability of evidence and the previous experience of the modeller).
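To make that design-oriented stage slightly more concrete, here is a minimal sketch of the kind of simple state-transition (Markov) cohort model such a decision often points towards. It is not from the workshop, and every number in it (the states, transition probabilities, costs, utilities and discount rate) is an invented placeholder purely for illustration:

    # Minimal three-state Markov cohort model (Well -> Progressed -> Dead).
    # All figures below are invented placeholders, not values from the workshop.
    import numpy as np

    states = ["Well", "Progressed", "Dead"]  # row/column labels for the matrix

    # Annual transition probabilities (rows = from, columns = to); each row sums to 1.
    transition = np.array([
        [0.85, 0.10, 0.05],   # from Well
        [0.00, 0.70, 0.30],   # from Progressed
        [0.00, 0.00, 1.00],   # from Dead (absorbing state)
    ])

    annual_cost = np.array([1_000.0, 5_000.0, 0.0])   # cost per person-year in each state
    utility     = np.array([0.80, 0.50, 0.0])         # quality-of-life weight for each state
    discount    = 0.035                               # annual discount rate

    cohort = np.array([1.0, 0.0, 0.0])  # the whole cohort starts in the Well state
    total_cost = total_qalys = 0.0

    for year in range(1, 21):            # run the model for 20 annual cycles
        cohort = cohort @ transition     # move the cohort between states
        df = 1 / (1 + discount) ** year  # discount factor for this cycle
        total_cost  += df * (cohort @ annual_cost)
        total_qalys += df * (cohort @ utility)

    print(f"Discounted cost per patient:  {total_cost:,.0f}")
    print(f"Discounted QALYs per patient: {total_qalys:.2f}")

The point is the structure rather than the numbers: once the states and transitions have been agreed at the conceptual stage, each probability, cost and utility becomes a parameter for which evidence has to be identified.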

A group exercise in which we attempted some conceptual modelling around a topic quickly made us realise the complexity involved. In order to calculate whether a fictional drug was cost effective, we would need a wealth of information: not only the obvious (evidence of its clinical effectiveness and cost), but also information on its possible adverse effects, to be weighed against quality-of-life studies of patients living with the condition; information on resource use (the cost of administering comparator treatments or best supportive care); and mortality data (indicating for how many years the treatment would be likely to be provided).
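Pulling that wealth of information together ultimately feeds a fairly simple piece of arithmetic: the incremental cost-effectiveness ratio (ICER), i.e. the extra cost of the new drug divided by the extra health benefit (usually quality-adjusted life years, QALYs) it delivers compared with the comparator. A minimal sketch, again with entirely made-up figures:

    # Incremental cost-effectiveness ratio (ICER) with purely illustrative numbers.
    cost_new, qalys_new = 42_000.0, 3.1   # new drug: lifetime cost and QALYs per patient
    cost_old, qalys_old = 18_000.0, 2.5   # comparator / best supportive care

    icer = (cost_new - cost_old) / (qalys_new - qalys_old)
    print(f"ICER: {icer:,.0f} per QALY gained")   # here, 40,000 per QALY
    # NICE's commonly cited threshold is roughly £20,000-£30,000 per QALY,
    # so on these made-up numbers the drug would not look cost effective.

The difficulty, of course, lies not in the division but in finding defensible evidence for every number that goes into it.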

Some of these data are unlikely to be found in traditional trials, and so over lunchtime we were given worksheets to explore alternative sources (including disease registries, statistics and official publications, and our own ScHARR-HUD database for studies around health utilities).

The most challenging part of searching for evidence for economic models may be deciding when to stop. How much evidence is enough, and how comprehensive do you need to be when you may have to conduct multiple miniature reviews to answer one main question? I know from personal experience of critiquing the searches run to inform manufacturers' economic models submitted for NICE appraisal how contentious this topic can be, but in a recent paper for the journal PharmacoEconomics, Suzy has attempted to define a "minimum requirement" for this type of search.

The final session of the day came from Prof. Eva Kaltenthaler, who heads the Technology Assessment Group at ScHARR. Eva helped us understand how reviewers make judgements about which identified studies to include. Frequently there is a tension between researchers' desire to be thorough and comprehensive in their coverage, and the needs of the decision makers who commission the review for the results to be delivered in a short time-frame. Where this is the case, rapid review methods may be called for. This might mean prioritising certain selection criteria over others, although which are deemed most important will depend on the context. In some cases the geographical setting of retrieved studies may determine how relevant they are; in others the study type, or the cohort size.

Overall this was a useful and thought-provoking day, although for any librarians/information specialists who thought they had already mastered comprehensive searching, there was some "troublesome knowledge" to take on board. As we work more closely alongside researchers we understand better that they don't want to be overwhelmed with mountains of evidence; they want to ensure all perspectives are covered, but to avoid wasting time on studies which make no difference to the final decision. How information specialists can best support this information need remains a challenging question. Will the boundaries become blurred between our role in finding information and that of reviewers in sifting and evaluating it? Are those of us without a previous background in medicine, economics or statistics (and let's be honest, very few of us are knowledgeable about all three) able to acquire sufficient skills in those disciplines to succeed in these shifting roles?


*NEWSFLASH* This course will be running again on 23rd March 2017 - find out more / book a place or see other short courses available from ScHARR.


Read Suzy Paisley's PharmacoEconomics article (2016): "Identification of Evidence for Key Parameters in Decision-Analytic Models of Cost Effectiveness: A Description of Sources and a Recommended Minimum Search Requirement"

