EXPERT ELICITATION: CAN VARIATION IN ELICITED DISTRIBUTIONS BE EXPLAINED BY EXPERT HETEROGENEITY?

Author(s)

Jankovic D
University of York, York, UK

OBJECTIVES: Expert elicitation has been proposed as a method for quantifying uncertainty in decision models when evidence from studies is unavailable. Elicitation refers to a formal, structured process of extracting experts’ beliefs in probabilistic form. Elicited distributions can vary depending on who is asked and how they are asked. The credibility of elicited distributions relies on rigorous processes and transparency, yet guidelines on how to select appropriate experts are vague. This paper explores how the perception of uncertainty varies between experts and how the choice of expert can affect estimates of uncertainty.

METHODS: A literature review was conducted to identify measurable indicators of expertise. Beliefs about a number of parameters were then elicited from a heterogeneous sample of experts to explore how the identified indicators of expertise affect experts’ beliefs. Two aspects of their distributions were compared: confidence, and the consistency of elicited distributions with those observed in a clinical trial unknown to the experts.

RESULTS: Years of experience, numeracy, adaptive thinking scores, research experience and time spent with relevant patients were identified as potential indicators of expertise. Results of the elicitation exercise suggested that experts with limited quantitative skills were more likely to provide implausible distributions (e.g. a mode outside the stated range). Experts with no research experience were more prone to overconfidence than experts with research experience. Determinants of consistency with trial results differed when eliciting parameters that experts had observed, compared with those that required predicting unknown quantities.

CONCLUSIONS: Variation in elicited distributions could potentially be explained by expert heterogeneity. The effect of heterogeneity depends on the type of parameter. The optimal method of eliciting quantities should complement the experts involved. Teaching the skills required to accurately express opinion in probabilistic form may not always be feasible; when this is the case, experts with existing quantitative skills may be preferred.
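To make the ideas of "expressing beliefs in probabilistic form" and "overconfidence relative to a trial" concrete, the sketch below fits a distribution to an expert's elicited percentiles and flags whether an observed trial value falls inside the expert's central 95% interval. This is a minimal illustration, not the method used in the abstract: the function names, the choice of 5th/50th/95th percentiles, and the symmetric normal approximation are all assumptions introduced here for illustration.

```python
import math

Z90 = 1.6448536269514722   # standard normal 95th percentile (for a 90% central range)
Z95 = 1.959963984540054    # standard normal 97.5th percentile (for a 95% central range)

def fit_normal(p5, p50, p95):
    """Fit a normal distribution to elicited 5th/50th/95th percentiles.

    Hypothetical simplification: assumes the expert's beliefs are
    symmetric, so the median is the mean and the 5th-95th spread
    determines the standard deviation.
    """
    mu = p50
    sigma = (p95 - p5) / (2 * Z90)
    return mu, sigma

def consistent_with_trial(mu, sigma, observed):
    """Crude consistency/overconfidence flag: does the observed trial
    value lie inside the expert's central 95% interval?"""
    return abs(observed - mu) <= Z95 * sigma

# Example: an expert states a response rate is probably between
# 0.20 and 0.50, with a median of 0.35; the (blinded) trial observed 0.30.
mu, sigma = fit_normal(0.20, 0.35, 0.50)
print(consistent_with_trial(mu, sigma, 0.30))  # trial value inside the interval
print(consistent_with_trial(mu, sigma, 0.60))  # far outside: overconfident expert
```

A narrower elicited range (smaller sigma) makes the consistency check harder to pass, which is one way an overconfident expert's distribution would fail against trial data.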

Conference/Value in Health Info

2016-10, ISPOR Europe 2016, Vienna, Austria

Value in Health, Vol. 19, No. 7 (November 2016)

Code

PRM74

Topic

Methodological & Statistical Research, Real World Data & Information Systems, Study Approaches

Topic Subcategory

Modeling and simulation, Reproducibility & Replicability

Disease

Multiple Diseases
