Stated preference methods rely on responses to hypothetical questions. An enduring question is the extent to which individuals behave in reality as they state they would in hypothetical tasks.

Our research in this area uses both field and laboratory experiments to assess hypothetical bias in stated preference responses and the potential of ex-ante and ex-post bias corrections.

Completed Projects

Demand revelation in a multi-attribute discrete choice task

The efficient provision of healthcare requires information about the costs and benefits of interventions. While many costs and benefits can be measured using market prices, many healthcare interventions are not traded in markets and consequently have no market price. In such cases, stated preference methods, such as discrete choice experiments (DCEs), may be used to estimate the benefits.

Few studies have compared hypothetical DCE responses with equivalent real choices. Even when hypothetical and real choices are compared, it is impossible to conclude that choice differences result from the question type (hypothetical or real) rather than from preference differences across the two groups. Techniques developed in experimental economics can disentangle the influence of choice context from individuals' preferences. Using induced value experiments, this project compared hypothetical and real DCE responses, estimated the magnitude of hypothetical bias, and considered whether models of bounded rationality better explain responses to DCEs.
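In an induced value experiment, the researcher (rather than the market) assigns a monetary payoff to each attribute level, so every choice task has a known payoff-maximising alternative and each response can be scored as correct or incorrect. The Python sketch below illustrates this scoring logic; the payoffs, attribute names and task are illustrative assumptions, not the experiment's actual design.

    # Illustrative induced values: an alternative's worth to the participant
    # is the sum of its attribute-level payoffs minus its price, so each
    # task has a known payoff-maximising ("correct") option.
    LEVEL_PAYOFF = {"attr_a": {"low": 1.0, "high": 4.0},
                    "attr_b": {"low": 0.5, "high": 2.5}}

    def induced_value(alt):
        levels = sum(LEVEL_PAYOFF[k][v] for k, v in alt["levels"].items())
        return levels - alt["price"]

    # One hypothetical two-alternative choice task.
    task = [{"levels": {"attr_a": "high", "attr_b": "low"}, "price": 2.0},
            {"levels": {"attr_a": "low", "attr_b": "high"}, "price": 1.5}]

    values = [induced_value(alt) for alt in task]
    optimal = values.index(max(values))

    # Demand revelation can then be measured as the share of tasks in which
    # the observed choice matches the payoff-maximising alternative.
    observed_choice = 0  # illustrative response
    print(values, "correct" if observed_choice == optimal else "error")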

Our results indicate that DCEs do not reliably measure individuals’ true valuation of the good. We find little evidence that individuals make different choices in hypothetical and incentivised settings. We find that choice complexity affects individuals’ ability to make correct choices and that individuals learn as they complete the task.

Outcome and Translation

Our results imply that researchers must take choice complexity and learning into account when designing stated preference studies.

HERU researchers involved in this research project: Verity Watson

External collaborators: S Luchini (GREQAM, Marseille)


Do participants understand health economics surveys?

Debriefing questions can assess whether respondents understand discrete choice experiments (DCEs) and are answering in a way consistent with theories of decision making and utility maximisation. However, there is limited literature on how often debriefing questions are included in health economics studies or how their results are used. We conducted an online survey of authors of published health DCEs, asking about their use of debriefing questions, including frequency, type and analysis. We descriptively analysed the sample characteristics and responses.

Outcome and Translation

These results suggest that while over half of researchers conducting health DCEs use debriefing questions, many do not analyse, use or report the responses. Given the additional respondent burden, there is a need for reliable and valid debriefing questions. In the meantime, the inclusion, analysis, and reporting of debriefing questions should be carefully considered prior to DCE implementation.

HERU researchers involved in this research project: Verity Watson

External collaborators: A Pearce (University of Sydney); B Mulhern and R Viney (CHERE, University of Technology Sydney)


Does an oath improve demand revelation in discrete choice experiments?

This project considered the potential of ex-ante corrections to improve demand revelation and reduce hypothetical bias in discrete choice experiments (DCEs). Theories of social psychology emphasise that social context is important when individuals are asked to value non-market goods. We used an induced value experiment to investigate demand revelation and hypothetical bias in DCE responses. This project drew on the existing research of Drs Luchini and Jacquemet and Professor Shogren by asking participants to complete one of three oaths before taking part in the experiment. Commitment theory suggests that this should increase demand revelation and reduce hypothetical bias.

Outcome and Translation

We find that an oath improves the reliability of responses to hypothetical DCE-type tasks, and that oaths targeting honesty are more effective than those targeting effort.

HERU researchers involved in this research project: Verity Watson

External collaborators: N Jacquemet (University of Paris); S Luchini (GREQAM, Marseille) and J Shogren (University of Wyoming)


External validity of contingent valuation: a field experiment comparing hypothetical and real payments

Whilst willingness to pay (WTP) is increasingly used in economics to value benefits, questions remain concerning its external validity: do hypothetical responses match actual responses? We present results from two within-sample field tests (Thailand and Scotland). Our Thailand experiment suggests that whilst hypothetical 'no' responses are always a 'no', hypothetical 'yes' responses exceed actual 'yes' responses. Certainty calibrations (verbal and numerical response scales) minimise hypothetical-actual discrepancies, offering a useful solution.
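A common numerical calibration recodes a hypothetical 'yes' as a 'yes' only when the respondent's stated certainty reaches a cut-off. Below is a minimal Python sketch of the recoding; the data, column names and cut-off are illustrative assumptions chosen purely for exposition.

    import pandas as pd

    # Illustrative matched data: hypothetical WTP responses, a 1-10
    # numerical certainty rating, and the corresponding real-payment responses.
    df = pd.DataFrame({
        "hypothetical_yes": [1, 1, 1, 0, 1],
        "certainty":        [9, 4, 8, 6, 3],
        "actual_yes":       [1, 0, 1, 0, 0],
    })

    CUTOFF = 8  # assumed threshold; studies typically test several values

    # Keep a hypothetical 'yes' only when certainty >= CUTOFF.
    df["calibrated_yes"] = ((df["hypothetical_yes"] == 1)
                            & (df["certainty"] >= CUTOFF)).astype(int)

    # Compare raw, calibrated and actual 'yes' rates.
    print(df[["hypothetical_yes", "calibrated_yes", "actual_yes"]].mean())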

Failure to adjust may overstate monetary measures of value and lead to inaccurate policy recommendations. A follow-up study in Scotland is currently using qualitative methods to investigate why people do not behave in reality as they state in contingent valuation surveys.

HERU researchers involved in this research project: Mandy Ryan, Sebastian Heidenreich

External collaborators: J Cairns (Department of Health Services Research and Policy, London School of Hygiene and Tropical Medicine); S Jareinpituk (Department of Dental Public Health, Mahidol University); E Mentzakis (Economics Department, University of Southampton); D Glynn (University of York) and C Bond (Academic Primary Care, University of Aberdeen)


Task complexity and response certainty in discrete choice experiments

Discrete choice experiment (DCE) design emphasises utility balance across choice task alternatives, but this increases task complexity and respondents' cognitive burden. A consequence of task complexity is response error, which increases response variability and decreases statistical efficiency. This study explores the behavioural and statistical links between utility balance and cognitive burden by examining the relationship between respondents' stated certainty about their DCE responses and the statistical precision of the econometric model.

Outcome and Translation

We find that increases in choice task utility balance decrease response certainty, and that re-weighting the regression to favour respondents who are more uncertain of their choices increases the statistical precision of the econometric model.
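One way to implement such re-weighting is to scale each observation's likelihood contribution by a weight built from the stated certainty rating. The Python sketch below shows the mechanics on simulated data using a binary logit; the variable names, weighting scheme and data are illustrative assumptions rather than the study's actual specification.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 500

    # Simulated pairwise choice data: attribute differences between the two
    # alternatives plus a 0-10 stated certainty rating (all illustrative).
    df = pd.DataFrame({
        "d_cost": rng.normal(size=n),
        "d_benefit": rng.normal(size=n),
        "certainty": rng.integers(0, 11, size=n),
    })
    utility = -1.0 * df["d_cost"] + 1.5 * df["d_benefit"]
    df["choice"] = (utility + rng.logistic(size=n) > 0).astype(int)

    X = sm.add_constant(df[["d_cost", "d_benefit"]])
    weights = (df["certainty"] + 1) / 11.0  # rescale ratings to (0, 1]

    # Unweighted logit versus a logit whose likelihood contributions are
    # scaled by the certainty-based weights.
    unweighted = sm.GLM(df["choice"], X, family=sm.families.Binomial()).fit()
    weighted = sm.GLM(df["choice"], X, family=sm.families.Binomial(),
                      var_weights=weights).fit()
    print(unweighted.bse, weighted.bse, sep="\n")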

HERU researchers involved in this research project: Verity Watson

External collaborators: H Burnett, W Ungar (The Hospital for Sick Children, Toronto, Canada) and D Regier (University of British Columbia, Canada)


PhD: Assessment of the external validity of discrete choice experiments: an application in pharmacy

The discrete choice experiment (DCE) technique has been applied extensively in the valuation of healthcare benefits to capture preferences. A key methodological question is external validity, i.e. the extent to which respondents' choices in a hypothetical DCE context truly reflect their actual preferences. This thesis explored the external validity of DCEs within a pharmacy context, comparing what respondents said they would do in a DCE survey with what they actually did when presented with the same scenario in real life. It also explored the roles of uncertainty and attitudes in explaining discrepancies and used qualitative research to provide further insight.

The DCE correctly predicted 42.1% of participants' actual choices. Calibrating the DCE with certainty questions and incorporating the theory of planned behaviour (TPB) improved the DCE's predictions. Reasons for discrepancies between stated choices and actual behaviour included differences in decision-making processes between the DCE and real life, as well as attitudinal and other contextual factors (e.g. timing, location).
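A prediction rate of this kind is simply the share of real-life choices that coincide with the alternative to which the fitted DCE model assigns the highest predicted utility. A minimal Python sketch with invented utilities and choices:

    import numpy as np

    # Illustrative predicted utilities from a fitted choice model: one row
    # per participant, one column per alternative available in real life.
    predicted_utility = np.array([[0.8, 0.2, -0.1],
                                  [0.1, 0.9,  0.3],
                                  [0.4, 0.2,  0.6]])
    actual_choice = np.array([0, 2, 2])  # observed real-life choices

    # A choice is "correctly predicted" when the highest-utility alternative
    # matches the one actually chosen.
    predicted_choice = predicted_utility.argmax(axis=1)
    hit_rate = (predicted_choice == actual_choice).mean()
    print(f"hit rate: {hit_rate:.1%}")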

Outcome and Translation

Calibration methods should be considered to improve external validity. The development of DCEs, and the modelling of choice responses, should mimic as closely as possible the decision-making process individuals face in reality. This thesis extends the limited pool of empirical studies assessing the external validity of DCEs.

PhD student: Gin Nie Chua

Supervisors: Mandy Ryan (HERU); T Porteous (HSRU, University of Aberdeen) and C Bond (Academic Primary Care, University of Aberdeen)