Assessing Payers’ Preferences for Real-World Evidence in the United States: A Discrete Choice Experiment

Published: October 30, 2021


Highlights

      • Real-world evidence (RWE) can provide valuable information to inform coverage and reimbursement decisions for payer organizations.
      • Researchers conducting RWE studies might not be providing the type of evidence that payers seek to inform coverage and reimbursement decisions, causing misalignment.
      • This article explores payers’ preferences for RWE studies and provides the first quantification of the value payers place on key RWE attributes relative to one another, in terms of importance and marginal willingness to pay.
      • This study created a framework to elicit the preferences of payers for attributes of RWE studies when assessing evidence to make formulary decisions for chronic disease treatment.
      • When assessing RWE studies, payers value clinical and health-related quality-of-life outcomes over methodologic rigor, resource utilization, external validity, and productivity outcomes.
      • Our results can be used to guide future RWE research priorities that will best inform payer decision making and will be better aligned with their preferences.
      • The methodology and instruments developed can be replicated for different therapeutic conditions.



Objectives

      To rank US payers’ preferences for attributes of real-world evidence (RWE) studies in the context of chronic disease and to quantify the trade-offs among them.


Methods

      We conducted a discrete choice experiment in which 180 employees of payer organizations were asked to choose between 2 RWE studies, assuming they were assessing evidence to inform formulary decisions for chronic disease treatment. Each RWE study was characterized by 7 attributes with 3 levels each: very informative, moderately informative, and not measured. We used a D-optimal main-effects design. Survey data were fitted to a conditional logit model to obtain a relative ranking of importance for each attribute.
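The estimation step described above can be sketched in a few lines. Below is a minimal, illustrative conditional logit fit by maximum likelihood on synthetic two-alternative choice data; the design matrix, attribute count, and coefficient values are hypothetical assumptions for illustration and do not reproduce the study's actual design or estimates.

```python
# Illustrative conditional logit for a two-alternative DCE (synthetic data).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_tasks, n_attrs = 1000, 3                 # hypothetical: 3 dummy-coded attributes
# X[i, j, :] = attribute levels of alternative j in choice task i
X = rng.integers(0, 2, size=(n_tasks, 2, n_attrs)).astype(float)
true_beta = np.array([1.5, 0.8, 0.3])      # hypothetical part-worth utilities

# Simulate choices from the logit choice probabilities
util = X @ true_beta                       # (n_tasks, 2) deterministic utilities
p = np.exp(util) / np.exp(util).sum(axis=1, keepdims=True)
choice = (rng.random(n_tasks) < p[:, 1]).astype(int)   # 1 if alternative B chosen

def neg_loglik(beta):
    v = X @ beta                                        # utilities under beta
    logp = v - np.logaddexp(v[:, 0], v[:, 1])[:, None]  # log choice probabilities
    return -logp[np.arange(n_tasks), choice].sum()

fit = minimize(neg_loglik, np.zeros(n_attrs), method="BFGS")
print("estimated betas:", fit.x.round(2))   # should be close to true_beta
```

With enough choice tasks the maximum likelihood estimates recover the simulated part-worths; the relative sizes of the fitted coefficients are what yield an importance ranking across attributes.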


Results

      Clinical outcomes were the most preferred attribute, 4.68 times as important as productivity outcomes, the least preferred attribute. Clinical outcomes were followed by health-related quality of life (2.78), methodologic rigor (2.09), resource utilization (1.71), and external validity (1.56).
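The importance figures above are ratios against the least preferred attribute. As an illustration, the sketch below derives such ratios from a set of hypothetical part-worth coefficients; the values are chosen only so the arithmetic mirrors the reported ratios and are not the study's actual estimates.

```python
# Hypothetical part-worth utilities ("very informative" vs "not measured");
# not the paper's estimates, chosen only to illustrate the ratio calculation.
coefs = {
    "clinical outcomes": 2.34,
    "health-related quality of life": 1.39,
    "methodologic rigor": 1.045,
    "resource utilization": 0.855,
    "external validity": 0.78,
    "productivity outcomes": 0.50,
}
least = min(coefs.values())                 # least preferred attribute's coefficient
ratios = {k: round(v / least, 2) for k, v in coefs.items()}
print(ratios)  # importance of each attribute relative to the least preferred
```

Normalizing by the smallest coefficient expresses every attribute's importance on a common scale, with the least preferred attribute fixed at 1.0.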


Conclusions

      This study quantifies the value payers place on key RWE attributes. Across attributes, payers prefer clinical and health-related quality-of-life outcomes over the other attributes. Within attributes, payers prefer high levels of information for clinical outcomes and methodologic rigor but are indifferent among levels for the other attributes. Our results help bridge the gap between the information payers seek and the attributes RWE studies prioritize, and can guide the design of future research.



