Value of Information Analysis for Research Decisions—An Introduction: Report 1 of the ISPOR Value of Information Analysis Emerging Good Practices Task Force

      Highlights

      • Decision uncertainty, although not relevant to a risk-neutral decision maker identifying the optimal choice in the current circumstances, is of interest for addressing the question of whether to collect additional information to better inform future decisions. As such, probability distributions should be assigned to parameters to characterize uncertainty in the current evidence base, with probabilistic analysis (PA) used to assess the uncertainty. Parameters excluded from the PA will be excluded from the analysis of uncertainty.
      • A value of information (VOI) analysis provides a formal assessment of the value of research, based on the extent to which the information generated through research would improve the expected payoffs associated with a decision by reducing the uncertainty surrounding it. This value can then be compared with the cost of acquiring the information to determine whether the research is potentially worthwhile and of value to undertake.
      • This report was written to provide decision makers who have been tasked with making decisions about the adoption of healthcare or the funding of healthcare research with an introduction to the concept of VOI analysis and to the decisions that can be supported by this type of analysis, including: (1) research prioritization, (2) efficient research design, (3) reimbursement, and (4) efficient decision making over the life cycle.
      • The report describes the process of VOI analysis, providing a top-level description of the methods and steps involved in undertaking and interpreting the results of such an analysis, from conceptualizing the decision problem to developing the decision model, parameterizing the model, running the probabilistic analysis, calculating the value of information (perfect, partial perfect, and sample), and determining the worth of research (expected net benefit of sampling).
      • This report provides 9 recommendations for good practice when planning, undertaking, or reviewing the results of VOI analyses with the aim to improve accessibility of VOI analysis for all stakeholders.

      Abstract

      Healthcare resource allocation decisions made under conditions of uncertainty may turn out to be suboptimal. In a resource constrained system in which there is a fixed budget, these suboptimal decisions will result in health loss. Consequently, there may be value in reducing uncertainty, through the collection of new evidence, to make better resource allocation decisions. This value can be quantified using a value of information (VOI) analysis. This report, from the ISPOR VOI Task Force, introduces VOI analysis, defines key concepts and terminology, and outlines the role of VOI for supporting decision making, including the steps involved in undertaking and interpreting VOI analyses. The report is specifically aimed at those tasked with making decisions about the adoption of healthcare or the funding of healthcare research. The report provides a number of recommendations for good practice when planning, undertaking, or reviewing the results of VOI analyses.

      Introduction

      Value of Information Analysis in a Nutshell

Healthcare decision makers, tasked with selecting which technologies to adopt, need to determine the payoffs associated with each. These payoffs, usually represented in cost-effectiveness analyses by net benefits (NB) expressed in either health or monetary terms, are uncertain, reflecting imperfect and incomplete evidence. This introduces the possibility of error into decision making (decision uncertainty): there is a chance that the best decision made today is suboptimal, in the sense that better outcomes could have been achieved with a different decision had more information been available. Acquiring more information could reduce uncertainty in the evidence base, and with it the risk and consequences of making the wrong decision in the future, but doing so is costly. Before resources are invested in gathering additional information (eg, through research), the associated costs and benefits should be considered. A value of information (VOI) analysis provides a framework to assess these costs and benefits. Specifically, VOI analysis provides a formal assessment of the value of research, based on the extent to which the information would improve the expected payoffs associated with a decision by reducing uncertainty. This value is then compared with the cost of acquiring the information to determine whether the research is worthwhile.
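The two components of decision uncertainty described above (the chance of choosing the wrong option, and the payoff forgone when that happens) can be read directly off the output of a probabilistic analysis. The following minimal sketch uses hypothetical net-benefit distributions for two technologies; the figures are invented for illustration only and do not come from this report.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000  # probabilistic analysis (PA) iterations

# Hypothetical PA samples of monetary net benefit (NB) for two technologies;
# the distributions are illustrative assumptions, not taken from the report.
nb_a = rng.normal(10_000, 2_000, n)
nb_b = rng.normal(10_500, 3_000, n)

# A risk-neutral decision maker adopts B: it has the higher expected NB.
assert nb_b.mean() > nb_a.mean()

# Decision uncertainty: the probability that the adopted option is not,
# in fact, the best one...
error_prob = np.mean(nb_a > nb_b)

# ...and the expected consequence of error: the payoff forgone when A
# would have been better (the expected cost of uncertainty per patient).
expected_loss = np.mean(np.maximum(nb_a - nb_b, 0.0))
```

Note that even when the error probability is substantial, adopting B remains the optimal choice under current evidence; the expected loss simply quantifies what additional information might be worth.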

      A Task Force for VOI

      Although VOI analyses are being increasingly published in academic journals, uptake in real-world decision making remains limited.
      • Steuten L.M.G.
      • Van de Wetering G.
      • Groothuis-Oudshoorn K.
      • Retèl V.
      A systematic and critical review of the evolving methods and applications of value of information in academia and practice.
This is partly due to perceptions that VOI analysis is complex to perform, difficult to interpret, requires substantial computational time, and does not reflect key relevant uncertainties,
      • Bindels J.
      • Ramaekers B.
      • Ramos I.C.
      • et al.
      Use of value of information in healthcare decision making: exploring multiple perspectives.
      and partly due to lack of dissemination of methods and capacity to undertake this type of analysis.
      • Claxton K.
      • Eggington S.
      • Ginnelly L.
      • et al.
      A pilot study of value of information analysis to support research recommendations for NICE.
      As such, ISPOR formed the VOI Task Force to improve the accessibility of VOI analysis for all stakeholders through the development of good practice guidance for using VOI analysis to inform research prioritization (both private and public) and other decisions pertaining to the development and reimbursement of healthcare technologies (Box 1).

      VOI Task Force Reports

      This first report from the ISPOR VOI Task Force is directed at decision makers, including funders of research tasked with determining which studies to support, stakeholder groups identifying research priorities, and healthcare payers using formal assessment processes to inform their decisions about funding healthcare technologies.
      Background on the Task Force Process
      The proposal to initiate an ISPOR Value of Information (VOI) Good Practices Task Force was evaluated by the ISPOR Health Science Policy Council and then recommended to the ISPOR Board of Directors for approval. The task force was composed of international subject matter experts representing a diverse range of stakeholder perspectives (academia, research organizations, government, regulatory agencies, and commercial entities). The task force met approximately every 5 weeks by teleconference and in person at ISPOR conferences. All task force members reviewed many drafts of the report and provided frequent feedback in both oral and written comments. To ensure that ISPOR Good Practices Task Force Reports are consensus reports, findings and recommendations were presented and discussed at ISPOR conferences. In addition, the first and final draft reports were circulated to the task force’s review group for a formal review. All reviewer comments were considered. Comments were addressed as appropriate in subsequent versions of the report. Most were substantive and constructive, improving the report.
The report introduces the concept of VOI analysis, identifies the decisions that can be supported by VOI analysis, and describes the methods, steps, and interpretation of results. The report presents emerging good practice recommendations throughout the text for planning, undertaking, or reviewing VOI analysis. The report does not discuss the cost or grading of evidence from specific studies or the responsibility for undertaking additional research when it is found to be valuable. The task force also developed a glossary of 26 terms to assist readers unfamiliar with VOI analysis.
      The second report from the ISPOR VOI Task Force,
      • Rothery C.
      • Strong M.
      • Koffijberg H.
      • et al.
      Value of information analytical methods: report 2 of the ISPOR value of information analysis emerging good practices task force.
      which is directed at methodologists or analysts undertaking VOI analysis to inform decision making, provides detailed guidance and emerging good practices on the principal methods required for assessing the value of information to inform a range of decisions.
      Report 1 assumes an underlying cost-effectiveness framework where the objective is to maximize health on a limited budget; this is relaxed in Report 2 in which a more generic objective function is considered.
      • Rothery C.
      • Strong M.
      • Koffijberg H.
      • et al.
      Value of information analytical methods: report 2 of the ISPOR value of information analysis emerging good practices task force.
In both reports, “technologies” should be taken to refer to any healthcare or public health interventions, procedures, programs, or policies.

      How Can VOI Be Used? Healthcare Decisions Supported by VOI Analysis

      VOI analysis can inform a variety of healthcare decisions including: (1) research prioritization, (2) efficient research design, (3) reimbursement, and (4) efficient decision making over the life cycle. Box 2 provides a brief summary of the application of VOI analysis to decision making in healthcare, with references.
      Applications of VOI Analysis in Healthcare
      VOI analysis originated from Bayesian decision theory
      • Schlaifer R.
      Probability and Statistics for Business Decisions: An Introduction to Managerial Economics Under Uncertainty.
      • Raiffa H.
      • Schlaifer R.O.
      Applied Statistical Decision Theory.
      • Howard R.A.
      Information value theory.
      • Raiffa H.
      Decision Analysis: Introductory Lectures on Choices Under Uncertainty.
      and the theory of the economics of information.
      • Stigler G.J.
      The economics of information.
      Claxton and Meltzer formalized approaches for VOI analysis for research prioritization in healthcare.
      • Claxton K.
      The irrelevance of inference: a decision making approach to the stochastic evaluation of health care technologies.
      ,
      • Meltzer D.
      Addressing uncertainty in medical cost-effectiveness analysis. Implications of expected utility maximisation for methods to perform sensitivity analysis and the use of cost-effectiveness analysis to set priorities for medical research.
      Since then, VOI analysis has also been used to address the following:
      • Identification of efficient research design, conditional coverage, and early development
        • Claxton K.P.
        • Sculpher M.J.
        Using value of information analysis to prioritise health research: some lessons from recent UK experience.
        • Colbourn T.
        • Asseburg C.
        • Bojke L.
        • et al.
        Prenatal screening and treatment strategies to prevent group B streptococcal and other bacterial infections in early infancy: cost-effectiveness and expected value of information analyses.
        • Hassan C.
        • Hunink M.G.
        • Laghi A.
        • et al.
        Value-of-information analysis to guide future research in colorectal cancer screening.
      • Value of individualized care and precision medicine
        • Basu A.
        • Meltzer D.
        Value of information on preference heterogeneity and individualized care.
        ,
        • Basu A.
        • Carlson J.J.
        • Veenstra D.L.
        A framework for prioritizing research investments in precision medicine.
      • Value of regulatory trials from the perspective of the pharmaceutical industry
        • Breeze P.
        • Brennan A.
        Valuing trial designs from a pharmaceutical perspective using value based pricing.
      • Informing decisions about public and mental health interventions
        • Tuffaha H.W.
        • Roberts S.
        • Chaboyer W.
        • Gordon L.G.
        • Scuffham P.A.
        Cost-effectiveness and value of information analysis of nutritional support for preventing pressure ulcers in high-risk patients: implement now, research later.
        ,
        • Eeren H.V.
        • Schawo S.J.
        • Scholte R.H.J.
        • Busschbach J.J.V.
        • Hakkaart L.
        Value of information analysis applied to the economic evaluation of interventions aimed at reducing juvenile delinquency: an illustration.
      • Value of a sequence of trial designs, optimizing the order and respective sample sizes
        • Conti S.
        • Claxton K.
        Dimensions of design space: a decision-theoretic approach to optimal research design.
        ,
        • Griffin S.
        • Welton N.J.
        • Claxton K.
        Exploring the research decision space: the expected value of information for sequential research designs.
      • Value of promoting uptake of an evidence-based technology
        • Fenwick E.
        • Claxton K.
        • Sculpher M.J.
        The value of implementation and the value of information: combined and uneven development.
      • Value of managed entry agreements
        • Grimm S.
        • Strong M.
        • Brennan A.
        • Wailoo A.
        Framework for analyzing risk in health technology assessments and its application to managed entry agreements.
        ,
        • Grimm S.
        • Strong M.
        • Brennan A.
        • Wailoo A.
        The HTA risk analysis chart: visualizing the need for and potential value of managed entry agreements in health technology assessment.
      • Value of biomarker collection in clinical practice
        • Bansal A.
        • Basu A.
        Value of information methods for optimal timing of biomarker collection.
      • Value of subgroup information and value of identifying subgroups
        • Basu A.
        • Meltzer D.
        Value of information on preference heterogeneity and individualized care.
        ,
        • Basu A.
        • Carlson J.J.
        • Veenstra D.L.
        A framework for prioritizing research investments in precision medicine.
        ,
        • Sculpher M.
        Subgroups and heterogeneity in cost-effectiveness analysis.
        • van Gestel A.
        • Grutters J.
        • Schouten J.
        • et al.
        The role of the expected value of individualized care in cost-effectiveness analyses and decision making.
        • Espinoza M.A.
        • Manca A.
        • Claxton K.
        • Sculpher M.J.
        The value of heterogeneity for cost-effectiveness subgroup analysis.
      • Outcomes-based contracting for risk-averse manufacturers
        • Garrison L.P.
        • Towse A.
        • Briggs A.
        • et al.
        Performance-based risk-sharing arrangements-good practices for design, implementation, and evaluation: report of the ISPOR good practices for performance-based risk-sharing arrangements task force.
      • Portfolio balance (risk over multiple projects)
        • Bennette C.S.
        • Veenstra D.L.
        • Basu A.
        • Baker L.H.
        • Ramsey S.D.
        • Carlson J.J.
        Development and evaluation of an approach to using value of information analyses for real-time prioritization decisions within SWOG, a large cancer clinical trials cooperative group.
        ,
        • Tuffaha H.W.
        • Gordon L.G.
        • Scuffham P.A.
        Value of information analysis informing adoption and research decisions in a portfolio of health care interventions.
      • Prioritizing the update of systematic literature reviews
        • Hoomans T.
        • Seidenfeld J.
        • Basu A.
        • Meltzer D.
        Systematizing the use of value of information analysis in prioritizing systematic reviews.
      • Alternative designs for research studies and program of studies (eg, Bayesian Clinical Trial Simulation of phase II and III programs)
        • Nixon R.M.
        • O'Hagan A.
        • Oakley J.
        • et al.
        The rheumatoid arthritis drug development model: a case study in Bayesian clinical trial simulation.
      Thorn et al
      • Thorn J.
      • Coast J.
      • Andronis L.
      Interpretation of the expected value of perfect information and research recommendations: a systematic review and empirical investigation.
      and Koffijberg et al
      • Koffijberg H.
      • Rothery C.
      • Chalkidou K.
      • Grutters J.
      Value of information choices that influence estimates: a systematic review of prevailing considerations.
      present the results of systematic reviews of VOI applications in healthcare.

      Research Prioritization Decisions

Research funders have limited budgets, which necessitates prioritizing research investments. VOI analysis can support research prioritization and commissioning decisions by quantifying the value of the additional information generated by each proposed study. The proposals can then be ranked according to expected return (the value of the information minus the expected cost of the research) to determine priorities.
      • Briggs A.H.
      • Sculpher M.J.
      • Claxton K.
      Decision Modelling for Health Economic Evaluation.
      • Tuffaha H.W.
      • Gordon L.G.
      • Scuffham P.A.
      Value of information analysis in healthcare: a review of principles and applications.
      • Eckermann S.
      • Karnon J.
      • Willan A.R.
      The value of value of information best informing research design and prioritization using current methods.
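As a sketch of the ranking step, with all study names and figures invented purely for illustration, the expected return is each proposal's value of information minus its expected cost, and proposals whose cost exceeds their value drop out:

```python
# Hypothetical proposals: (name, expected value of the information the
# study would generate, expected cost of the study). All figures are
# invented for illustration only.
proposals = [
    ("Study A", 4_500_000, 1_200_000),
    ("Study B", 2_000_000, 300_000),
    ("Study C", 900_000, 1_000_000),
]

# Expected return = value of information minus expected cost; studies
# whose cost exceeds their value are not worth commissioning.
ranked = sorted(
    ((name, value - cost) for name, value, cost in proposals if value > cost),
    key=lambda item: item[1],
    reverse=True,
)
# ranked → [("Study A", 3_300_000), ("Study B", 1_700_000)]; Study C is
# excluded because its cost exceeds the value of the information.
```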
      Pilot projects to assess the feasibility of using VOI analysis for research prioritization have been undertaken in both the UK
      • Claxton K.P.
      • Sculpher M.J.
      Using value of information analysis to prioritise health research: some lessons from recent UK experience.
      ,
      • Claxton K.
      • Ginnelly L.
      • Sculpher M.J.
      • Philips Z.
      • Palmer S.
      A pilot study on the use of decision theory and value of information analysis as part of the NHS Health Technology Assessment programme.
      and the USA.
      • Myers E.
      • McBroom A.J.
      • Shen L.
      • Posey R.E.
      • Gray R.
      • Sanders G.D.
      Value-of-information analysis for patient-centered outcomes research prioritization. Duke Evidence-based Practice Center Durham.
      • Carlson J.J.
      • Thariani R.
      • Roth J.
      • et al.
      Value of information analysis within a stakeholder-driven research prioritization process in a US setting: an application in cancer genomics.
      • Sanders G.D.
      • Basu A.
      • Myers E.
      • Meltzer D.
      Potential value of an aspirin-dose trial for secondary prevention of coronary artery disease: informing PCORI and future trial design.

      Efficient Research Design

      VOI analysis can also inform research design to maximize the return on investment. This involves assessing the value of studies of different types (eg, primary or secondary) and scopes (in terms of study size, length of follow-up, etc) based on their ability to reduce uncertainty. This value is compared with the cost of the research to identify the most efficient design, identified as that which maximizes the expected return.
      • Briggs A.H.
      • Sculpher M.J.
      • Claxton K.
      Decision Modelling for Health Economic Evaluation.

      Reimbursement and Coverage With Evidence Development Decisions

VOI analysis can be used to inform decisions about reimbursement of healthcare technologies when the evidence base to support their use is not mature, as is required in systems with early access arrangements. Making decisions on a premature evidence base carries substantial risk and uncertainty for patient outcomes, and reimbursement on that basis also creates a disincentive to invest in further research that would reduce this uncertainty and risk. Reimbursing the technology could waste resources on cost-ineffective, or even harmful, practices that, once adopted, are difficult to eliminate.
      • Chalkidou K.
      • Lord J.
      • Fischer A.
      • Littlejohns P.
      Evidence-based decision making: when should we wait for more information?.
      Conversely, delaying adoption until further research is conducted could deny access to a clinically important and cost-effective technology. Decision makers must balance the value of delaying adoption until better information is available against the value of providing patients with early access. Here, the gain from waiting for more evidence (in terms of reduced uncertainty) should be weighed against the losses associated with a delay in adopting the technology (in terms of payoffs forgone) to determine the expected net gain.
      • Tuffaha H.W.
      • Roberts S.
      • Chaboyer W.
      • Gordon L.G.
      • Scuffham P.A.
      Cost-effectiveness and value of information analysis of nutritional support for preventing pressure ulcers in high-risk patients: implement now, research later.
      ,
      • Claxton K.
      • Palmer S.
      • Longworth L.
      • et al.
      Informing a decision framework for when NICE should recommend the use of health technologies only in the context of an appropriately designed programme of evidence development.
      ,
      • Al M.
      • Bindels J.
      • Ramos I.C.
      • et al.
      Uncertainty and value of information. Guideline for the conduct of economic evaluations in health care, Dutch version.
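The "adopt now versus research first" trade-off described above can be sketched as a back-of-the-envelope calculation. Every figure below is a hypothetical assumption chosen for illustration, not an estimate from this report:

```python
# Payoff forgone by delaying adoption while the research runs: the
# technology's expected per-patient NB gain applied to every patient
# treated during the delay. All figures are hypothetical.
per_patient_nb_gain = 500     # expected NB gain per patient from adopting now
patients_per_year = 10_000
delay_years = 2               # time for the research to report

loss_from_delay = per_patient_nb_gain * patients_per_year * delay_years

# Expected value of the information to future decisions, net of the
# cost of conducting the study.
value_of_information = 15_000_000
cost_of_research = 2_000_000

net_gain_of_waiting = (value_of_information - cost_of_research) - loss_from_delay
# Positive → delay adoption and research first; negative → adopt now.
```

Under these assumed figures the net gain of waiting is positive, so delaying adoption until the research reports would be preferred; with a larger treated population or a longer delay, the conclusion could reverse.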
      Additionally, decision makers must decide whether actively gathering further evidence to reduce uncertainty is worthwhile.
      • Claxton K.
      • Palmer S.
      • Longworth L.
      • et al.
      Informing a decision framework for when NICE should recommend the use of health technologies only in the context of an appropriately designed programme of evidence development.
      In situations where it is possible to go beyond conventional “yes/no” reimbursement decisions to consider options for coverage with evidence development (CED), decision makers can also decide to provide conditional coverage while research is conducted.
      • Claxton K.
      • Palmer S.
      • Longworth L.
      • et al.
      Informing a decision framework for when NICE should recommend the use of health technologies only in the context of an appropriately designed programme of evidence development.
      ,
      • McCabe C.J.
      • Stafinski T.
      • Edlin R.
      • Menon D.
      Access with evidence development schemes: a framework for description and evaluation.
      ,
      • Stafinski T.
      • McCabe C.J.
      • Menon D.
      Funding the unfundable: mechanisms for managing uncertainty in decisions on the introduction of new and innovative technologies into healthcare systems.
      This is particularly important when reversing the adoption decision is difficult or costly, such as when there are significant sunk costs associated with adoption (eg, investments in equipment or buildings).
      • Claxton K.
      • Palmer S.
      • Longworth L.
      • et al.
      Informing a decision framework for when NICE should recommend the use of health technologies only in the context of an appropriately designed programme of evidence development.
      For example, in the presence of sunk costs, it may be more appropriate for a cost-effective technology where more research is worthwhile to be reimbursed only for those enrolled in research (“only in research”) rather than approved for general reimbursement while the research is conducted (“approval with research”). This avoids the commitment of irrecoverable costs until the results of research become known. The downside is that patients not involved in the research would not have access to the new technology during the research and, as such, would potentially miss out on the best technology. Claxton et al present a framework, based on VOI, for selecting between coverage options from a cost-effectiveness perspective in the presence of irrecoverable costs.
      • Claxton K.
      • Palmer S.
      • Longworth L.
      • et al.
      Informing a decision framework for when NICE should recommend the use of health technologies only in the context of an appropriately designed programme of evidence development.
      ,
      • Walker S.
      • Sculpher M.
      • Claxton K.
      • Palmer S.
      Coverage with evidence development, only in research, risk sharing or patient access scheme? A framework for coverage decisions.
McKenna et al present a checklist outlining the sequence of assessments required to inform the different conditional coverage options (eg, “only in research” or “approval with research”), together with an illustration of its application to an example involving enhanced external counterpulsation therapy.
      • McKenna C.
      • Soares M.
      • Claxton K.
      • et al.
      Unifying research and reimbursement decisions: case studies demonstrating the sequence of assessment and judgments required.

      Efficient Decision Making Over the Life Cycle

      VOI analysis should be undertaken early in the development of a technology and reassessed when an element of the decision changes (eg, research is published or a new comparator becomes available) to determine the impact on the reimbursement or research decisions.
      • Sculpher M.
      • Drummond M.
      • Buxton M.
      The iterative use of economic evaluation as part of the process of health technology assessment.
      • Fenwick E.
      • Claxton K.
      • Sculpher M.
      • Briggs A.
      Improving the efficiency and relevance of health technology assessment: the role of iterative decision analytic modelling.
      • Miller P.
      Role of pharmacoeconomic analysis in R&D decision making. When, where, how?.
      • Mohseninejad L.
      • Feenstra T.
      • van der Horst H.E.
      • Woutersen-Koch H.
      • Buskens E.
      Targeted screening for coeliac disease among irritable bowel syndrome patients: analysis of cost-effectiveness and value of information.
      • Mohseninejad L.
      • van Gils C.
      • Uyl-de Groot C.A.
      • Buskens E.
      • Feenstra T.
      Evaluation of patient registries supporting reimbursement decisions: the case of oxaliplatin for treatment of stage III colon cancer.
      When incorporated with CED, this early and iterative approach ensures efficient reimbursement and research decisions (what and when) over the lifetime of the technology.
      • Sculpher M.
      • Drummond M.
      • Buxton M.
      The iterative use of economic evaluation as part of the process of health technology assessment.
      • Fenwick E.
      • Claxton K.
      • Sculpher M.
      • Briggs A.
      Improving the efficiency and relevance of health technology assessment: the role of iterative decision analytic modelling.
      • Miller P.
      Role of pharmacoeconomic analysis in R&D decision making. When, where, how?.
      This aligns with the aim of the adaptive pathways approach, initiated by the European Medicines Agency, to balance timely access to technologies with the evolving nature of the evidence base through (re)assessment of evidence at different stages in the product’s life cycle.
      • Baird L.G.
      • Banken R.
      • Eichler H.G.
      • et al.
      Accelerated access to innovative medicines for patients in need.
      ,
      European Medicines Agency
      Adaptive pathways workshop, report on a meeting with stakeholders held at EMA on Thursday 8 December 2016.
The approach also provides the opportunity for “stop/go” development decisions: when a technology is not effective or cost-effective on the basis of current evidence, and VOI analysis suggests that further research is not worthwhile, development of the technology should stop.
      • Retèl V.P.
      • Grutters J.P.C.
      • van Harten W.H.
      • Joore M.A.
      Value of research and value of development in early assessments of new medical technologies.

      Decision Making Under Uncertainty and the Role of VOI Analysis: A Framework

      Decisions made without complete information are inherently uncertain; there is a possibility that the decision is incorrect, with consequences in terms of the payoffs associated with the decision. Bayesian decision theory indicates that the optimal choice, for a risk-neutral decision maker, is to select the option with the maximum expected payoff irrespective of the uncertainty.
      • Claxton K.
      The irrelevance of inference: a decision making approach to the stochastic evaluation of health care technologies.
      ,
      • Meltzer D.
      Addressing uncertainty in medical cost-effectiveness analysis. Implications of expected utility maximisation for methods to perform sensitivity analysis and the use of cost-effectiveness analysis to set priorities for medical research.
      Nevertheless, decision uncertainty is of interest to ascertain the value of collecting additional information to better inform the decision in the future.
      • Claxton K.
      The irrelevance of inference: a decision making approach to the stochastic evaluation of health care technologies.
      ,
      • Meltzer D.
      Addressing uncertainty in medical cost-effectiveness analysis. Implications of expected utility maximisation for methods to perform sensitivity analysis and the use of cost-effectiveness analysis to set priorities for medical research.
      This involves a formal assessment of the decision uncertainty, not only in terms of the probability of making an error but also the consequences associated with an error (ie, the payoff forgone when decision uncertainty leads to the incorrect decision being taken). This provides an estimate of the expected costs of the uncertainty. VOI analysis establishes the value of research according to the extent to which it might reduce the expected costs of uncertainty by reducing uncertainty in the evidence base. Essentially, this involves comparing the expected value of a decision made with and without additional information. Where payoffs are expressed in terms of monetary NB, VOI analysis provides an explicit monetary valuation of the expected value of research that can be directly compared with the expected cost of research to determine whether it is worthwhile.
      The assessment of the value of information can be undertaken at different levels:
      • Expected value of perfect information (EVPI) quantifies the value of acquiring perfect information about all aspects of the decision (ie, eliminating all uncertainty). This is equivalent to the expected costs of uncertainty associated with making the decision based on the current evidence.
      • Expected value of partial perfect information (EVPPI), also called partial EVPI or expected value of perfect parameter information, quantifies the value of perfect information about a specific parameter (or group of parameters) in the decision. It is the difference between the expected value of a decision made with perfect information about these parameters and the expected value of the decision based on the current evidence.
      • Expected value of sample information (EVSI) quantifies the expected value of a given research study with a specific sample size and design, which reduces (rather than eliminates) uncertainty. It is the difference between the expected value of a decision made with this reduced uncertainty and the expected value of the decision based on the current evidence.
      • Expected net benefit of sampling (ENBS) quantifies the net payoff for a given research study with a specific sample size and particular design (ie, the difference between the EVSI and the expected total cost of the study). Note that here, the cost of research includes not only the cost of performing the study but also the opportunity cost for study participants who receive the option expected to be inferior while the study is underway.
        • McKenna C.
        • Claxton K.
        Addressing adoption and research design decisions simultaneously: the role of value of sample information analysis.
      Box 3 provides an overview of the sources of uncertainty in both model- and trial-based evaluations that induce uncertainty in payoffs. Published VOI analyses have principally focused on parameter uncertainty, which is the focus for this report; “Report 2” of the ISPOR VOI Task Force discusses sources of uncertainty in more detail.
      • Rothery C.
      • Strong M.
      • Koffijberg H.
      • et al.
      Value of information analytical methods: report 2 of the ISPOR value of information analysis emerging good practices task force.
      Sources of Uncertainty
      There are different types and sources of uncertainty. For further details, see “Report 6” of the ISPOR-SMDM Modeling Good Research Practice Task Force.
      • Briggs A.H.
      • Weinstein M.C.
      • Fenwick E.A.L.
      • Karnon J.
      • Sculpher M.J.
      • Paltiel A.D.
      Model parameter estimation and uncertainty: a report of the ISPOR-SMDM modeling good research practices task force-6.
      • Parameter uncertainty is the uncertainty surrounding the “true” values of the parameters in a decision model due to imperfect knowledge or measurement (eg, uncertainty surrounding the mean duration of effect associated with a treatment in the population of interest).
      • Structural uncertainty is the uncertain error that results from using a model that is not a perfect representation of reality and therefore relates to the assumptions employed within the construction of a model (eg, the health states used to describe the progression of disease in a model).
        • Briggs A.H.
        • Weinstein M.C.
        • Fenwick E.A.L.
        • Karnon J.
        • Sculpher M.J.
        • Paltiel A.D.
        Model parameter estimation and uncertainty: a report of the ISPOR-SMDM modeling good research practices task force-6.
        • Strong M.
        • Oakley J.E.
        When is a model good enough? Deriving the expected value of model improvement via specifying internal model discrepancies.
        • Ghabri S.
        • Hamers F.F.
        • Josselin J.M.
        Exploring uncertainty in economic evaluations of drugs and medical devices: lessons from the first review of manufacturers’ submissions to the French National Authority for Health.
        Note that the distinction between parameter and structural uncertainty is somewhat artificial given that many structural choices could be parameterized.
      • Stochastic uncertainty reflects uncertainty in the payoffs for any specific individual owing to chance (eg, experience of an adverse event following treatment).
      • Methodological “uncertainty” reflects uncertainty regarding analytic methods and choices (eg, the perspective taken in a cost-effectiveness analysis).
        • Drummond M.F.
        • Sculpher M.J.
        • Claxton K.
        • Stoddart G.L.
        • Torrance G.W.
        Methods for The Economic Evaluation of Health Care Programmes.
        Because there is no single “correct” method of evaluation, differences in methodology do not strictly reflect uncertainty about the “truth.”
        • Briggs A.H.
        • Weinstein M.C.
        • Fenwick E.A.L.
        • Karnon J.
        • Sculpher M.J.
        • Paltiel A.D.
        Model parameter estimation and uncertainty: a report of the ISPOR-SMDM modeling good research practices task force-6.
        As such, and also because it is not possible to reduce methodological uncertainty through further research, it is not considered a source of uncertainty in the ISPOR VOI Task Force Reports.
      • Heterogeneity reflects known differences in individual-level parameter values associated with identifiable differences in patient characteristics, including demographics (eg, age, gender, income), preferences (eg, attitudes or beliefs), and/or clinical characteristics (eg, disease severity, disease history, genetic profile).
        • Grutters J.P.
        • Sculpher M.
        • Briggs A.H.
        • et al.
        Acknowledging patient heterogeneity in economic evaluation: a systematic literature review.
        Because heterogeneity does not reflect uncertainty about the truth, it is not considered a source of uncertainty in the ISPOR VOI Task Force Reports.

      Overview of the Steps for Conducting and Reporting a VOI Analysis

      The process of undertaking a VOI analysis is illustrated graphically in Figure 1.

      Constructing the Decision-Analytic Model

      The initial step involves constructing a decision-analytic model to represent the problem. As detailed in “Report 2” of the ISPOR-SMDM Modeling Good Research Practice Task Force, this should involve a two-stage conceptualization process, which (1) converts knowledge into a representation of the problem and (2) identifies a specific model type to meet the needs of the problem.
      • Roberts M.
      • Russell L.B.
      • Paltiel A.D.
      • Chambers M.
      • McEwan P.
      • Krahn M.
      Conceptualizing a model: a report of the ISPOR-SMDM modeling good research practices task force-2.
      The process requires a clear statement of the decision problem, the modeling objective, and the scope, including the disease area considered, the analytic perspective, the target population, the alternative technologies, health and other outcomes of interest, and the time horizon of the analysis.

      Uncertainty in the Current Evidence Base

      The next step involves characterizing the uncertainty in the current evidence base. This involves assigning a (joint) probability distribution for the model parameters, accounting for any correlations between parameters. Expressing the uncertainty surrounding the “true” value of a parameter involves identifying the range of values that could be reasonably attributed to it and the likelihood that the parameter takes any specific value in this range. These should be informed by the best available evidence. Guidelines exist to aid the selection of distributions for parameters.
      • Briggs A.H.
      • Sculpher M.J.
      • Claxton K.
      Decision Modelling for Health Economic Evaluation.
      Probability distributions must be assigned to all uncertain parameters; otherwise, they will be excluded from the analysis of uncertainty and assessment of VOI. Excluding a parameter because there is little or no information with which to estimate it is equivalent to assuming it is known with certainty. These are precisely the parameters that need to be included, with a wide distribution, to represent this uncertainty. Where evidence is limited, probability distributions for uncertain parameters can be derived using expert elicitation.
      • O'Hagan A.
      • Buck C.E.
      • Daneshkhah A.
      • et al.
      Uncertain Judgements: Eliciting Experts’ Probabilities.
      • Oakley J.E.
      Eliciting univariate probability distributions.
      • Daneshkhah A.
      • Oakley J.E.
      Eliciting multivariate probability distributions.
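      As an illustration, the choice of distribution can respect each parameter's logical range. The sketch below is a minimal example with entirely hypothetical evidence (30 responders out of 100 patients, and an adverse-event cost with an assumed mean of 2000 and SD of 500); it assigns a beta distribution to a probability and a moment-matched gamma distribution to a cost:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000  # draws used to check the fitted distributions

# Probability of response: bounded on [0, 1] -> beta distribution,
# here informed by (hypothetically) 30 responders out of 100 patients.
p_response = rng.beta(30, 70, size=n)

# Cost of an adverse event: non-negative and right-skewed -> gamma,
# moment-matched to an assumed mean of 2000 and SD of 500.
mean, sd = 2000.0, 500.0
shape = (mean / sd) ** 2   # alpha = (mean/sd)^2 = 16
scale = sd ** 2 / mean     # theta = variance/mean = 125
c_adverse = rng.gamma(shape, scale, size=n)

print(round(p_response.mean(), 2))   # ~0.30
print(round(c_adverse.mean()))       # ~2000
```

      The same moment-matching logic extends to other constrained quantities (eg, lognormal distributions for relative risks).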

      Good Practice Recommendation 1

      Probability distributions should be assigned to all uncertain parameters to reflect the evidence base.

      Probabilistic Analysis

      A complete assessment of the uncertainty in the existing evidence base requires assessment of the uncertainty in all parameters simultaneously.
      • Neumann P.J.
      • Ganiats T.G.
      • Russell L.B.
      • Sanders G.D.
      • Siegel J.E.
      Cost-Effectiveness in Health and Medicine.
      This is achieved through probabilistic analysis (PA). In addition, where there are nonlinearities within a model structure (eg, the relationship between payoffs and health state transition probabilities in a Markov model), only PA will correctly determine the expected payoffs; a deterministic analysis (evaluating the model at the mean parameter values) will produce erroneous estimates. Deterministic analysis can provide some information about the sensitivity of a decision to a parameter value but has the potential to mislead, especially in models with multiple correlated parameters. As such, and in line with the ISPOR-SMDM Modeling Task Force (“Report 6”) and the Second Panel on Cost-Effectiveness in Health and Medicine, this task force recommends the use of PA.
      • Briggs A.H.
      • Sculpher M.J.
      • Claxton K.
      Decision Modelling for Health Economic Evaluation.
      ,
      • Neumann P.J.
      • Ganiats T.G.
      • Russell L.B.
      • Sanders G.D.
      • Siegel J.E.
      Cost-Effectiveness in Health and Medicine.
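      The effect of nonlinearity can be seen in a toy example. The sketch below (all numbers hypothetical) evaluates a simple nonlinear payoff, the expected number of cycles spent in a “well” state, first at the mean of an uncertain transition probability and then averaged over the full distribution; by Jensen's inequality the two differ:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonlinear payoff: expected cycles spent "well" over 20 cycles,
# where p is the per-cycle probability of remaining well.
def qalys(p, cycles=20):
    return sum(p ** t for t in range(cycles))  # nonlinear in p

# Uncertain transition probability (hypothetical beta(8, 2), mean 0.8).
p_draws = rng.beta(8, 2, size=100_000)

deterministic = qalys(p_draws.mean())       # f(E[p]): ~4.9
powers = p_draws[:, None] ** np.arange(20)  # evaluate f for every draw
probabilistic = powers.sum(axis=1).mean()   # E[f(p)]: ~6.4, noticeably larger

print(round(deterministic, 2), round(probabilistic, 2))
```

      Because the payoff is convex in p, evaluating the model only at the mean understates the expected payoff; in a real model the direction and size of the error depend on the model structure.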

      Good Practice Recommendation 2

      Use probabilistic analysis (PA), which accounts for uncertainty in parameters simultaneously, for an appropriate quantitative assessment of payoffs and associated uncertainty.

      Assessing Uncertainty

      Within PA, Monte Carlo sampling is used to propagate the uncertainty in parameters through the decision model. This involves drawing a set of parameter values from the joint parameter distribution and running the model, using the selected set of parameter values, to provide an estimate of the outcomes of interest for each option being evaluated. The process is repeated many times (eg, 1 000 iterations) to generate a distribution for each outcome of interest. Each iteration in the distribution represents a possible realization of the truth as captured by the PA. The average of the distribution provides the expected value of each outcome of interest.
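      A minimal sketch of this sampling scheme, using an entirely hypothetical two-option model (a response probability for standard care, a log odds ratio treatment effect, and a treatment cost), might look as follows:

```python
import numpy as np

rng = np.random.default_rng(42)
K = 1000  # number of PA iterations

# Hypothetical parameter draws (independent here for simplicity;
# correlated parameters would be drawn from a joint distribution).
p_resp_A = rng.beta(30, 70, K)        # response probability, standard care
log_or   = rng.normal(0.6, 0.2, K)    # log odds ratio, new treatment
cost_B   = rng.gamma(25, 400.0, K)    # treatment cost, new treatment

# Run the (toy) model once per draw: each iteration is one possible
# realization of the truth.
odds_A = p_resp_A / (1 - p_resp_A)
p_resp_B = odds_A * np.exp(log_or) / (1 + odds_A * np.exp(log_or))
qaly_gain_per_response = 0.5          # assumed known, for illustration

inc_qalys = (p_resp_B - p_resp_A) * qaly_gain_per_response
inc_costs = cost_B                    # no drug cost under standard care

# Means over iterations give the expected payoffs; the (inc_costs,
# inc_qalys) pairs form the cloud of points on the CE plane.
print(inc_qalys.mean(), inc_costs.mean())
```

      In practice each draw would feed a full decision model returning total costs and QALYs per option, but the propagation logic is the same.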
      In CEA, the individual iterations for the outcomes of interest (costs and quality-adjusted life-years [QALYs]) are plotted on an incremental cost-effectiveness plane, providing a graphical representation of the joint uncertainty in payoffs (Fig. 2). The spread of the points in the horizontal plane illustrates the uncertainty surrounding the incremental QALYs, while the spread in the vertical plane illustrates the uncertainty surrounding the incremental costs.
      Figure 2Cost-effectiveness plane.
      Technology A involves considerable uncertainty in both the incremental costs (vertical plane) and incremental effects (horizontal plane), but there is no decision uncertainty at the cost-effectiveness threshold shown. Technology B involves decision uncertainty despite less uncertainty in payoffs (a more compact cloud of points) as the joint distribution crosses the cost-effectiveness threshold shown (λ). QALY indicates quality-adjusted life-year.

      Decision Uncertainty

      VOI analysis focuses on the quantification of, and consequences associated with, decision uncertainty. Uncertainty in outcomes does not necessarily translate into decision uncertainty. Assessment of decision uncertainty requires a decision rule with which to compare the payoffs associated with the different options. In CEA, the standard decision rule involves comparing the incremental cost-effectiveness ratio (ICER) with a predefined cost-effectiveness threshold value (λ) specified in terms of cost per additional unit of health outcome (eg, QALY). Decision uncertainty may be minimal or even absent, even when costs and effects are highly uncertain. This would be the case when the entire joint distribution of incremental costs and incremental effects falls on one side of the threshold line (λ) (eg, see technology A in Fig. 2).
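      For example, the adoption decision and its error probability can be read directly off the simulated incremental net monetary benefit. The sketch below uses hypothetical PA output and an illustrative threshold of $50 000 per QALY:

```python
import numpy as np

rng = np.random.default_rng(7)
K = 10_000

# Hypothetical PA output: incremental QALYs and costs of a new technology.
inc_qalys = rng.normal(0.10, 0.05, K)
inc_costs = rng.normal(3000, 1000, K)

lam = 50_000  # cost-effectiveness threshold ($/QALY)

# Incremental net monetary benefit per iteration.
inmb = lam * inc_qalys - inc_costs

adopt = inmb.mean() > 0                  # decision under current evidence
p_error = (inmb < 0).mean() if adopt else (inmb > 0).mean()

print(adopt, round(p_error, 2))  # adopt, with error probability of ~0.23
```

      Note that the error probability alone ignores the consequences of error; VOI analysis weights each error by the payoff forgone.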

      Establishing Whether More Research is Potentially Worthwhile

      In addition to identifying the decision uncertainty, the results of the PA may be used to calculate the expected cost of the uncertainty given current evidence. Given that ideal research would resolve all uncertainty, this expected cost can also be interpreted as the expected value of perfect information (EVPI).
      Once gathered, information would be of value every time a choice is made between the options represented by the decision; as such, the EVPI should be scaled up for the beneficiary population. This population represents those who could potentially benefit from the information. Determining this population involves assessing the current (prevalent) and future (incident) cohorts for the time frame over which the decision would be relevant. Although estimating the future incidence and prevalence of the disease may be done with a reasonable degree of accuracy, determining the effective time horizon over which the decision is expected to be relevant—and thus over which the incidence and prevalence of the population should be calculated—is less straightforward.
      • Wilson E.C.
      A practical guide to value of information analysis.
      Factors that might be considered include time to patent expiry of the technology, time to launch of an in-class substitute therapy, availability of diagnostic and screening tests that could change the size of the beneficiary population, and any anticipated price changes of the technologies. Assessments of these could be based on past empirical evidence or on priors elicited from experts. Published studies tend to present results for a population based on a time horizon of 1 year and then 10 to 20 years; however, there is no clear justification for this. This is a concern because VOI estimates can be highly sensitive to the effective time horizon of the decision.
      • Philips Z.
      • Claxton K.
      • Palmer S.
      The half-life of truth: what are appropriate time horizons for research decisions?.

      Good Practice Recommendation 3

      Justify the effective time horizon chosen and explore the impact of alternative time horizons on the value of information (VOI) results in scenario analyses.
      Note that the benefits of any study would not be realized until the study is completed. Therefore, the beneficiary population, as calculated based on prevalence or incidence, is usually adjusted to reflect the time it will take for a study to finish.
      • Willan A.R.
      • Pinto E.M.
      The value of information and optimal clinical trial designs.
      ,
      • Willan A.
      • Briggs A.
      Power and size determination: the value of information approach.
      Nevertheless, those study participants enrolled in the optimal arm will also receive the benefits of the optimal technology while the study is conducted.
      • McKenna C.
      • Claxton K.
      Addressing adoption and research design decisions simultaneously: the role of value of sample information analysis.
      The impact of this on the VOI depends on the size of the population, relative to the sample size of the study, and is thus much more pronounced in rare diseases.
      Estimating the appropriate size of the beneficiary population, especially where information might be generalizable and hence valuable across multiple jurisdictions,
      • Tuffaha H.W.
      • Gordon L.G.
      • Scuffham P.A.
      Value of information analysis in healthcare: a review of principles and applications.
      and establishing methods to assess the global value of information, are key priority areas for methods research.
      • Wilson E.C.
      A practical guide to value of information analysis.
      ,
      • Eckermann S.
      • Willan A.R.
      Globally optimal trial design for local decision making.
      ,
      • Eckermann S.
      • Willan A.R.
      Optimal global value of information trials: better aligning manufacturer and decision maker interests and enabling feasible risk sharing.

      Good Practice Recommendation 4

      The size of the beneficiary population should be calculated based on the prevalent and/or incident cohorts as appropriate given the decision problem. This should be adjusted for the number of patients to be enrolled in a future study if the reimbursement decision is delayed while more information is gathered because these patients will generally not benefit from the information yielded.
      Because perfect information is not achievable through research with a finite sample size, EVPI cannot establish that research is worthwhile. EVPI can only provide a measure of the expected maximum payoff that could result from research (ie, an explicit expected upper limit on the value of further research that would eliminate all decision uncertainty).
      • Claxton K.
      Value of information analysis.
      This expected maximum value can be compared with the cost of gathering further information to determine whether further research is potentially worthwhile, providing a necessary, but not sufficient, condition for determining the value of further research.
      • Claxton K.
      • Posnett J.
      An economic approach to clinical trial design and research priority-setting.
      If the population EVPI is less than the estimated cost of research, this provides a sufficient condition for establishing that future research is not of value to the decision maker. In this circumstance, the VOI process should stop. If the population EVPI is greater than the estimated cost of research, this provides a necessary, but not sufficient, condition to suggest that research is potentially worthwhile. In this circumstance, the VOI process continues to examine the value of more targeted information.

      Good Practice Recommendation 5

      Compare population expected value of perfect information (EVPI) with the expected costs of research to determine if further research is potentially worthwhile. Where the expected costs of research exceed the EVPI, research is not worthwhile and the VOI process should stop.
      Box 4 presents a worked example illustrating how to calculate the EVPI and population EVPI directly from the results of the PA. The appendix (in Supplemental Materials found at https://doi.org/10.1016/j.jval.2020.01.001) contains an intuitive explanation of the value of information.
      EVPI: A Worked Example
      A mathematical definition of EVPI is presented in ISPOR VOI Task Force Report 2.
      • Rothery C.
      • Strong M.
      • Koffijberg H.
      • et al.
      Value of information analytical methods: report 2 of the ISPOR value of information analysis emerging good practices task force.
      EVPI can also be understood by showing how it is calculated from the results of the PA. Table 1 shows 5 iterations from a PA, each of which represents a possible realization of uncertainty relating to the choice between treatments X, Y, and Z. These iterations show the uncertainty in net health benefit (NHB) relating to existing evidence (columns 2, 3, and 4). The cost-effective (optimal) treatment, based on current evidence, for a risk-neutral decision maker is that which generates the highest expected (mean) NHB. This is treatment Y with 13 NHB. The error probability associated with the decision is 60% (iterations 1, 2, and 4, where either X or Z is actually the optimal choice).
      Perfect information would remove all uncertainty, which is equivalent to the decision maker being able to select the optimal treatment in each iteration. This is shown in column 5 for each of the possible realizations of the uncertainty. If the uncertainty resolves as represented by iterations 3 or 5, then Y turns out to be the optimal treatment. Nevertheless, if the uncertainty resolves as represented by iterations 1, 2, or 4, the optimal treatment is either Z (iteration 1) or X (iterations 2 and 4).
      Column 6 presents the NHB associated with the optimal treatment (as identified in column 5) for each iteration. This is equivalent to the maximum NHB for each iteration. Given the uncertainty is not yet resolved, but that each iteration is equally likely to occur, the expected value of the decision with perfect information is calculated as the mean of these maximum NHB (ie, 15 NHB). The EVPI is given by the difference between the expected value of the decision with and without perfect information (ie, by subtracting the mean of column 3 from the mean of column 6). In this example, the value of the decision with perfect information is 15 NHB. Without perfect information, it is 13; hence, the EVPI = 2 NHB.
      Alternatively, the EVPI can be determined by calculating the opportunity loss associated with making a decision based on the current evidence in each iteration (column 7) and then averaging over all iterations. In this example, based on the current evidence, treatment Y is chosen (by a risk-neutral decision maker). Where uncertainty has not led to an incorrect decision (eg, iterations 3 and 5), there is no opportunity loss associated with the current level of evidence. Nevertheless, an incorrect decision was made based on the current evidence in iterations 1, 2, and 4; hence, there is some opportunity loss associated with these iterations. Averaging the opportunity loss over all iterations gives the expected value of perfect information; in this example, the EVPI = 2 NHB (as given earlier).
      If the beneficiary population is estimated at 100 people per annum and the time horizon for the decision is estimated as 5 years, the population EVPI = 943 NHB.
      Note that owing to discounting, this is not exactly equal to 100 people multiplied by 5 years multiplied by 2 NHB.
      Assuming a threshold of $50 000 per QALY, this equates to a population EVPI of $47 million. If the expected cost of research exceeds $47 million, further research will not be of value, but if the expected cost of research is less than $47 million, research is potentially valuable to the decision maker.
      Table 1. Calculating the EVPI from the results of a probabilistic analysis.

      | Column 1: Possible realizations of the uncertainty (ie, how things could turn out) | Column 2: NHB, treatment X | Column 3: NHB, treatment Y | Column 4: NHB, treatment Z | Column 5: Best choice in each iteration | Column 6: NHB achievable with perfect information | Column 7: Opportunity loss from lack of perfect information |
      |---|---|---|---|---|---|---|
      | Iteration 1 | 7 | 12 | 15 | Z | 15 | 3 |
      | Iteration 2 | 16 | 11 | 9 | X | 16 | 5 |
      | Iteration 3 | 9 | 14 | 11 | Y | 14 | 0 |
      | Iteration 4 | 13 | 11 | 10 | X | 13 | 2 |
      | Iteration 5 | 10 | 17 | 15 | Y | 17 | 0 |
      | Mean over all iterations | 11 | 13 | 12 | - | 15 | 2 |

      EVPI indicates expected value of perfect information; NHB, net health benefit.
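      The worked example can be reproduced in a few lines. The sketch below computes the EVPI from the table's iterations in both ways (mean of per-iteration maxima minus maximum of means, and mean opportunity loss) and then scales it to the beneficiary population; the 3% annual discount rate is an assumption, as the example does not state the rate used:

```python
import numpy as np

# NHB per PA iteration for treatments X, Y, and Z (columns 2-4).
nhb = np.array([
    [ 7, 12, 15],   # iteration 1
    [16, 11,  9],   # iteration 2
    [ 9, 14, 11],   # iteration 3
    [13, 11, 10],   # iteration 4
    [10, 17, 15],   # iteration 5
])

value_current = nhb.mean(axis=0).max()   # choose Y on current evidence: 13
value_perfect = nhb.max(axis=1).mean()   # mean of per-iteration maxima: 15
evpi = value_perfect - value_current
print(evpi)  # 2.0

# Equivalent calculation: mean opportunity loss of the current choice (Y).
chosen = nhb.mean(axis=0).argmax()
print((nhb.max(axis=1) - nhb[:, chosen]).mean())  # 2.0

# Scale to the beneficiary population: 100 people/year over 5 years,
# discounted (here at an assumed 3% per year).
pop_evpi = sum(evpi * 100 / 1.03 ** t for t in range(5))
print(round(pop_evpi))                 # 943 NHB
print(round(pop_evpi * 50_000 / 1e6))  # 47 ($ million at lambda = $50 000/QALY)
```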

      Identifying Parameters Where Further Research is Most Valuable

      Where further research appears potentially worthwhile based on the population EVPI, the next step is to identify which particular aspects of a decision problem are potentially worth studying to resolve the uncertainty surrounding them. This can be done by estimating the expected value of partial perfect information (EVPPI) for specific (groups of) parameters and comparing the population EVPPI to the expected costs of research. Wherever new information is expected to be informative for a group of parameters, EVPPI should be calculated for the group rather than calculated separately for the individual parameters within the group because EVPPI is typically not additive.

      Good Practice Recommendation 6

      Expected value of partial perfect information (EVPPI) should be undertaken for groups of parameters where it is likely that a new study (or studies) would be informative for the whole group rather than for individual parameters.
      Calculation of EVPPI has traditionally involved a nested double-loop Monte Carlo sampling scheme. First, a value is sampled for the (group of) parameter(s) of interest (ie, those for which uncertainty is to be resolved [outer loop]). Then, a PA is undertaken with the parameter(s) fixed at the sampled value(s) whereas all other parameters vary as before (inner loop). For each outer loop, given the sampled value(s) for the parameter(s) of interest, the optimal decision (ie, the technology associated with the maximum expected NB) is identified. The process is repeated for different sample values for the parameter(s) of interest (outer loops), each time identifying the optimal decision and the maximum expected NB associated with the choice. Averaging the maximum expected NB over all of the outer loops provides the expected value of the decision with perfect information for the parameter(s) of interest. Subtracting the expected NB with current information (as calculated for EVPI) gives the EVPPI.
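      A minimal sketch of the double loop for a toy model is shown below. All numbers are hypothetical; in a real analysis, the inb function would be replaced by the full decision model and the parameter of interest could be a group of parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 50_000                      # $/QALY threshold (illustrative)
N_OUTER, N_INNER = 2000, 200

# Toy model (hypothetical): incremental NB of a new treatment depends on
# effectiveness theta (parameter of interest) and cost c (all others).
def inb(theta, c):
    return lam * theta - c

def draw_theta(n): return rng.normal(0.10, 0.05, n)
def draw_cost(n):  return rng.normal(3000, 1000, n)

# Expected NB of the decision under current information (the comparator
# has incremental NB of 0 by definition).
nb_current = max(0.0, inb(draw_theta(100_000), draw_cost(100_000)).mean())

# Nested double loop: the outer loop fixes theta (perfect information
# about it); the inner loop is a PA over the remaining parameters.
outer_max = []
for theta in draw_theta(N_OUTER):
    inner_mean = inb(theta, draw_cost(N_INNER)).mean()
    outer_max.append(max(0.0, inner_mean))   # best option given theta
evppi_theta = np.mean(outer_max) - nb_current

print(round(evppi_theta))   # > 0: uncertainty in theta drives the decision
```

      In this toy example the cost parameter contributes little to decision uncertainty, so its EVPPI would be close to zero even though it is uncertain, illustrating the point that EVPPI is high only where parameter uncertainty drives decision uncertainty.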
      Further details of the calculation of EVPPI, including a range of methods to simplify the process, are described in Report 2 of the ISPOR VOI Task Force.
      • Rothery C.
      • Strong M.
      • Koffijberg H.
      • et al.
      Value of information analytical methods: report 2 of the ISPOR value of information analysis emerging good practices task force.
      Note that the parameters with the highest EVPPI will not necessarily correspond to the parameters that are most uncertain. EVPPI will only be high for those parameters for which parameter uncertainty drives decision uncertainty. There will only be value associated with reducing uncertainty for parameter(s) where this may change the decision, and hence, the decision payoffs. Eliminating uncertainty in a very uncertain but unimportant parameter (ie, one that does not impact the decision) will have no value to the decision maker.
      As with the EVPI, the population EVPPI provides an expected upper bound on the value of additional research for specific (groups of) parameters.

      Good Practice Recommendation 7

      Estimates of population EVPPI should be compared with the expected costs of research on specific (groups of) parameters to determine whether research is potentially valuable.

      Estimating the Value of Specific Research

      If further research for specific (groups of) parameters appears potentially worthwhile based on the population EVPPI, the next step is to determine whether specific research is worthwhile. This involves establishing that the population expected value of sample information (EVSI) exceeds the expected cost of undertaking specific research.
      Determining the EVSI involves determining the reduction in the expected costs of uncertainty associated with specific research. This will depend on the “informativeness” of the research (ie, the extent to which uncertainty and the associated consequences are reduced by the information provided by the research). This is a function of the design of the research study, including the sample size and allocation, length of follow-up, and the endpoints of interest. These factors will also impact the cost of research. For example, further research on relative treatment effects may require a randomized controlled trial, whereas observational studies might be suitable for other parameters (such as cost or health-related quality of life associated with particular clinical events).
      The process of identifying the optimal research design requires the ability to appropriately estimate the costs associated with specific research designs.
      • Welton N.J.
      • Madan J.J.
      • Caldwell D.M.
      • Peters T.J.
      • Ades A.E.
      Expected value of sample information for multi-arm cluster randomized trials with binary outcomes.
      • Hind D.
      • Reeves B.C.
      • Bathers S.
      • et al.
      Comparative costs and activity from a sample of UK clinical trials units.
      • van Asselt T.
      • Ramaekers B.
      • Corro Ramos I.
      • et al.
      Research costs investigated: a study into the budgets of Dutch publicly funded drug-related research.
      These expected costs include 3 components: (1) fixed costs (eg, set-up costs, salaries), (2) variable costs per study participant, and (3) the opportunity cost for those participants who receive the technology that is expected to be inferior while the study is underway.
      • Eckermann S.
      • Karnon J.
      • Willan A.R.
      The value of value of information best informing research design and prioritization using current methods.
      ,
      • McKenna C.
      • Claxton K.
      Addressing adoption and research design decisions simultaneously: the role of value of sample information analysis.
      The total cost is commonly determined from a societal perspective. Nevertheless, it may also be considered from the perspective of the sponsor of the study.
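      A simple decomposition of these 3 cost components, with entirely illustrative figures, might be:

```python
# Expected total cost of a proposed 2-arm trial, decomposed into the
# three components above (all figures hypothetical).
def research_cost(n_per_arm, inb_forgone_per_patient=2000):
    fixed = 500_000                    # (1) set-up costs, salaries
    variable = 4_000 * 2 * n_per_arm   # (2) per-participant costs, 2 arms
    # (3) opportunity cost: participants allocated to the arm expected
    # to be inferior forgo the expected incremental net benefit.
    opportunity = n_per_arm * inb_forgone_per_patient
    return fixed + variable + opportunity

print(research_cost(200))  # 500000 + 1600000 + 400000 = 2500000
```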
      EVSI is traditionally calculated using a nested double-loop Monte Carlo process similar to that used for determining EVPPI. Detailed methods to calculate the EVSI are described in Report 2 of the ISPOR VOI Task Force
      • Rothery C.
      • Strong M.
      • Koffijberg H.
      • et al.
      Value of information analytical methods: report 2 of the ISPOR value of information analysis emerging good practices task force.
      along with methods to simplify the process in order to reduce computational load.
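      To convey the idea (not the Report 2 methods themselves), the sketch below estimates EVSI by simulation for a normal-conjugate toy problem with hypothetical prior and sampling parameters: possible “truths” are drawn from the prior, a trial result is simulated for each, and the decision is re-made on the Bayesian-updated (posterior) mean:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical prior on the per-patient incremental net benefit (INB),
# and the between-patient SD of observed INB in a trial.
mu0, sigma0 = 2000.0, 2500.0
sigma_ind = 20_000.0

def evsi(n, sims=100_000):
    """Simulated EVSI of a study measuring INB on n patients."""
    theta = rng.normal(mu0, sigma0, sims)              # possible truths
    xbar = rng.normal(theta, sigma_ind / np.sqrt(n))   # simulated results
    # Conjugate normal update: precision-weighted posterior mean.
    w = (n / sigma_ind**2) / (n / sigma_ind**2 + 1 / sigma0**2)
    post_mean = w * xbar + (1 - w) * mu0
    # Value of deciding with the sample vs with current evidence alone.
    return np.maximum(post_mean, 0).mean() - max(mu0, 0.0)

# EVSI rises with sample size but stays below the EVPI (~300 here).
print(round(evsi(100)), round(evsi(1000)))
```

      The diminishing returns with sample size are what make a finite, rather than ever-larger, study optimal once research costs are taken into account.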

      Good Practice Recommendation 8

      Compare population expected value of sample information (EVSI), for the proposed study design, to the expected costs of the study to determine if the specific study is worthwhile.

      Identifying the Appropriate Research Design

      The difference between the population EVSI for a specific study and its expected total cost gives the net payoff or expected net benefit of sampling (ENBS) associated with the specific research design. A positive ENBS indicates that the benefits (population EVSI) associated with a specific research study are expected to outweigh the costs, and hence that the research is worthwhile. This provides a sufficient condition regarding the worth of a specific research study. A negative ENBS indicates that a research study of a specific size and design would not be worthwhile. In this circumstance, a redesigned study (eg, with fewer participants or shorter follow-up) might improve the ENBS and turn out to be worthwhile. EVSI and ENBS should be calculated for a range of study designs (in terms of sample size and allocation, length of follow-up, endpoints of interest, etc) to identify the most efficient design (ie, the design that generates the maximum ENBS).
      • Neumann P.J.
      • Ganiats T.G.
      • Russell L.B.
      • Sanders G.D.
      • Siegel J.E.
      Cost-Effectiveness in Health and Medicine.
      When no design has a positive ENBS, the current available evidence should be considered sufficient for decision making.
      • Willan A.R.
      • Goeree R.
      • Boutis K.
      Value of information methods for planning and analyzing clinical studies to optimize decision making and planning.
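      The search for the efficient design can be sketched by tabulating ENBS over candidate sample sizes and picking the maximum. All inputs below are hypothetical: the EVSI function is a simple normal-conjugate simulation and the cost function is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical inputs: prior per-patient INB (mean, SD), between-patient
# SD of observed INB, and the discounted beneficiary population.
mu0, sigma0, sigma_ind = 2000.0, 2500.0, 20_000.0
population = 943

def evsi_per_person(n, sims=100_000):   # normal-conjugate EVSI sketch
    theta = rng.normal(mu0, sigma0, sims)
    xbar = rng.normal(theta, sigma_ind / np.sqrt(n))
    w = (n / sigma_ind**2) / (n / sigma_ind**2 + 1 / sigma0**2)
    post_mean = w * xbar + (1 - w) * mu0
    return np.maximum(post_mean, 0).mean() - max(mu0, 0.0)

def trial_cost(n):                      # fixed + variable (illustrative)
    return 50_000 + 300.0 * n

# ENBS = population EVSI - expected study cost, over candidate designs.
designs = [50, 100, 200, 400, 800]
enbs = {n: population * evsi_per_person(n) - trial_cost(n) for n in designs}
best = max(enbs, key=enbs.get)
print(best, round(enbs[best]))  # the most efficient design maximizes ENBS
```

      In this toy setting small studies leave too much uncertainty unresolved while very large studies cost more than the extra information is worth, so an intermediate sample size maximizes ENBS; if no design had a positive ENBS, current evidence would be considered sufficient.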

      Good Practice Recommendation 9

      Identify the most efficient study design as that with the greatest expected net benefit of sampling (ENBS).

      Reporting and Interpreting Results

      In CEA, the cost-effectiveness acceptability curve provides a graphical summary of the uncertainty in cost-effectiveness associated with each of the technologies being considered, while the cost-effectiveness acceptability frontier presents the decision uncertainty for a range of values of the cost-effectiveness threshold.
      Similarly, the results of VOI analyses should be presented graphically for a range of values of the cost-effectiveness threshold. Where there are explicit thresholds of interest, results of VOI (individual and population level) that correspond to these thresholds should be emphasized in text, tables, or figures. Where EVSI and ENBS calculations are undertaken, these results should be presented for the different research designs considered for a range of values of the cost-effectiveness threshold.
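As a rough illustration of presenting VOI results across threshold values, the sketch below computes per-person EVPI from simulated probabilistic-analysis output for two strategies at several cost-effectiveness thresholds. The payoff distributions are assumptions made for the sketch, not data from any study:

```python
import random

random.seed(1)
N_SIM = 20_000

# Illustrative probabilistic-analysis output for two strategies: simulated
# lifetime (QALYs, costs) per person. All distributions are assumptions.
sims = [((random.gauss(8.0, 0.5), random.gauss(20e3, 2e3)),
         (random.gauss(8.3, 0.7), random.gauss(27e3, 3e3)))
        for _ in range(N_SIM)]

def evpi_per_person(threshold):
    """EVPI = E[max strategy NB] - max over strategies of E[NB]."""
    nbs = [[threshold * q - c for q, c in sim] for sim in sims]
    e_max = sum(max(nb) for nb in nbs) / N_SIM
    max_e = max(sum(nb[j] for nb in nbs) / N_SIM for j in range(2))
    return e_max - max_e

# Report EVPI for a range of cost-effectiveness thresholds, as recommended.
for lam in (10_000, 20_000, 30_000, 50_000):
    print(f"threshold {lam:>6}: per-person EVPI = {evpi_per_person(lam):,.0f}")
```

By construction the empirical EVPI is nonnegative at every threshold, and tabulating it across thresholds (or plotting it, analogous to a cost-effectiveness acceptability curve) shows where decision uncertainty, and hence the value of further research, is greatest.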
      All assumptions made during the analysis regarding the policy options available to the decision maker should be clearly stated.
      Results of VOI analysis should be used to guide decision making under uncertainty, with the understanding that a positive EVPI or EVPPI is a necessary, but not sufficient, prerequisite to decide that further research is potentially valuable.

      Other Considerations and Potential Challenges

      Additional Considerations for Decision Makers

      This ISPOR VOI Task Force Report has described how to generate and use the results of a VOI analysis for reimbursement and research prioritization decisions based on the premise of efficiency (ie, the decision maker wants to maximize health, through reimbursement and research, from a limited budget) and an assumption of risk-neutrality. The assumption of risk-neutrality has been challenged by some, even for a population-level decision maker, and alternative decision criteria that account for the risk aversion of a decision maker may be applied for adoption decisions in some circumstances.
      In addition, Koffijberg et al demonstrated that, where decision makers have additional criteria of interest (eg, public opinion, ethical issues, and budget constraints), this can substantially impact the value of information.
      In these circumstances, the standard outputs from a VOI analysis may not accurately reflect the value of information to the decision maker and should serve as a valuable, but not sole, input to the overall decision-making process.
      Other factors with potential relevance to decision making include: (1) the likelihood that research will be undertaken if the technology is widely reimbursed rather than being funded only in the context of research, (2) the extent of irreversible costs incurred in delivering a new technology, (3) whether other information of relevance is likely to emerge over time, and (4) the size of the beneficial population for rare diseases. These issues are dealt with in more detail in Report 2 of the ISPOR VOI Task Force
      and in the 2nd Panel on Cost-Effectiveness.

      Alternatives to VOI

Other approaches that have been used to assess the value of undertaking further research include: (1) implicit approaches in which experts express their opinions on the importance of different evaluative research, (2) assessments of burden of disease to identify priority topics, (3) focusing research on areas with large variations in practice, and (4) payback methods that determine the prospective payback from research in terms of the improvements in payoffs that accrue from changes in clinical practice assumed to follow the results of that research.
      The payback approach is similar to VOI in estimating the value of research in terms of improved payoffs; however, with payback methods, it is not possible to identify the extent to which the payoffs are improved through the research per se, as opposed to the resulting change in implementation.
      This is important because research is not necessarily the most efficient or only way to change clinical practice.

      Information Versus Implementation

      One of the assumptions of VOI analysis is that clinical practice aligns perfectly with decision making (ie, all clinicians will implement an option when this is the optimal choice on the basis of expected payoffs, even in the absence of a statistically significant clinical effect size).
In reality, implementation is often imperfect for a variety of reasons, including differing perspectives, misaligned incentives, or asymmetries of information. Imperfect implementation of cost-effective technologies reduces the efficiency of the healthcare system. As a result, there has been growing interest in implementation strategies to improve uptake and adherence among practitioners.
      Fenwick et al present a framework, similar to the VOI framework, to formally assess the value of strategies to improve implementation.
      The value of implementation approach differs from the payback methods by providing a framework that allows the separate, but linked, decisions regarding investment in research and investment in implementation activities to be made simultaneously.
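A minimal sketch of this kind of decomposition, with all inputs assumed purely for illustration, separates the population value of improving implementation from the population value of acquiring further information:

```python
# Sketch of a value-of-implementation vs value-of-information decomposition in
# the spirit of the Fenwick et al framework. All numbers are illustrative
# assumptions, not results from the cited studies.

per_person_evpi = 150.0   # value of eliminating decision uncertainty (assumed)
inb_optimal = 400.0       # expected incremental net benefit of the optimal option (assumed)
current_uptake = 0.55     # share of patients currently receiving that option (assumed)
population = 20_000

# Value of perfect implementation: the payoff forgone because (1 - uptake) of
# the population does not receive the expected-payoff-optimal option.
value_of_implementation = (1 - current_uptake) * inb_optimal * population
value_of_information = per_person_evpi * population

print(f"population value of implementation: {value_of_implementation:,.0f}")
print(f"population value of information:    {value_of_information:,.0f}")
```

Comparing the two quantities indicates whether scarce resources are better directed at research to reduce uncertainty or at activities to improve uptake of what is already known to be optimal in expectation.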

      Limitations of VOI

      The extent to which any VOI analysis is sufficient to address the question of further research is conditional on the model and on the specification of parameter uncertainty. As such, it is imperative that the uncertainty in the current evidence base is appropriately considered and included in the PA. Nevertheless, it is impossible for any assessment, no matter how carefully undertaken, to assess and incorporate unknown unknowns. As such, there will always be occasions when new evidence, which may not have been identified as required, causes a paradigm shift. This should not prevent the use of VOI analysis but rather encourage creativity when developing models and assessing uncertainty.

      Potential Challenges

      This report has focused primarily on the processes for pharmaceuticals, where developing decision-analytic models is often required to comply with the process of reimbursement or price negotiation (eg, in the UK, The Netherlands, and France). When a decision-analytic model is not available, or feasible, at the time of funding (eg, as in nonpharmaceutical interventions and public health programs), a “rapid VOI” or minimal modeling approach may be considered. This allows rapid estimation of the value of further research. For further details, see the second report from the ISPOR VOI Task Force.
In situations where the payers and the funders of possible additional research are not one and the same, close collaboration will be required to undertake VOI analysis and to implement its results.

      Conclusions

This first report of the ISPOR VOI Task Force demonstrates the importance of VOI to decision makers, introducing readers to the concepts of VOI analysis and outlining decisions that can be supported by VOI analysis. It also provides an overview of the steps required to conduct a VOI analysis and shows how the results should be calculated, used, and interpreted. Box 5 provides a summary of the good practice recommendations for conducting and reviewing VOI analyses that were presented throughout this report. Report 2 of the ISPOR VOI Task Force provides guidance on implementation of VOI analysis, with step-by-step algorithms, information about efficient computational approaches, and details of available software.
      ISPOR Value of Information Analysis Task Force Report’s Good Practice Recommendations for Conducting and Reporting a VOI Analysis
      • 1.
        Probability distributions should be assigned to all uncertain parameters to reflect the evidence base.
      • 2.
        Use probabilistic analysis (PA), which accounts for uncertainty in parameters simultaneously, for an appropriate quantitative assessment of payoffs and associated uncertainty.
      • 3.
        Justify the effective time horizon chosen and explore the impact of alternative time horizons on the value of information (VOI) results in scenario analyses.
      • 4.
        The size of the beneficiary population should be calculated based on the prevalent and/or incident cohorts as appropriate given the decision problem. This should be adjusted for the number of patients to be enrolled in a future study if the reimbursement decision is delayed while more information is gathered because these patients will generally not benefit from the information yielded.
      • 5.
        Compare population expected value of perfect information (EVPI) with the expected costs of research to determine whether further research is potentially worthwhile. Where the expected costs of research exceed the EVPI, research is not worthwhile and the VOI process should stop.
      • 6.
        Expected value of partial perfect information (EVPPI) should be undertaken for groups of parameters where it is likely that a new study (or studies) would be informative for the whole group rather than for individual parameters.
      • 7.
        Estimates of population EVPPI should be compared with the expected costs of research on specific (groups of) parameters to determine whether research is potentially valuable.
      • 8.
        Compare population expected value of sample information (EVSI), for the proposed study design, with the expected costs of the study to determine if the specific study is worthwhile.
      • 9.
        Identify the most efficient study design as that with the greatest expected net benefit of sampling (ENBS).
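Recommendations 4 and 5 above can be sketched numerically as follows; the discounted incident cohort stands in for the beneficiary population, and all inputs are illustrative assumptions:

```python
# Sketch of recommendations 4-5: scale per-person EVPI to the beneficiary
# population (discounted incident cohort over the effective time horizon,
# minus patients to be enrolled in the future study, who will generally not
# benefit from its findings) and compare with the expected cost of research.
# All inputs are illustrative assumptions.

per_person_evpi = 120.0      # from the probabilistic analysis (assumed)
incidence_per_year = 5_000
time_horizon_years = 10      # effective lifetime of the decision (assumed)
discount_rate = 0.035
study_enrolment = 1_000      # participants who will not benefit from the results

discounted_population = sum(
    incidence_per_year / (1 + discount_rate) ** t for t in range(time_horizon_years)
)
effective_population = discounted_population - study_enrolment

population_evpi = per_person_evpi * effective_population
research_cost = 2.5e6

print(f"population EVPI: {population_evpi:,.0f}")
print("further research potentially worthwhile" if population_evpi > research_cost
      else "research not worthwhile; stop the VOI process")
```

Note that a population EVPI exceeding the research cost is only a necessary condition; the subsequent EVPPI, EVSI, and ENBS steps determine whether any specific study is actually worth commissioning.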

      Acknowledgments

      The co-authors thank all those who commented orally during four task force workshop and forum presentations at ISPOR conferences in the US and Europe in 2017 and 2018, as well as those who attended the VOI short courses. The co-authors gratefully acknowledge the following 22 reviewers who generously contributed their time and expertise through submission of written comments on the task force reports: Renee Allard, Gianluca Baio, Alan Brennan, Talitha Feenstra, Aline Gauthier, Sean Gavin, Amer Hayat, Anna Heath, Chris Jackson, Jennifer Kibicho, Joanna Leśniowska, Ka Keat Lim, Amr Makady, Brett McQueen, Celine Pribil, Bram Ramaekers, Stephane Regnier, Josh Roth, Haitham Tuffaha, Remon van den Broek, Rick Vreman, Nicky Welton. Many thanks to Elizabeth Molsen-David at ISPOR for her continuous support from start to finish. Thanks also to Michael Mersky for his contribution with the design of Figure 1 and Rebecca Shaw for editorial support.
      Financial disclosures: All authors volunteered their time for participation in this task force. AB would like to acknowledge support from NIH-NHLBI research grant (R01 HL126804) for his time. No other authors received financial support for their participation.
      Responsibility: Please note that the opinions expressed in this paper represent those of the authors and do not necessarily reflect those of their employers.

      Supplemental Materials

      References

        • Steuten L.M.G.
        • Van de Wetering G.
        • Groothuis-Oudshoorn K.
        • Retèl V.
        A systematic and critical review of the evolving methods and applications of value of information in academia and practice.
        Pharmacoeconomics. 2017; 31: 25-48
        • Bindels J.
        • Ramaekers B.
        • Ramos I.C.
        • et al.
        Use of value of information in healthcare decision making: exploring multiple perspectives.
        Pharmacoeconomics. 2016; 34: 315-322
        • Claxton K.
        • Eggington S.
        • Ginnelly L.
        • et al.
        A pilot study of value of information analysis to support research recommendations for NICE.
        (CHE Research Paper 4)
        • Rothery C.
        • Strong M.
        • Koffijberg H.
        • et al.
        Value of information analytical methods: report 2 of the ISPOR value of information analysis emerging good practices task force.
        Due for publication, 2020
        • Schlaifer R.
        Probability and Statistics for Business Decisions: An Introduction to Managerial Economics Under Uncertainty.
        McGraw-Hill, New York, NY1959
        • Raiffa H.
        • Schlaifer R.O.
        Applied Statistical Decision Theory.
        Harvard University Press, Cambridge, MA1961
        • Howard R.A.
        Information value theory.
        IEEE Transact Systems Science and Cybernetics. 1966; 2: 122-126
        • Raiffa H.
        Decision Analysis: Introductory Lectures on Choices Under Uncertainty.
        Addison-Wesley, New York, NY1968
        • Stigler G.J.
        The economics of information.
        J Polit Econ. 1961; 69: 213-225
        • Claxton K.
        The irrelevance of inference: a decision making approach to the stochastic evaluation of health care technologies.
        J Health Econ. 1999; 18: 341-364
        • Meltzer D.
        Addressing uncertainty in medical cost-effectiveness analysis. Implications of expected utility maximisation for methods to perform sensitivity analysis and the use of cost-effectiveness analysis to set priorities for medical research.
        J Health Econ. 2001; 20: 109-129
        • Claxton K.P.
        • Sculpher M.J.
        Using value of information analysis to prioritise health research: some lessons from recent UK experience.
        Pharmacoeconomics. 2006; 24: 1055-1068
        • Colbourn T.
        • Asseburg C.
        • Bojke L.
        • et al.
        Prenatal screening and treatment strategies to prevent group B streptococcal and other bacterial infections in early infancy: cost-effectiveness and expected value of information analyses.
        Health Technol Assess. 2007; 11: 1-226
        • Hassan C.
        • Hunink M.G.
        • Laghi A.
        • et al.
        Value-of-information analysis to guide future research in colorectal cancer screening.
        Radiology. 2009; 253: 745-752
        • Basu A.
        • Meltzer D.
        Value of information on preference heterogeneity and individualized care.
        Med Decis Making. 2007; 27: 112-127
        • Basu A.
        • Carlson J.J.
        • Veenstra D.L.
        A framework for prioritizing research investments in precision medicine.
        Med Decis Making. 2016; 36: 567-580
        • Breeze P.
        • Brennan A.
        Valuing trial designs from a pharmaceutical perspective using value based pricing.
        Health Econ. 2015; 24: 1468-1482
        • Tuffaha H.W.
        • Roberts S.
        • Chaboyer W.
        • Gordon L.G.
        • Scuffham P.A.
        Cost-effectiveness and value of information analysis of nutritional support for preventing pressure ulcers in high-risk patients: implement now, research later.
        Appl Health Econ Health Policy. 2015; 13: 167-179
        • Eeren H.V.
        • Schawo S.J.
        • Scholte R.H.J.
        • Busschbach J.J.V.
        • Hakkaart L.
        Value of information analysis applied to the economic evaluation of interventions aimed at reducing juvenile delinquency: an illustration.
        PLoS ONE. 2015; 10
        • Conti S.
        • Claxton K.
        Dimensions of design space: a decision-theoretic approach to optimal research design.
        Med Decis Making. 2009; 29: 643-660
        • Griffin S.
        • Welton N.J.
        • Claxton K.
        Exploring the research decision space: the expected value of information for sequential research designs.
        Med Decis Making. 2010; 30: 155-162
        • Fenwick E.
        • Claxton K.
        • Sculpher M.J.
        The value of implementation and the value of information: combined and uneven development.
        Med Decis Making. 2008; 28: 21-32
        • Grimm S.
        • Strong M.
        • Brennan A.
        • Wailoo A.
        Framework for analyzing risk in health technology assessments and its application to managed entry agreements.
        (Report by the Decision Support Unit)
        • Grimm S.
        • Strong M.
        • Brennan A.
        • Wailoo A.
        The HTA risk analysis chart: visualizing the need for and potential value of managed entry agreements in health technology assessment.
        Pharmacoeconomics. 2017; 35: 1287-1296
        • Bansal A.
        • Basu A.
        Value of information methods for optimal timing of biomarker collection.
        Value Health. 2016; : A1-A318
        • Sculpher M.
        Subgroups and heterogeneity in cost-effectiveness analysis.
        Pharmacoeconomics. 2008; 26: 799-806
        • van Gestel A.
        • Grutters J.
        • Schouten J.
        • et al.
        The role of the expected value of individualized care in cost-effectiveness analyses and decision making.
        Value Health. 2012; 15: 13-21
        • Espinoza M.A.
        • Manca A.
        • Claxton K.
        • Sculpher M.J.
        The value of heterogeneity for cost-effectiveness subgroup analysis.
        Med Decis Making. 2014; 34: 951-964
        • Garrison L.P.
        • Towse A.
        • Briggs A.
        • et al.
        Performance-based risk-sharing arrangements-good practices for design, implementation, and evaluation: report of the ISPOR good practices for performance-based risk-sharing arrangements task force.
        Value Health. 2013; 16: 703-719
        • Bennette C.S.
        • Veenstra D.L.
        • Basu A.
        • Baker L.H.
        • Ramsey S.D.
        • Carlson J.J.
        Development and evaluation of an approach to using value of information analyses for real-time prioritization decisions within SWOG, a large cancer clinical trials cooperative group.
        Med Decis Making. 2016; 36: 641-651
        • Tuffaha H.W.
        • Gordon L.G.
        • Scuffham P.A.
        Value of information analysis informing adoption and research decisions in a portfolio of health care interventions.
        MDM Policy & Practice. 2016; 1
        • Hoomans T.
        • Seidenfeld J.
        • Basu A.
        • Meltzer D.
        Systematizing the use of value of information analysis in prioritizing systematic reviews.
        Agency for Healthcare Research and Quality (US), Rockville (MD)2012 Aug (Report No.: 12-EHC109-EF)
        • Nixon R.M.
        • O'Hagan A.
        • Oakley J.
        • et al.
        The rheumatoid arthritis drug development model: a case study in Bayesian clinical trial simulation.
        Pharm Stat. 2009; 8: 371-389
        • Thorn J.
        • Coast J.
        • Andronis L.
        Interpretation of the expected value of perfect information and research recommendations: a systematic review and empirical investigation.
        Med Decis Making. 2016; 36: 285-295
        • Koffijberg H.
        • Rothery C.
        • Chalkidou K.
        • Grutters J.
        Value of information choices that influence estimates: a systematic review of prevailing considerations.
        Med Decis Making. 2018; 38: 888-900
        • Briggs A.H.
        • Sculpher M.J.
        • Claxton K.
        Decision Modelling for Health Economic Evaluation.
        Oxford University Press, Oxford, UK2006
        • Tuffaha H.W.
        • Gordon L.G.
        • Scuffham P.A.
        Value of information analysis in healthcare: a review of principles and applications.
        J Med Econ. 2014; 17: 377-388
        • Eckermann S.
        • Karnon J.
        • Willan A.R.
        The value of value of information best informing research design and prioritization using current methods.
        Pharmacoeconomics. 2010; 28: 699-709
        • Claxton K.
        • Ginnelly L.
        • Sculpher M.J.
        • Philips Z.
        • Palmer S.
        A pilot study on the use of decision theory and value of information analysis as part of the NHS Health Technology Assessment programme.
        Health Technol Assess. 2004; 8: 1-103
        • Myers E.
        • McBroom A.J.
        • Shen L.
        • Posey R.E.
        • Gray R.
        • Sanders G.D.
        Value-of-information analysis for patient-centered outcomes research prioritization. Duke Evidence-based Practice Center Durham.
        • Carlson J.J.
        • Thariani R.
        • Roth J.
        • et al.
        Value of information analysis within a stakeholder-driven research prioritization process in a US setting: an application in cancer genomics.
        Med Decis Making. 2013; 33: 463-471
        • Sanders G.D.
        • Basu A.
        • Myers E.
        • Meltzer D.
        Potential value of an aspirin-dose trial for secondary prevention of coronary artery disease: informing PCORI and future trial design.
        Circulation. 2016; 134: A20405
        • Chalkidou K.
        • Lord J.
        • Fischer A.
        • Littlejohns P.
        Evidence-based decision making: when should we wait for more information?.
        Health Aff. 2008; 27: 1642-1653
        • Claxton K.
        • Palmer S.
        • Longworth L.
        • et al.
        Informing a decision framework for when NICE should recommend the use of health technologies only in the context of an appropriately designed programme of evidence development.
        Health Technol Assess. 2012; 16: 1-323
        • Al M.
        • Bindels J.
        • Ramos I.C.
        • et al.
        Uncertainty and value of information. Guideline for the conduct of economic evaluations in health care, Dutch version.
        • McCabe C.J.
        • Stafinski T.
        • Edlin R.
        • Menon D.
        Access with evidence development schemes: a framework for description and evaluation.
        Pharmacoeconomics. 2010; 28: 143-152
        • Stafinski T.
        • McCabe C.J.
        • Menon D.
        Funding the unfundable: mechanisms for managing uncertainty in decisions on the introduction of new and innovative technologies into healthcare systems.
        Pharmacoeconomics. 2010; 28: 113-142
        • Walker S.
        • Sculpher M.
        • Claxton K.
        • Palmer S.
        Coverage with evidence development, only in research, risk sharing or patient access scheme? A framework for coverage decisions.
        Value Health. 2012; 15: 570-579
        • McKenna C.
        • Soares M.
        • Claxton K.
        • et al.
        Unifying research and reimbursement decisions: case studies demonstrating the sequence of assessment and judgments required.
        Value Health. 2015; 18: 865-875
        • Sculpher M.
        • Drummond M.
        • Buxton M.
        The iterative use of economic evaluation as part of the process of health technology assessment.
        J Health Serv Res Policy. 1997; 2: 26-30
        • Fenwick E.
        • Claxton K.
        • Sculpher M.
        • Briggs A.
        Improving the efficiency and relevance of health technology assessment: the role of iterative decision analytic modelling.
        (CHE Discussion Paper series, no 179)
        • Miller P.
        Role of pharmacoeconomic analysis in R&D decision making. When, where, how?.
        Pharmacoeconomics. 2005; 23: 1-12
        • Mohseninejad L.
        • Feenstra T.
        • van der Horst H.E.
        • Woutersen-Koch H.
        • Buskens E.
        Targeted screening for coeliac disease among irritable bowel syndrome patients: analysis of cost-effectiveness and value of information.
        Eur J Health Econ. 2013; 14: 947-957
        • Mohseninejad L.
        • van Gils C.
        • Uyl-de Groot C.A.
        • Buskens E.
        • Feenstra T.
        Evaluation of patient registries supporting reimbursement decisions: the case of oxaliplatin for treatment of stage III colon cancer.
        Value Health. 2015; 18: 84-90
        • Baird L.G.
        • Banken R.
        • Eichler H.G.
        • et al.
        Accelerated access to innovative medicines for patients in need.
        Clin Pharmacol Ther. 2014; 96: 559-571
        • European Medicines Agency
        Adaptive pathways workshop, report on a meeting with stakeholders held at EMA on Thursday 8 December 2016.
        • Retèl V.P.
        • Grutters J.P.C.
        • van Harten W.H.
        • Joore M.A.
        Value of research and value of development in early assessments of new medical technologies.
        Value Health. 2013; 16: 720-728
        • McKenna C.
        • Claxton K.
        Addressing adoption and research design decisions simultaneously: the role of value of sample information analysis.
        Med Decis Making. 2011; 31: 853-865
        • Briggs A.H.
        • Weinstein M.C.
        • Fenwick E.A.L.
        • Karnon J.
        • Sculpher M.J.
        • Paltiel A.D.
        Model parameter estimation and uncertainty: a report of the ISPOR-SMDM modeling good research practices task force-6.
        Value Health. 2012; 15: 835-842
        • Strong M.
        • Oakley J.E.
        When is a model good enough? Deriving the expected value of model improvement via specifying internal model discrepancies.
        SIAM/ASA J Uncertainty Quantification. 2014; 2: 106-125
        • Ghabri S.
        • Hamers F.F.
        • Josselin J.M.
        Exploring uncertainty in economic evaluations of drugs and medical devices: lessons from the first review of manufacturers’ submissions to the French National Authority for Health.
        Pharmacoeconomics. 2016; 34: 617
        • Drummond M.F.
        • Sculpher M.J.
        • Claxton K.
        • Stoddart G.L.
        • Torrance G.W.
        Methods for The Economic Evaluation of Health Care Programmes.
        Oxford University Press, Oxford, UK2015
        • Grutters J.P.
        • Sculpher M.
        • Briggs A.H.
        • et al.
        Acknowledging patient heterogeneity in economic evaluation: a systematic literature review.
        Pharmacoeconomics. 2013; 31: 111-123
        • Roberts M.
        • Russell L.B.
        • Paltiel A.D.
        • Chambers M.
        • McEwan P.
        • Krahn M.
        Conceptualizing a model: a report of the ISPOR-SMDM modeling good research practices task force-2.
        Value Health. 2012; 15: 804-811
        • O'Hagan A.
        • Buck C.E.
        • Daneshkhah A.
        • et al.
        Uncertain Judgements: Eliciting Experts’ Probabilities.
        Wiley, Chichester, UK2006
        • Oakley J.E.
        Eliciting univariate probability distributions.
        in: Rethinking Risk Measurement and Reporting. 1. Risk Books, London, UK2010
        • Daneshkhah A.
        • Oakley J.E.
        Eliciting multivariate probability distributions.
        in: Rethinking Risk Measurement and Reporting. 1. Risk Books, London, UK2010