By Tina B. Stacy, PharmD, BCOP, CHCP; Kevin Obholz, PhD; June Wasserstrom; Timothy Quill, PhD, Clinical Care Options LLC
Adult learning theory has shown that clinician knowledge, competence, and performance are maintained and improved more effectively when, after exposure to a new concept, that concept is reinforced at regular intervals on an ongoing basis. However, reinforcement-based education that relies on “traditional” methods alone is often not enough for oncology/hematology specialists practicing in a therapeutic landscape in which new data are constantly being published and presented and standards of care change frequently. In 2017 alone, the FDA added 16 new agents to its list of more than 200 approved anticancer drugs, as well as new indications for 28 existing drugs and several new cancer diagnostic tests.
The availability of hundreds of new therapeutic options over the last decade has added significant complexity to oncology treatment decision making. Moreover, caring for patients with cancer in an information-overloaded environment of hundreds of daily emails, peer-reviewed journals, news reports, and often misinformed patients further compounds the challenge and the educational need. Proven educational methods, such as didactic lectures and technical skill training, may on their own be insufficient to change behavior and improve patient care for oncology/hematology specialists practicing in this environment. Indeed, novel educational resources are needed that closely mimic daily practice, promote active learner involvement, build in opportunities for reflection, and directly inform optimal treatment decisions for specific patients.
Clinical Practice Guidelines and Other Compendia
Clinical practice guidelines and other compendia are reliable and familiar resources that include treatment recommendations to help oncologists make evidence-based decisions and translate cutting-edge advances into practice. Clinical practice guidelines have evolved as standard tools to support evidence-based medicine, reduce variability in clinical practice, and improve the quality of oncology care.[2,3] Although they are updated frequently, the standardized structure of oncology guidelines can limit how well their treatment recommendations map to the complex clinical diversity of individual oncology patients. In addition, the recommendations from these respected clinical guidelines provide the standard by which most reimbursement decisions are made.[7,8] Thus, treatment recommendations in clinical guidelines are often as inclusive as possible, and oncology/hematology specialists using them must choose among multiple “reasonable” therapeutic options that, in practice, may be insufficiently tailored to unique patient and disease characteristics.
Educational Solution to the Clinical Challenge: The Interactive Decision Support Tool (IDST)
IDSTs, used in conjunction with “traditional” educational formats, offer a means to narrow the gap between clinical practice guideline recommendations and individualized treatment decision making. To generate significant improvements in clinical decision making, treatment guidance resources must involve multiple topic experts in translating research into practice and must actively offer evidence-justified, patient-specific advice that encourages learners to modify behaviors or increase confidence through reinforcement of effective practice.[8-10] Accordingly, Clinical Care Options (CCO) recognized the need for an innovative approach and developed entirely new software for an extensive series of tumor-specific IDSTs, each authored by a panel of topic experts, to address changing treatment paradigms in oncology and gaps in guideline specificity across a range of tumor types.
How CCO IDSTs Are Developed and Used
Each of CCO’s IDSTs is developed by five topic experts and includes hundreds to thousands of unique case variations based on factors the experts consider important for treatment selection. Each expert then provides a treatment recommendation for every patient case variation in that IDST. To use an IDST, learners enter patient/disease information from pull-down menus and then indicate their planned clinical approach for the case they entered. Once this information is entered, a single treatment recommendation from each of the five experts is displayed for the specific case. Learners are then asked to reflect on these recommendations and indicate whether the expert recommendations will change their originally planned clinical approach.
Our hypothesis was that individualized consensus recommendations (≥ three known and trusted experts recommending the same treatment) would change clinician behavior and subsequent care for the cases entered into the IDST. To optimize learning, our IDSTs were designed according to the following principles of clinical education[11-13]:
- Expert guidance is distilled into accessible, readily usable online and mobile formats.
- Users can access the tool when they are ready to learn (i.e., when they have a challenging case).
- Clinicians using the tool must indicate their intended treatment, which captures trends in current practice, and are given an opportunity to reflect by comparing their planned treatment with the expert recommendations.
- Expert recommendations provide feedback and specific guidance for learners on their practice.
- Assessment following tool use captures and reinforces the impact of expert recommendations on learner intentions to change their practice.
- Ongoing educational needs are pinpointed by comparing learners’ intended treatments with expert consensus recommendations over time.
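The consensus rule described above (≥ three of the five experts recommending the same treatment) can be sketched in a few lines of code. This is purely an illustrative assumption of the logic; the function and variable names are hypothetical and do not represent CCO’s actual software:

```python
# Illustrative sketch of the IDST consensus rule; names are hypothetical,
# not CCO's implementation.
from collections import Counter

CONSENSUS_THRESHOLD = 3  # >= 3 of 5 experts agreeing constitutes consensus


def consensus_recommendation(expert_recommendations):
    """Return the consensus treatment if >= 3 experts agree, else None."""
    treatment, votes = Counter(expert_recommendations).most_common(1)[0]
    return treatment if votes >= CONSENSUS_THRESHOLD else None


def differs_from_consensus(planned_treatment, expert_recommendations):
    """True when a consensus exists and the learner's plan differs from it."""
    consensus = consensus_recommendation(expert_recommendations)
    return consensus is not None and planned_treatment != consensus
```

Under this sketch, a case where three experts recommend regimen A while the learner planned regimen B would be flagged as discordant with consensus, whereas a 2-2-1 split among the experts would yield no consensus at all.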
The Clinical Impact: An IDST Meta-Analysis
To further explore the utility of IDSTs as an educational resource, CCO conducted a meta-analysis of 21 IDSTs, each with treatment recommendations for thousands of case scenarios across multiple disease treatment settings. These 21 distinct IDSTs covered 10 different malignancies or clinical care issues, and each included an analysis designed to measure its effectiveness and clinical impact.
Actively practicing healthcare providers (HCPs) entered 29,286 specific patient cases into the IDSTs. When analyzing clinician confidence in their intended treatment, HCPs reported that they were uncertain how to optimally treat their patient for 12 percent of the cases entered into the IDSTs. Across all disease treatment settings, 3,514 patients were potentially at risk for suboptimal treatment as a result of this uncertainty.
We further examined 11,945 patient cases for which there was a consensus treatment recommendation among the experts (≥ three experts recommended the same treatment for the same patient). For 47 percent of these cases (n = 5,571), the intended treatment of HCPs differed from the expert consensus recommendation, again indicating that these patients were at risk for suboptimal treatment.
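The reported counts and percentages are internally consistent, which can be verified with simple arithmetic (rounding to the nearest whole number is assumed):

```python
# Consistency check of the meta-analysis figures reported above.
total_cases = 29286        # patient cases entered into the IDSTs
uncertain_share = 0.12     # HCPs uncertain of optimal treatment in 12% of cases
at_risk_uncertain = round(total_cases * uncertain_share)
print(at_risk_uncertain)   # 3514 patients at risk from uncertainty

consensus_cases = 11945    # cases with an expert consensus recommendation
discordant_cases = 5571    # HCP plans differing from the consensus
discordant_pct = round(100 * discordant_cases / consensus_cases)
print(discordant_pct)      # 47 (percent of consensus cases discordant)
```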
Our hypothesis was that the use of IDSTs would have an impact on actual clinical practice. As part of the IDST design, we captured tool impact and changes in learners’ treatment planning intentions via an optional survey offered after each tool interaction. In 41 percent of the cases entered across our IDSTs, HCPs reported changing their treatment plan for a specific case in response to the customized expert recommendations they received.
In addition, IDST survey data indicate that approximately 38 percent of HCPs used the tools to get treatment advice for a specific patient in their practice, whereas 62 percent used the IDSTs as an educational resource and entered a hypothetical patient. This finding underscores the power of IDSTs both to support ongoing education and to guide clinical decision making in real-world patient care.
A New Innovative Educational Format
As an educational resource designed to inform “optimal” rather than merely “reasonable” or “on guideline” clinical decision making, the CCO IDST has demonstrated substantial practical and educational value. The educational need among oncology and hematology specialists is considerable: CCO’s meta-analysis indicated that before using the IDSTs, thousands of patients were at risk of receiving suboptimal care. After entering their patient cases into our IDSTs and then reflecting on the treatment recommendations provided, HCPs caring for many of these at-risk patients changed their treatment approach to match that of the experts.
Based on CCO’s analysis, any CME/CPD program targeted to clinicians practicing in a rapidly evolving therapeutic landscape with multiple reasonable treatment options (like oncology) should include elements, tools, and/or resources that provide customized, patient-specific expert guidance, complement other “traditional” learning modalities, and can influence real-time clinical decision making.
References
1. The American Society of Clinical Oncology. The state of cancer care in America, 2017: a report by the American Society of Clinical Oncology. J Oncol Pract. 2017;13:e353-e394.
2. Francke AL, Smit MC, de Veer AJ, et al. Factors influencing the implementation of clinical guidelines for health care professionals: a systematic meta-review. BMC Med Inform Decis Mak. 2008;8:38.
3. Sackett DL, Richardson WS, Rosenberg W, et al. Evidence-based medicine: how to practice and teach EBM. Edinburgh, United Kingdom: Churchill Livingstone; 2000.
4. Cabana MD, Rand CS, Powe NR, et al. Why don’t physicians follow clinical practice guidelines? A framework for improvement. JAMA. 1999;282:1458-1465.
5. Brouwers MC, Makarski J, Garcia K, et al. A mixed methods approach to understand variation in lung cancer practice and the role of guidelines. Implement Sci. 2014;9:36.
6. Phillips LS, Branch WT, Cook CB, et al. Clinical inertia. Ann Intern Med. 2001;135:825-834.
7. Greenhalgh T, Howick J, Maskrey N. Evidence based medicine: a movement in crisis? BMJ. 2014;348:g3725.
8. Fox J, Patkar V, Chronakis I, et al. From practice guidelines to clinical decision support: closing the loop. J R Soc Med. 2009;102:464-473.
9. Veloski J, Boex JR, Grasberger MJ, et al. Systematic review of the literature on assessment, feedback and physicians’ clinical performance: BEME Guide No. 7. Med Teach. 2006;28:117-128.
10. Cantillon P, Sargeant J. Giving feedback in clinical settings. BMJ. 2008;337:a1961.
11. Moulding NT, Silagy CA, Weller DP. A framework for effective management of change in clinical practice: dissemination and implementation of clinical practice guidelines. Qual Health Care. 1999;8:177-183.
12. Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med. 2005;80(10 suppl):S46-S54.
13. Davis DA, Mazmanian PE, Fordis M, et al. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296:1094-1102.