By Annette Schwind, MS, CHCP, and Audrie Tornow, CHCP, Paradigm Medical Communications, LLC
You’ve been there, right? You saw a project on Pinterest or YouTube and thought, “I can do that! I’m an intelligent adult. There are step-by-step directions with video/photos and a list of what I need for the project. I got this.”
Yet somehow, the final outcome is an epic fail. You end up disappointed, frustrated, embarrassed and confused.
Sometimes the first step in learning is to recognize that you don’t know as much as you think you do, or that knowing what to do and actually being able to do it are very different things. These are the very essence of gap analysis and educational needs assessment. But while failing at creative DIYs can provide laughs — mostly for others and, eventually, for yourself — the stakes are higher in medical practice.
Failure is not fun, and neither are its negative consequences. As educators, we need to reframe failure as the first step in a learning process rather than the final step in an achievement process. Thomas Edison’s quest to construct an electric light bulb was not one attempt and immediate success. When asked about failure in the midst of experimenting to reach his goal, he famously stated, “I haven’t failed. I have just found 10,000 ways that won’t work.”
Edison’s perspective is an important foundation for learning but not very practical in the context of treating patients. Clinicians need a safe environment in which to experiment, because 10,000 patient failures are not acceptable. CME/CPD professionals can create interventions that allow healthcare practitioners (HCPs) to “safely fail.” By alerting HCPs to their learning needs, these interventions help them gain the knowledge and competence they lack to improve their clinical performance. Here are some ways to build opportunities to learn from failure into CME/CPD activities:
Case vignettes can be used in a variety of ways within learning interventions (e.g., pre-test and post-test items, illustrations within presentations) or even as the learning intervention itself. These vignettes feature information about a patient and ask learners to make decisions about diagnosis and treatment. The complexity of the cases and how a learner interacts with them can vary from simple to complex, depending on the disease state and the goals of the education. A simple case vignette may outline a patient’s presenting symptoms, relevant history and a few key test results, then ask learners to select the correct diagnosis from a list of options. A complex case vignette can be a series of interactive simulated clinical visits in which each decision takes learners down a path that leads to better or worse outcomes for the virtual patient.
Figure 2. Example of Text-Based Case Vignette with Interactive Question
Figure 3. Example of Video-Based Case Vignette with Patient Portrayal and Branched-Logic Interactive Question
Case vignettes allow HCPs to safely fail because their answers do not affect a real patient. Wrong answers create cognitive dissonance — a discomfort resulting from the discrepancy between what the learner thought they knew to be correct, and what is actually correct. Cognitive dissonance can motivate learning, preparing HCPs to assimilate the information shared in the CME/CPD activity. By reflecting on their choices in comparison with the preferred course of action, assessing the feedback and instruction provided by the experts in the educational activity, and then applying it to the case, learners can engage in an experiential learning cycle without the risk of harm to any actual patients. The cycle can also be repeated using different case vignettes until the learner reaches an acceptable level of mastery.
Case vignettes can be an easy and inexpensive way to integrate learning from failure into CME/CPD interventions. However, more sophisticated case simulations, though they require more time and resources to implement, can offer certain advantages. Incorporating photos or videos puts an actual face to the patient and can heighten the impact of answering correctly or incorrectly, making the education more meaningful. It is easy to dismiss a patient who is just a name followed by text describing their problem. Also, the ability to follow through to the consequences of decisions, whether that be a virtual patient returning for a follow-up visit because ineffective treatment allowed their condition to worsen, or dying from an adverse event caused by a poor medication choice, assigns value to a learner’s choices beyond correct or incorrect, more closely emulating the real-world clinical environment while still keeping the learner safe.
The above examples create safe havens for educational experimentation, but they take place only in the minds of the learners. Other educational designs are needed to move clinical exploration into more practical realms. The second article in this series will examine how hands-on trial and error can motivate and facilitate learning.