By Jack Kues, PhD, FAACEHP, past president of the Alliance, Associate Dean for Continuous Professional Development, University of Cincinnati
The Alliance’s Quality Improvement Education (QIE) initiative has, for years, described the connection between quality improvement and continuing professional development. Other concepts and initiatives have operated in a similar space, including Performance Improvement CME (PICME), Maintenance of Certification (Part 4), Accountable Care Organizations and quality-based reimbursement. As educators, we’re challenged to demonstrate changes in practice and improvements in patient care and outcomes associated with our activities. Our ability to measure long-term impact has been limited by the lack of tools and by the expense of following up with our learners over an extended period.
Recently, investigators at the American Board of Family Medicine reported their strategy and findings related to the participation of their diplomates in MOC Performance in Practice modules (Peterson et al., “Physician Satisfaction With and Practice Changes Resulting From American Board of Family Medicine Maintenance of Certification Performance in Practice Modules,” JCEHP, 2016, 36(1), 55–60). This study provides useful findings and tools that may help educators when developing activities that include QI or PICME components.
The authors based their findings primarily on self-reported data about learners’ satisfaction with the learning activity and their plans for future behavior, drawn from a longitudinal dataset of almost 30,000 learners spanning more than a decade.
Structure of the Evaluation Tool
Learners rated activities on:
- Relevance of the topic to their practice
- Currency of clinical information
- Usefulness of clinical information to practice
- Overall rating
There were also two yes/no questions:
- Do you plan to change care as a result of the PPM (if yes, how)?
- Do you plan to continue conducting quality assessment and improvement activities in your practice?
All of the evaluation metrics received very high positive responses, with minor variations among the clinical topic areas across the different PPMs.
The last question, “Do you plan to continue conducting quality assessment and improvement activities in your practice?”, provides a great adjunct to the traditional “intent to change” assessment. This single question may be our best indication of learners’ self-efficacy related to implementing QI-type changes in their practice. It provides a practical indicator of the perseverance of change and development of an improvement culture within the learner’s practice. These are important dimensions related to the ability of education to have a lasting impact on learners and their practice.
Peterson and colleagues provided a detailed description of their analysis of the qualitative responses, describing what specific changes learners planned to make in their practice. The authors employed a qualitative data analysis software application (MAXQDA 11) to generate word-frequency reports. This method of analyzing qualitative data may not be the most sophisticated for identifying underlying themes, but it could be very useful for uncovering the frequency of general topics or types of comments. I can’t speak to its cost or the learning curve necessary to employ it, but it might be worth a cursory review. In general, the return on investment of this level of analysis becomes viable when you’ve accumulated a sizable amount of data, as was the case with this study.
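For readers who want to try this kind of word-frequency report without commercial software, the general technique can be sketched in a few lines of Python. This is not the authors’ method or MAXQDA’s implementation; the sample responses and the stop-word list are illustrative only.

```python
# Minimal sketch of a word-frequency report over free-text survey
# responses, in the spirit of the analysis described above.
# The responses and stop-word list below are invented for illustration.
import re
from collections import Counter

STOP_WORDS = {"i", "the", "a", "an", "to", "of", "in", "and", "will", "my", "for"}

def word_frequencies(responses, top_n=5):
    """Tokenize responses, drop stop words, and count word occurrences."""
    counts = Counter()
    for text in responses:
        words = re.findall(r"[a-z']+", text.lower())
        counts.update(w for w in words if w not in STOP_WORDS)
    return counts.most_common(top_n)

responses = [
    "I will screen more patients for diabetes",
    "Plan to improve diabetes documentation",
    "Screen patients earlier and track outcomes",
]
print(word_frequencies(responses))
```

A report like this surfaces only the frequency of general topics, not underlying themes; as the article notes, that is the trade-off, and the investment becomes worthwhile mainly at the scale of data this study accumulated.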
One peculiar finding of this study was that, as learners participated in more PPMs, they were less likely to report that they were going to change practice. However, measures of their intention to continue conducting quality assessment and improvement activities did not change. The authors suggest that perhaps practices that develop a culture of improvement make smaller and fewer changes over time because they are generally functioning at a higher level of efficiency.
As healthcare delivery shifts its focus to outcomes and quality, it becomes more important for practitioners to incorporate an improvement mentality into their continuing education activities. We, as educators, will have to adjust to the needs of our learners and the systems in which they function. And, increasingly, we need to measure broader dimensions of our programs’ impact to demonstrate the value of what we do.