By Jacob Coverstone and Erin Schwarz
A continuing conversation with Dion Richetti, vice president of accreditation and recognition, and Teri McCauley, manager of accreditation and recognition systems with the ACCME. Read ACCME PARS Update, Part 1 of 3: Reaccreditation Processes.
How is ACCME helping surveyors, members of the Accreditation Review Committee (ARC) and ACCME staff transition to the new submission process, and how will they be trained to interpret the results?
“Interpret” might not be the best word because we don’t ask surveyors to interpret anything; we ask them to collect data.
Just as we transitioned the system several years ago to a PDF-based system, and then to a combination online/PDF system, we're now going to a completely online system. Just as any new digital platform requires training and getting used to, we anticipate that we will need to support all the providers and all volunteers in the use of these tools — but they're simply tools.
So yes, providers, surveyors, ARC and staff are learning to use the new tools. And we’ve been training providers — demonstrating the enhancements to the accreditation process — starting with the November 2018 decision cohort. We’ve incorporated it into our ongoing self-study webinars, which take place three times a year.
Going forward, is it your expectation that you’ll keep hosting those training webinars for each cohort, or do you envision a point when you’ll ask providers to view something archived on the website?
We’re wondering about your plans for the foreseeable future as you transition folks to this new platform.
That’s a great question, and I certainly wouldn’t make a prediction beyond what we’re planning now. What we’ve found is that this was a way for us to connect with each cohort as they started, and to make that material available on the web. We felt that this was a much more accessible way to give all providers access to the same information and a live opportunity to ask questions. We think that’s great, because you may think that you don’t have a question, but when you hear someone else’s question, you go, “Oh yeah, I wish I’d asked that!”
So, we like doing these webinars, and will continue to have a venue per cohort, but we’ll see what the future holds.
It sounds like “standardization” is the theme we’re getting in all of this. Is that an accurate interpretation?
Our goal has always been to give all providers the same opportunity to demonstrate their CME program through the accreditation process. I think what’s different, in terms of opportunity, is that we’ve created — with this webinar — equal opportunity for all providers to participate in the self-study training without having to travel to Chicago [for the self-study session] and to have it be as specific to that cohort as possible.
You also mentioned before that providing a platform that gives everyone an equal opportunity and standardizes responses makes information a little more consistent. That’s why we noted that standardization seems to be an overarching theme.
I think “simplification” has been the principle behind the ACCME’s evolution for as long as I [Dion] have been associated with the organization. And, we see this as further simplification.
The idea is to reduce burden and shipping costs for providers.
I want to ask more general questions about simplification, and you have had some experience with simplification, so we’re wondering if — through simplifying the process — have you previously encountered any unintended consequences?
For example, the simplification that moved performance in practice to a structured abstract reduced the amount of space available to describe professional practice gaps.
Have you experienced an unintended consequence wherein a provider may not have had sufficient opportunity to provide information crucial to the decision-making process? And, if not, have you thought about anything similar in the shift toward the new, structured way of asking self-study questions?
And lastly, if you later learn that providers aren’t able to paint a full picture of their activity planning through this new process, how might that be remedied?
To answer your first question: no.
If you look at the compliance rates for the last seven to eight years, you’ll see that compliance with Criterion 2 has increased to the point where something like 95 percent of providers are found in compliance with Criteria 2, 3, 5 and 6. So we feel that the simplification of the performance-in-practice process, if anything, has reduced the amount of extraneous information reviewed by surveyors.
Thank you. We’re trying to tie a couple of these threads together to explore where unintended consequences may emerge. Previously, in the self-study process, there was more space to provide information on professional practice gaps.
For example, if you planned a large annual conference around a multi-gap structure, you may not have the ability in that performance-in-practice file to display all of the gaps you have identified. Whether each individual gap needs to be displayed is a question to be interpreted on your [the ACCME’s] end, but, as a provider, I’ve always understood it that we should provide a comprehensive perspective: “Here are the gaps we are supporting.”
With a larger-scale annual conference, we may have identified 30 different gaps, and we’ve created different sessions to address them, and we don’t have some sort of thematic link between the sessions … It seems that the current performance-in-practice structure would not allow for all of that information to be presented.
Up until now, the self-study provided a place for that exposition; it seems like the opportunity to present those examples is being removed from the self-study. In essence, providers may show evidence of adhering to a planning process, but that evidence may now be incomplete. Is there any concern about that?
No, I don’t think we are concerned about that. As you know, the structured abstract has been in place for roughly four years. During that period, you’ll notice that there’s been no change. I think by the time we implemented the structured performance-in-practice abstract, providers had pretty well mastered the concept of developing CME activities that meet the professional practice gaps and underlying educational needs of their learners.
That had been well demonstrated through performance in practice, so the ACCME is not concerned that maintaining the format of the structured abstract in this digital platform is going to change anything further.
We have continued to say to providers: If you have a regularly scheduled series, and you plan that regularly scheduled series to meet many different, specific professional practice gaps and underlying educational needs, it’s perfectly acceptable to summarize those into an overall professional practice gap and underlying educational need.
Providers have come up with many different tactics to show, in a short 50- to 100-word paragraph, that they are meeting that requirement. If ultimately our objective is to ensure providers are demonstrating compliance, we have seen no change in compliance with those criteria. In fact, if anything, it’s gone up.
I think the intended consequence of this change was to reduce the administrative burden on providers. Certainly providers can, and should, construct their education down to the level of individual learners’ needs as they think appropriate.
But to demonstrate compliance with the accreditation requirements, that level of description is not what we’re asking for.