By Jacob Coverstone and Erin Schwarz
The ACCME updated the reaccreditation process in early 2018. To discover the extent of the changes and provide guidance to readers, Erin Schwarz and Jacob Coverstone, editors representing the Almanac, spoke with Dion Richetti, vice president of accreditation and recognition, and Teri McCauley, manager of accreditation and recognition systems with the ACCME.
According to Richetti, even though things are different, little is changing. After speaking with the ACCME … we kind of agree.
Since announcing the changes, the ACCME has held webinars for providers preparing for reaccreditation, but we at the Almanac want to prepare providers at all stages of accreditation for the changes. (See a short summary of the changes at the bottom of this article.)
The following series is taken from a conversation that confronted assumptions, challenged processes and (we hope) clarified the ACCME reaccreditation process going forward.
Thank you for making time today. We wanted to discuss the changes that are being made to the ACCME self-study/reaccreditation process.
We really appreciate the opportunity to talk with you and share with your readers what we consider to be good news about enhancements to PARS.
We want to clarify first that we’re also interested in changes to the initial application process. Our first question is in two parts:
- What prompted the changes to the self-study outline, both for initial and reaccreditation?
- What was the internal process for determining the changes that were needed, such as modifications to the questions in the accreditation outline?
Those are great questions. I want to preface our comments with one thing — when you said “the process for accreditation and reaccreditation” — we’re not changing the process. The process is the same. The three data points remain the same: the self-study, performance in practice and the interview.
The only thing changing in terms of what the ACCME is doing is that we’re responding to the community’s request to put everything into a more efficient, secure online system of collecting the information. That’s the primary difference.
At the same time, to address your current question, we’ve been using a format for the self-study outline since roughly 2009–2010, and it has changed many times over the years. What’s been consistent for several years is that we’ve asked providers to use two examples to discuss how they planned their CME activities.
One of the things we’ve heard is that this approach was somewhat limiting — that the two examples didn’t always provide surveyors with all of the information they needed.
For instance, providers might have five ways that they produce CME activities — and very often that wouldn’t necessarily be captured in all of the performance-in-practice files. Or it would be, in the performance-in-practice files, but wouldn’t be described in the self-study.
So, the primary change to the questions in the outline for the self-study is to ask providers to describe how they plan their CME activities, rather than limiting them to two examples. That gives them the opportunity to provide full context to the surveyors.
We didn’t see it as a major change but more as a reaction to the needs of the surveyors and our Accreditation Review Committee (ARC) members. Our volunteers need to be able to understand the full context of a provider’s CME program.
Does that help?
Yes, especially the explanation that the process itself is not actually changing and retains the same basic elements. That’s an important point, and one that the CME community should note.
The previous self-study process was in place for about seven years. In that format, when the instructions said “give us two examples,” they noted that, if a provider felt those examples were insufficient, the provider could add a “Section B” to a criterion and include additional examples.
Were you finding that providers weren’t utilizing that opportunity?
It was very rare that providers actually gave a more comprehensive response. One of the things that’s important to understand about accreditation is that most of the providers will give the information that’s requested. So, when a part of the process is presented as “optional,” it adds an element of subjectivity and typically providers err toward answering only those questions that are required.
Instead, we say, “tell us how you plan your activities,” and then it’s up to the provider to give the complete picture. And, obviously, the point of the accreditation interview is to clarify anything within the materials, and that’s always an option. It always has been.
That inspires a couple of follow-up questions:
We understand that it’s a provider’s responsibility to submit enough information to help the ACCME make a decision. In the past, it hasn’t gone well when providers submitted vague responses; you like answers to be specific and clear. Submitting examples tended to help with that clarity.
Do you have recommendations for organizations to ensure that the information they provide is specific enough if providers are no longer being encouraged to provide examples? As you noted, providers will naturally follow the outline and tend not to do what’s optional.
Asking for examples was a tactic that ACCME employed to help providers convey a picture of their planning process.
What we have now are specific requests for information. If you look at the current outline — “describe the process or processes used to identify the professional practice gaps of your learners and the educational needs that underlie the identified professional practice gaps” — that doesn’t limit you to two, and it says “describe the process or processes.”
Now again, the more we quantify things, the less flexible the outline becomes. So we felt that the new outline would reduce the burden on surveyors of trying to find where, within a self-study report, a provider talked about how they address professional practice gaps. At the same time, we wanted to allow the provider to give a complete picture of all of the things that they’re going to show in the performance-in-practice files.
Was there a deliberate decision to eliminate the ability for providers to include things like screenshots and graphics, or was that a limitation of the new online platform?
There was never a requirement for providers to give screenshots or graphics.
As you know, we have a criterion-referenced system. We’re trying to create a platform in which everyone has an equal opportunity to give the ACCME the information they need in order to address the criteria.
None of our questions asked for screenshots or graphics. We’re asking everyone to describe, using language, and to show, using evidence and description in the performance in practice, what they actually do.
We received a lot of feedback from surveyors stating that it was often difficult — because every self-study looked so different — to find what was most relevant. Ultimately, we’ve always asked for descriptions and evidence, and we’re still asking for descriptions and evidence. We believe this approach will simplify the process and create a more uniform approach that doesn’t offer advantages to individual providers who have the resources to create [enhanced] materials.
Summary of Changes to ACCME Accreditation Process
- All documentation will be submitted online, through the PARS system, including a) self-study narrative and b) performance-in-practice files.
- The self-study outline has been modified to eliminate the “two example” requirement for Criteria 2-6 and focuses questions on the provider’s process or processes for each criterion (click here to see the outline for the July and November 2019 cohorts).
- The performance-in-practice files require similar information as was previously requested in the structured abstract (click here to see the version for the July and November 2019 cohorts).
- The ACCME requests that providers complete a template Excel file, rather than submitting individual completed disclosure forms or other documentation.