|
SACME Research Team Completes CE Educator’s Toolkit
An important and unique educational resource is presented at the ACCME annual conference in May
Reported by Robert D’Antuono, MHA
The CE Educator’s Toolkit was developed by the Society for Academic Continuing Medical Education (SACME) through an Accreditation Council for Continuing Medical Education (ACCME) research grant, in fulfillment of ACCME’s strategic goal to advocate for research and scholarship in continuing education. The toolkit is a unique and comprehensive resource designed to equip educators with best practices and guidelines for delivering effective continuing education (CE). It is now available and may be downloaded from the ACCME website or via the link here.
Research Project Team & Authors
We are grateful to the following SACME members for participating on the research team and for contributing their time and talents in creating the toolkit:
Overview
Clinical knowledge continuously shifts the scope of practice, and significant gaps persist in translating evidence into practice to provide compassionate, high-quality care for patients. Current CE interventions focus on instructor-oriented approaches and have been shown to be ineffective in fostering practice change. Inadequate educational efforts can thwart the cultivation of the knowledge and skills needed to ensure practice decisions are rooted in evidence and aligned with current clinical standards. Accredited CE and faculty development are critical to promoting a lifelong learning mindset of behavior change and practice improvement.
CE interventions for healthcare professionals are more effective when instructional methods promote critical thinking, collaboration, and decision-making skills. These methods improve team performance and encourage behavior change among healthcare professionals. In the absence of these methods, CE can be ineffective and feel unrewarding to learners. Therefore, educators should seek to evolve their educational programs to integrate principles of active learning and activities that promote engagement, making them more appealing for today's healthcare professional learners.
This toolkit aims to provide CE leaders, educators, and healthcare professionals with best practices and guidelines to assist in the design and delivery of CE in a manner that fosters a practical and active learning approach. Although there are different approaches to CE interventions, this toolkit will focus on three key interventions based on evidence-driven investigation and consultation with medical education experts: (1) facilitation of small group learning, (2) case-based learning, and (3) reflective learning.
The first section of the toolkit explains how the toolkit was developed and allows readers to complete a self-assessment to identify areas of the toolkit that might be relevant to their CE practice. Section 2 provides a matrix of essential components to consider when planning a CE intervention/activity. The subsequent section explores how to prepare and implement the three interventions selected for the toolkit: (1) facilitation of small group learning, (2) case-based learning, and (3) reflective learning. Section 4 is devoted to assessing and evaluating CE interventions, with suggestions for various methods of evaluation. These essentials are discussed in considerable detail, with instructional concepts and methods to plan, organize, and implement each phase. To illustrate, Section 2 discusses:
I. First steps to education intervention design
II. Learning objectives
III. IDEA principles
IV. Virtual delivery
The toolkit was created for educators at novice and intermediate experience levels in CE development. In addition, it provides resources, planning templates, and helpful tips for those with advanced expertise in developing CE activities.
Toolkit Use, Distribution and Suggested Citation
The toolkit may be used, distributed, or presented for non-promotional educational purposes with attribution:
Accreditation Council for Continuing Medical Education. 2022. CE Educator’s Toolkit: Evidence-based design and implementation strategies for effective continuing education. http://www.accme.org/ceeducatorstoolkit
Download your copy of the Toolkit today!
Expanding MOC/CC to Your RSS Activities
By Sahar Pastel-Daneshgar and Richard Wiggins, MD
This article shares one academic CME office’s experience, along with tips and tricks, for getting started with or expanding MOC/CC Part 2 credit, especially within RSS.
About us. The University of Utah Continuing Medical Education (UUCME) Office is part of the University of Utah School of Medicine. UUCME has Accreditation with Commendation from the ACCME and is an ABMS Portfolio Program Sponsor. UUCME is a decentralized office, meaning we work with planning partners to provide CME credit; we do not provide conference or planning services. Currently, we accredit 150+ RSS, 100 courses, 100+ enduring materials, and 3-7 PI-CME activities each year. Since all our physicians must be board-certified, helping our learners maintain/continue their board certifications is important to our office.
MOC/CC Program. We communicate often that all board-certified providers without lifetime certification (or lifetime-certified providers who have opted into MOC/CC) are required to participate in MOC/CC to ‘maintain’ or ‘continue’ certification. We find there is often confusion about MOC/CC and try to help clarify for our providers. For more information about ABMS and certification, see https://www.abms.org/board-certification/
We provide MOC/CC Part 2 credit for all boards aligned with ACCME. Each of the boards has additional specific requirements for what its diplomates must do. The ACCME produced a helpful Program Guide that details each board’s requirements, found here: https://www.accme.org/publications/cme-for-moc-program-guide (we reference this document often!). Additional helpful resources from ACCME are here: https://www.accme.org/cme-counts-for-moc
UUCME offers many options for MOC/CC Part 2 through our already approved CME activities. We’ve had great success integrating MOC/CC Part 2 into our Regularly Scheduled Series (RSS), in particular. We have effectively increased the number of RSSs at our institution with MOC/CC Part 2 from 19 (with 2,876 credits) in 2019, to 34 (with 5,124 credits) in 2021.
Lessons for CPD Practice
1. Communication. Help providers understand MOC/CC, and that it is required of board-certified providers without lifetime certification. They are already doing most of the work for CME, so why not get those cookies?
We started by identifying activities that already included the required MOC/CC components, using engagement strategies like case discussion. We wanted to give them credit for the work they were already doing (get those cookies!). Since the self-evaluation engagement piece was already occurring, adding MOC/CC only meant adding a statement and a tracking mechanism. The educational design components, often the most difficult for us to change, were already in place.
We’ve found people are often intimidated by MOC/CC, and that talking it through, outside of email or documents, helps show how straightforward it can be (and where the complexities might lie!). We’re also able to chat about what people might already be doing that we can tweak to count for MOC/CC Part 2. Once we were able to demonstrate that we could make it easy for planners and diplomates, with just a few additions to what they were already doing, we started to get planners who wanted to include the credit.
2. Ease of use. Use clear templates to reduce work for series and your office. Connect the value of MOC/CC credit to work they are already doing.
After we started with our initial series, we increased our efforts to reach out to various departments that might be interested in adding MOC/CC to their RSS events. We started by surveying all RSS directors about their interest in adding MOC/CC Part 2 to their series through our annual evaluation process. Then we asked the learners. Finally, we followed up with the interested activities. Recently, we’ve begun reaching out one-on-one to departments directly, explaining MOC/CC and why it might be beneficial for planners to include MOC/CC Part 2 in their activities. We targeted specific departments based on the number of diplomates within our system as well as the ease of implementing MOC/CC. We have presented at department faculty meetings using a templated presentation about the importance of MOC/CC and how it can easily be added to current CME-approved RSS events.
Some boards, such as ABA or ABPath, can easily be added to an RSS that is already meeting the requirements for CME credit. We also have many interdisciplinary RSS with multiple areas of care covered in the same activity, sharing expertise across specialties (e.g., surgeons and internists, or pediatricians and anesthesiologists). If the learners of an activity cross boards, we offer credit for all the applicable boards. Credit for a single activity can be approved for and submitted to multiple boards, which encourages multidisciplinary approaches and improved learning opportunities.
The element of the boards’ requirements that tends to trip up planners the most is providing feedback to learners on their engagement. We’ve found case discussion, which most of our activities were already including, to be the easiest mechanism for both assessment and feedback, where the feedback is the ongoing conversation. Zoom chats can also work, as long as someone is providing feedback during the session. We’ve experimented with polling functions but found our planners soon lost track of writing questions for weekly sessions. Given the size of our office, the small number on our team, and our busy planners’ schedules, we couldn’t include things that would require additional work outside of the RSS sessions, such as asking planners to review reflective statements and then provide individual feedback. We were concerned that more intensive strategies like that wouldn’t get completed in the busy day-to-day of our system. Our strategy has been to focus on what activities are already doing for engagement and feedback of their learners.
3. Follow-up. Follow up after adding MOC/CC, especially at first, until the series finds a good rhythm. Use follow-up to clarify any issues, questions, or concerns, and to build a relationship with that series.
Once you’ve added MOC/CC Part 2 into a series, follow-up is needed to ensure it takes hold. Just as for CPD providers, MOC/CC can seem daunting at first for the RSS coordinators and directors we work with. But it will work, and it will become routine. So we follow up to be sure they understand what’s needed, check in often in the beginning, and work to make it as easy as possible for the planners to integrate the additional requirements into their activities. Typically, those additional requirements are adding statements, gaining permission from diplomates, and tracking participation in the MOC/CC engagement component.
Wherever possible, we’ve developed quick, easy, and reproducible templates to share across RSS for integration. We’ve created an easy online survey for diplomates to provide their permission and board information. Adding statements has become part of our CME statement block and is checked within our usual CME process.
For tracking completion of participation, we have tried several methods. We had a few series try poll engagement, with survey questions for each session that learners would complete, and we’d use the readout from that poll as tracking. We have had series track participation via Zoom chats, which has worked as long as the presenter could comment to give feedback to learners. By far, though, what we have found most successful is providing a simple spreadsheet for CME coordinators to check off during their session as diplomate learners complete the engagement requirements. The coordinators then send us their “pass tracking” Excel file and we confirm credit has been awarded. This simple approach, completed digitally and emailed, has been the most manageable idea we’ve come up with so far.
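For offices that want to tally a coordinator’s check-off file automatically rather than by eye, a short script can summarize it. This is an illustrative sketch only, not part of the UUCME workflow described above; the file format (a CSV export) and the column headers Name, Board, and Engaged are assumptions for the example:

```python
import csv

def summarize_pass_tracking(path):
    """Count engagement completions per diplomate from a coordinator's
    check-off sheet exported as CSV. Assumed columns: Name, Board,
    Session Date, Engaged (any of yes/y/x/true counts as engaged)."""
    completions = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["Engaged"].strip().lower() in ("yes", "y", "x", "true"):
                key = (row["Name"], row["Board"])
                # Tally one engagement completion per checked-off session.
                completions[key] = completions.get(key, 0) + 1
    return completions
```

The returned dictionary maps each (name, board) pair to a session count, which the CME office could compare against the board’s participation requirement before confirming credit.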
For large functions, we keep in mind that the number of diplomates is typically much smaller than the full number of participants. For some of our larger in-person courses that also include MOC, diplomates are asked to sit in one section of the room as a group so ‘tracking’ is easier. We’ve also allowed for self-reported tracking to the CME coordinator, whereby a diplomate who engages during the session checks in with the coordinator on the way out of the learning space and confirms that they did engage in the discussion.
Once series transitioned to virtual or hybrid formats, we reviewed how to continue offering MOC to series with a large number of attendees at each session. For these RSS, the speaker is instructed to ask a case-based question related to the presentation, and learners respond to each question via the virtual chat feature. The speaker pauses briefly for learners to respond after the question is posed, then immediately gives feedback on the correct answer and why. The learner's answer, right or wrong, satisfies the MOC requirement for learner self-evaluation. For this method, all physician learners must have their name visible in their profile so they can be matched to their individual responses. The entire session chat is saved as the "pass" tracking mechanism by the series coordinator or moderator running the virtual RSS. In turn, the chat is sent to the CME office.
To keep things as easy as possible, we’ve developed several templates to guide us and keep us consistent. Our most used templates include:
These templates can be found here:
https://drive.google.com/drive/folders/1DjQ4RicDpdpl3FSz_SBWkbxQPMwjI9aW?usp=sharing
Overall, we have found the experience of adding MOC/CC Part 2 into RSS to be beneficial for our providers, helpful for our planners, and even a little bit fun! If you are new to the MOC process, our best piece of advice is to just try it. Pick one board and one activity and see how it goes. Then refine and try again. Sound familiar? Plan-do-study-act – just be sure to get to that Do. And share with us what you learn along the way.
Sahar Pastel-Daneshgar is MOC/CC Coordinator and Richard Wiggins, MD is Associate Dean at the University of Utah School of Medicine, Continuing Medical Education Office.
AAMC Virtual Program on Teaching Quality Improvement and Patient Safety (QIPS) Competencies to Faculty
Reported by Robert D’Antuono, MHA
In fall 2021, the Association of American Medical Colleges (AAMC) hosted a seven-week, longitudinal, virtual curriculum to train clinical faculty and other medical educators in the QIPS competencies so that they may teach quality improvement and patient safety skills to their learners (i.e., other faculty, residents, and students). The curriculum for this program was designed by a national panel of expert medical educators convened by the AAMC over a two-year period. Entitled Teaching for Quality (Te4Q), the panel worked to revise and update the original Te4Q curriculum to incorporate the AAMC’s Quality Improvement and Patient Safety (QIPS) Competencies for teaching faculty, as well as to restructure the series for distribution using a virtual learning platform. These competencies present a roadmap for medical education curricula and professional development, performance assessment, and improvement of health care. The full report is available at https://www.aamc.org/data-reports/report/qipscompetencies.
Three outstanding projects completed by participants are presented here for your review. SACME members may wish to bring these projects to their respective department activity directors to initiate a discussion on creating similar initiatives to advance faculty learning in the AAMC QIPS competencies at their own institutions.
QIPS Teaching Project Examples
Project #1: Integrating a Quality Improvement and Patient Safety (QIPS) Curriculum into an Internal Medicine Residency
Institution
Saint Joseph Mercy Hospital-Ann Arbor, Michigan
Project Team
Afnan Ammar MD, Ola Al-Sous MD, Dario Manley, MD
Background/Context
We are a community-based Internal Medicine Residency program. We have 49 categorical residents (PGY1-3), 12 Transitional year interns and 3 chief residents. With the development of the QIPS curriculum, we are addressing these ACGME requirements:
Educational Need
As of July 20, 2019, only 16% of our residents were engaged in a formal QI activity. All stakeholders of the residency program, GME leaders as well as residents, identified the lack of, and need for, a formal educational curriculum in QIPS as an essential educational need.
Project Goal
The project goal is to design and implement a formal educational curriculum in QIPS by June 30, 2022, resulting in at least 70% of internal medicine residents completing a QIPS activity before they graduate from residency.
Learning Objectives
After completing this curriculum, internal medicine residents should be able to:
Educational Design
The curriculum will take a “mixed methods” approach to learning and offer a variety of training formats and opportunities, including an introductory lecture, additional didactic lectures, flipped-classroom sessions using the IHI Open School QI Modules as context for scheduled didactic lectures, an extended 2-hour workshop where residents will apply QIPS techniques, and, lastly, the design and completion of an experiential learning project in which resident teams will continue to work on their QIPS projects in the clinical setting.
Program Evaluation Measures
The following outcome and process measures will be collected:
Outcome measures
# of residents engaged in QIPS projects
# of projects with improved patient outcomes
Process measures
# of QIPS projects aligning with organization
# of QIPS presentations or publications
# of residents attending each workshop
Learner survey after each workshop session
Residents’ overall evaluation survey at the end of program
Learner Assessment
We plan to use a variety of learner assessments, including a pre- and post-self-assessment of resident behaviors, skills, and attitudes; a pre- and post-knowledge test; and a formative assessment (which may be completed by the resident’s mentor).
Barriers
We have identified the following barriers:
Project Status
Our current status is:
Concluding Remarks
To conclude, we have recognized that a lack of basic knowledge and previous training/experience is the biggest barrier to residents’ engagement in QI. We attribute the success of our educational course to two main elements: project-based learning and faculty mentors with a background in QI. We also recognized that self-directed learning can be challenging when it comes to QI, given the many competing educational needs during residency. Pending the results of the post-course evaluation, we plan to offer this course in its current format next year and will continue to monitor its long-term impact on patient outcomes.
Project #2: Contributing to a Culture of Quality and Patient Safety: Educating and Engaging Residents in the Reporting of Adverse Events
Institution
Beaumont Health
Team
Arthur L. Riba, MD, FACC
Introduction
The reporting of adverse events by Graduate Medical Education (GME) trainees is foundational for learning and applying patient safety improvement in the clinical setting. Furthermore, engaging residents in reporting adverse events they observe and experience contributes to and validates a culture of safety and aligns with the mission of the sponsoring healthcare system to ensure high-quality and safe care. Residents at the frontlines of patient care are in a unique position to identify safety hazards for the organization.
Needs Assessment
It is well documented that GME trainees rarely report adverse events resulting from or contributed to by medical care, and the reasons and barriers have been well summarized in the literature. In addition, residents rarely receive feedback about the reports they submit and are almost never included in site investigations or disclosures to patients and their families, despite the Accreditation Council for Graduate Medical Education (ACGME) requirements and the recommendations arising out of the Clinical Learning Environment Review (CLER) Pathways and the Association of American Medical Colleges (AAMC) Quality Improvement and Patient Safety (QIPS) Competencies.
A major gap exists in what is expected of resident training in QIPS principles and practices and what actually occurs in real time. As seen nationally, the volume of safety event reports submitted by residents and fellows in our system’s central reporting system is relatively low (partly underestimated because many are submitted anonymously) and residents rarely receive meaningful and relevant feedback.
The ACGME CLER Pathways to Excellence provides a framework for training sites to use in their continuing efforts to prepare the clinical care teams to deliver consistently safe, high-quality patient care. Further, the AAMC QIPS Competencies derived from the New and Emerging Areas in Medicine Series define interdisciplinary competencies for QIPS across the continuum of physician education and inform curriculum development in patient safety.
Training programs must also provide residents and fellows with experiential training with their faculty members (e.g., simulated or actual patient care experience) in the clinical site’s process for disclosing patient safety events to patients and families and ensure that residents and fellows are involved with faculty members in disclosing patient safety events to patients and families at the clinical site.
Project Goals
The overarching goals are to:
Learning Objectives and Competency-based Performance Expectations
At the conclusion of this educational and training program, learners will be able to:
Educational Design, Strategies and Methods
We have developed a curriculum using mixed educational methods and formats (both virtual online and live) to enhance initial learning and retention. These include:
Evaluation and Feedback
We are using a variety of learner and program assessment methods:
Program Evaluation
Our initial evaluations of the program revealed that many of our core learning objectives are being met. We are in the process of collecting additional program assessment data via standardized surveys of residents, review of RCA metrics and monthly control charts, pre- and post-knowledge tests, etc. This is a complex program to evaluate and will take time and experience.
Concluding Remarks
We need to make it easier, more relevant, understandable, and effective for residents to detect, identify, and report adverse events resulting from or contributed to by medical care; to motivate organizational learning around patient safety; and for trainees to actively participate in strategies to prevent patient harm. Knowledge of the common safety hazards, errors, and adverse events that are likely to occur, and of potential candidate “triggers” that may signal that an adverse event has occurred, is foundational for increased awareness, understanding, and reporting. We will not achieve highly reliable, safe care and prevention of harm until fully engaged frontline staff are empowered with an understanding of the rationale for adverse event reporting and knowledge of why, what, and how to report; given opportunities to learn from these events and analyze risks and vulnerabilities; and able to participate directly in system solutions to prevent recurrence. This systematic, comprehensive, experiential competency education approach attempts to achieve these goals and deliverables. To address them, we will narrow the scope of the project and target training programs that are committed and enthusiastic about participating, have strong leadership support and protected time granted, and have strategic resources readily available, supported, and maintained.
Project #3: Enhancing Quality Improvement and Patient Safety (QIPS) Curriculum for a community-based Transitional Year Residency Program.
Institution
Wellstar Health System Kennestone Hospital, Marietta, GA
Project Team
Joanne Zhu, MD
Background/Context
We are a community-based Transitional Year Residency program at the largest healthcare system in the state of Georgia. Our program has 20 residents per year pursuing 5-6 different specialties. For the first several years, we shared a QIPS curriculum with our Internal Medicine Residency Program. Two years ago, I started a TY-specific QIPS curriculum, separate from that of the IM Residency Program, given the increased resident complement for both programs.
Educational Need
Based on the QIPS competencies introduced by Te4Q, the following additional educational items need to be added to the existing curriculum:
Project Goal
Knowing that the original version of the curriculum met ACGME expectations, the program sets out the following specific goals:
Learning Objectives
After completing this curriculum, Transitional Year residents should be able to:
Educational Design
The design is intended to teach QIPS and allow residents to take real action in hospital-based QIPS initiatives over a one-year period. The curriculum will start during new intern orientation week. The exercises and actual work will span the entire intern year.
Program Evaluation Measures
We plan to use the rest of this academic year to develop an evaluation tool for the program and implement it in June 2022. We expect this tool to be effective in assessing our QI curriculum in the following several areas:
Learner Assessment
We plan to develop a learner assessment tool to assess learners’ individual knowledge and skill levels, as well as their respective needs in QIPS, at the beginning and end of each academic year. The "Criteria for Effective Disclosure" and QIKAT will be used during the curriculum or afterwards.
Implementation/Status
We will implement this during the next QI curriculum in June 2022 and plan to engage one or two quality officers from our hospital. We are also in the process of identifying a patient educator. We do not anticipate any difficulties in getting buy-in: our hospital has a robust QIPS program, and it supports all efforts made by GME in QIPS.
Barriers/Challenges
Two barriers have been identified. First, there is only limited time when all 20 residents are available to receive the QI curriculum; otherwise, the curriculum may have to be repeated two or three times so that all residents receive the education. Second, we do not have a way to collect performance data on individual residents, so reporting will need to use group performance metrics.
Back to CE News