
SACME Research Team Completes CE Educator’s Toolkit

An important and unique educational resource was presented at the ACCME annual conference in May

Reported by Robert D’Antuono, MHA

The CE Educator’s Toolkit was developed by the Society for Academic Continuing Medical Education (SACME) through an Accreditation Council for Continuing Medical Education (ACCME) research grant, in fulfillment of ACCME’s strategic goal to advocate for research and scholarship in continuing education. The toolkit is a comprehensive resource designed to equip educators with best practices and guidelines for delivering effective continuing education (CE). It is now available and may be downloaded from the ACCME website or from the links at the end of this article.

Research Project Team & Authors

We are grateful to the following SACME members for participating on the research team and for contributing their time and talents in creating the toolkit:

  • Betsy Williams, PhD, MPH
  • David Wiljer, PhD
  • Joyce Fried, FSACME
  • Gabrielle Kane, MB, EdD, FRCPC
  • Sharon Ambata-Villanueva, MA, CTDP
  • Ashleigh Jaggars, MPH
  • Tharshini Jeyakumar, MHI
  • Inaara Karsan, MHI
  • Morag Paton, PhD, MEd
  • Nathaniel Williams
  • Sarah Younus, MPH

Overview

Clinical knowledge continuously shifts the scope of practice, and significant gaps persist in translating evidence into practice to provide compassionate, high-quality care for patients. Many current CE interventions rely on instructor-oriented approaches, which have been shown to be ineffective in fostering practice change. Inadequate educational efforts can thwart the cultivation of the knowledge and skills needed to ensure that practice decisions are rooted in evidence and aligned with current clinical standards. Accredited CE and faculty development are critical to promoting a lifelong learning mindset oriented toward behavior change and practice improvement.

CE interventions for healthcare professionals are more effective when instructional methods promote critical thinking, collaboration, and decision-making skills. These methods improve team performance and encourage behavior change among healthcare professionals. In their absence, CE can be ineffective and feel unrewarding to learners. Educators should therefore evolve their educational programs to integrate principles of active learning and activities that promote engagement, making their programs more appealing to today's healthcare professional learners.

This toolkit aims to provide CE leaders, educators, and healthcare professionals with best practices and guidelines to assist in the design and delivery of CE in a manner that fosters a practical and active learning approach. Although there are different approaches to CE interventions, this toolkit will focus on three key interventions based on evidence-driven investigation and consultation with medical education experts: (1) facilitation of small group learning, (2) case-based learning, and (3) reflective learning.

The first section of the toolkit explains how the toolkit was developed and offers a self-assessment readers can complete to identify the areas of the toolkit most relevant to their CE practice. Section 2 provides a matrix of essential components to consider when planning a CE intervention/activity. The subsequent section explores how to prepare and implement the three interventions selected for the toolkit: (1) facilitation of small group learning, (2) case-based learning, and (3) reflective learning. Section 4 is devoted to assessing and evaluating CE interventions, with suggestions for various evaluation methods. These essentials are discussed in detail, with instructional concepts and methods for planning, organizing, and implementing each phase. To illustrate, Section 2 discusses:

I. First steps to education intervention design

  • Structure a longitudinal and multimodal education intervention
  • Use the PDSA cycle to structure your CE intervention
  • Conduct a needs assessment to understand the audience and their learning needs

II. Learning objectives

  • Develop targeted learning objectives
  • How to write a learning objective
  • How to make a learning objective actionable

III. IDEA principles

  • Designing for inclusivity
  • Designing for diversity
  • Designing for equity
  • Designing for accessibility
  • Checklist

IV. Virtual delivery

  • Key considerations for virtual delivery

The toolkit was created for those at novice and intermediate experience levels in CE development. However, it also provides additional resources, planning templates, and helpful tips for those with advanced expertise in developing CE activities.

Toolkit Use, Distribution and Suggested Citation

The toolkit may be used, distributed, or presented for non-promotional educational purposes with attribution:

Accreditation Council for Continuing Medical Education. 2022. CE Educator’s Toolkit: Evidence-based design and implementation strategies for effective continuing education. http://www.accme.org/ceeducatorstoolkit

Download your copy of the Toolkit today!

  • Download the CE Educator's Toolkit here.
  • Download the worksheet guide for the toolkit here. This guide aims to complement the toolkit by providing the worksheets and templates in an accessible format.


Expanding MOC/CC to Your RSS Activities

By Sahar Pastel-Daneshgar and Richard Wiggins, MD

This article shares an academic CME office's experience, along with tips and tricks for getting started with, or expanding, MOC/CC Part 2 credit, especially for regularly scheduled series (RSS).

About us. The University of Utah Continuing Medical Education (UUCME) Office is part of the University of Utah School of Medicine. UUCME has Accreditation with Commendation from the ACCME and is an ABMS Portfolio Program Sponsor. UUCME is a decentralized office, meaning we work with planning partners to provide CME credit; we do not provide conference or planning services. Currently, we accredit 150+ RSSs, 100 courses, 100+ enduring materials, and 3-7 PI-CME activities each year. Since all our physicians must be board-certified, helping our learners maintain/continue their board certifications is important to our office.

MOC/CC Program. We communicate often that all board-certified providers without lifetime certification (or lifetime-certified providers who have opted into MOC/CC) are required to participate in MOC/CC to ‘maintain’ or ‘continue’ certification. We find there is often confusion about MOC/CC and try to help clarify it for our providers. For more information about ABMS and certification, see https://www.abms.org/board-certification/.

We provide MOC/CC Part 2 credit for all boards aligned with ACCME. Each of the boards has additional specific requirements for what its diplomates must do. The ACCME produced a helpful Program Guide that details each board’s requirements, found here: https://www.accme.org/publications/cme-for-moc-program-guide. We reference this document often! Additional resources from ACCME are also helpful, here: https://www.accme.org/cme-counts-for-moc.

UUCME offers many options for MOC/CC Part 2 through our already approved CME activities. We’ve had great success integrating MOC/CC Part 2 into our Regularly Scheduled Series (RSS), in particular. We have effectively increased the number of RSSs at our institution with MOC/CC Part 2 from 19 (with 2,876 credits) in 2019, to 34 (with 5,124 credits) in 2021.

Lessons for CPD Practice

1. Communication. Help providers understand MOC/CC, and that it is required of board-certified providers without lifetime certification. They are already doing most of the work for CME, so why not get those cookies?

We started by identifying activities that already included the required MOC/CC components, using engagement strategies such as case discussion. We wanted to give them credit for the work they were already doing (get those cookies!). Since the self-evaluation engagement piece was already occurring, adding MOC/CC only meant adding a statement and a tracking mechanism. The educational design components, often the most difficult for us to change, were already in place.

We’ve found people are often intimidated by MOC/CC, and talking it through, outside of email or documents, helps show how straightforward it can be, and where the complexities might lie! We’re also able to chat about what people might already be doing that we can tweak to count for MOC/CC Part 2. Once we were able to demonstrate that we could make it easy for planners and diplomates, with just a few additions to what they were already doing, we started to get planners who wanted to include the credit.

2. Ease of use. Use clear templates to reduce work for series and your office. Connect the value of MOC/CC credit to work they are already doing.

After we started with our initial series, we increased our efforts to reach out to various departments that might be interested in adding MOC/CC to their RSS events. We started by surveying all RSS directors about their interest in adding MOC/CC Part 2 to their series as part of our annual evaluation process. Then we asked the learners. Finally, we followed up with the interested activities. Recently, we’ve begun reaching out one-on-one to departments directly, explaining MOC/CC and why it might be beneficial for planners to include MOC/CC Part 2 in their activities. We targeted specific departments based on the number of diplomates within our system as well as the ease of implementing MOC/CC. We have presented at department faculty meetings using a templated presentation about the importance of MOC/CC and how it can easily be added to current CME-approved RSS events.

Some boards, such as ABA or ABPath, can easily be added to an RSS that is already meeting the requirements for CME credit. We also have many interdisciplinary RSSs with multiple areas of care covered in the same activity, sharing expertise across specialties (e.g., surgeons and internists, or pediatricians and anesthesiologists). If the learners of an activity cross boards, we offer credit for all the applicable boards. A single activity can be submitted to multiple boards, which encourages multidisciplinary approaches and improved learning opportunities.

The element of the boards’ requirements that tends to trip up planners the most is providing feedback to learners on their engagement. We’ve found case discussion, which most of our activities were already including, to be the easiest mechanism for both assessment and feedback, where the feedback is the ongoing conversation. Zoom chats can also work, as long as someone is providing feedback during the session. We’ve experimented with polling functions but found our planners soon lost track of writing questions for weekly sessions. Given the size of our office, our small team, and our busy planners’ schedules, we couldn’t include anything that would require additional work outside of the RSS sessions, such as asking planners to review reflective statements and then provide individual feedback. We were concerned that more intense strategies like that wouldn’t get completed in the busy day-to-day of our system. Our strategy has been to focus on what activities are already doing for engagement and feedback of their learners.

3. Follow-up. Follow up after adding MOC/CC, especially when it first starts, until there is a good rhythm. Use follow-up to help clarify any issues, questions, or concerns, and to build a relationship with that series.

Once you’ve added MOC/CC Part 2 to a series, follow-up is needed to ensure it takes hold. Just as for CPD providers, MOC/CC can seem daunting at first to the RSS coordinators and directors we work with. But it will work, and it will become routine. So we follow up to be sure they understand what’s needed, check in often in the beginning, and work to make it as easy as possible for the planners to integrate the additional requirements into their activities. Typically, those additional requirements are adding statements, gaining permission from diplomates, and tracking participation in the MOC/CC engagement component.

Wherever possible, we’ve developed quick, easy, and reproducible templates to share across RSS for integration. We’ve created an easy online survey for diplomates to provide their permission and board information. Adding statements has become part of our CME statement block and is checked within our usual CME process.

For tracking completion of participation, we have tried several methods. We had a few series try poll-based engagement, with survey questions for each session that learners would complete, and we used the readout from that poll as tracking. We have had series track participation via Zoom chats, which has worked as long as the presenter could comment to give feedback to learners. By far, though, what we have found most successful is providing a simple spreadsheet for CME coordinators to check off during their session as diplomate learners complete the engagement requirements. The coordinators then send us their “pass tracking” Excel file, and we confirm credit has been awarded. This simple approach, completed digitally and emailed, has been the most manageable idea we’ve come up with so far.

For large events, we keep in mind that the number of diplomates is typically much smaller than the full number of participants. For some of our larger in-person courses that also include MOC, diplomates are asked to sit in one section of the room as a group so ‘tracking’ is easier. We’ve also allowed self-reported tracking to the CME coordinator, whereby a diplomate who engages during the session checks in with the coordinator on the way out of the learning space and confirms that they did engage in the discussion.

Once series transitioned to virtual or hybrid formats, we reviewed how to continue offering MOC to series with a large number of attendees at each session. For these RSSs, the speaker is instructed to ask a case-based question related to the presentation, and learners respond to each question via the virtual chat feature. The speaker pauses briefly for learners to respond after the question is posed, then immediately gives feedback on the correct answer and why. The learner's answer(s), right or wrong, satisfies the MOC requirement for learner self-evaluation. For this method, all physician learners must have their name visible in their profile so they can be matched to their individual response. The entire session chat is saved as the "pass" tracking mechanism by the series coordinator or moderator running the virtual RSS. In turn, the chat is sent to the CME office.
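
For offices that want to speed up this step, the saved chat file can be processed with a short script. The sketch below is a hypothetical illustration, not part of the UUCME workflow described above: it assumes a plain-text Zoom chat export in the common "HH:MM:SS From Name to Everyone:" layout and a simple one-column roster.csv of diplomate display names (both are assumptions, so adjust to your own export format and records).

    import csv
    import re
    import sys

    # Matches lines like "00:14:32 From Jane Smith to Everyone: I would choose option B"
    # (one common Zoom chat export layout; adjust the pattern if your export differs).
    CHAT_LINE = re.compile(r"^\d{2}:\d{2}:\d{2}\s+From\s+(.+?)\s+to\s+Everyone\s*:", re.IGNORECASE)

    def responders(chat_path):
        """Return the set of display names that posted at least one chat message."""
        names = set()
        with open(chat_path, encoding="utf-8") as f:
            for line in f:
                match = CHAT_LINE.match(line)
                if match:
                    names.add(match.group(1).strip())
        return names

    def main(chat_path, roster_path):
        # roster.csv: hypothetical one-column list of diplomate display names.
        with open(roster_path, newline="", encoding="utf-8") as f:
            roster = {row[0].strip() for row in csv.reader(f) if row}
        engaged = responders(chat_path)
        for name in sorted(roster):
            print(f"{name}\t{'engaged' if name in engaged else 'no response'}")

    if __name__ == "__main__":
        main(sys.argv[1], sys.argv[2])

Run as, for example, "python pass_tracking.py saved_chat.txt roster.csv" and paste the output into the pass-tracking spreadsheet before it goes to the CME office. Matching is on exact display names, which is one more reason learners need their full names visible in their profiles.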

To keep things as easy as possible, we’ve developed several templates to guide us and keep us consistent. Our most used templates include:

  • Email template for reaching out to offer MOC/CC to RSS
  • MOC/CC permission collecting survey
  • MOC/CC pass tracking
  • Follow-up email template
  • Organizational MOC/CC Sheet

These templates can be found here:

https://drive.google.com/drive/folders/1DjQ4RicDpdpl3FSz_SBWkbxQPMwjI9aW?usp=sharing

Overall, we have found the experience of adding MOC/CC Part 2 into RSS to be beneficial for our providers, helpful for our planners, and even a little bit fun! If you are new to the MOC process, our best piece of advice is to just try it. Pick one board and one activity and see how it goes. Then refine and try again. Sound familiar? Plan-do-study-act – just be sure to get to that Do. And share with us what you learn along the way.

Sahar Pastel-Daneshgar is MOC/CC Coordinator and Richard Wiggins, MD is Associate Dean at the University of Utah School of Medicine, Continuing Medical Education Office.


AAMC Virtual Program on Teaching Quality Improvement and Patient Safety (QIPS) Competencies to Faculty

Reported by Robert D’Antuono, MHA

In fall 2021, the Association of American Medical Colleges (AAMC) hosted a seven-week, longitudinal, virtual curriculum to train clinical faculty and other medical educators on the QIPS competencies so that they may teach quality improvement and patient safety skills to their learners (i.e., other faculty, residents, and students). The curriculum for this program was designed by a national panel of expert medical educators convened by the AAMC over a two-year period. The panel worked to revise and update the original Teaching for Quality (Te4Q) curriculum to incorporate the AAMC’s Quality Improvement and Patient Safety (QIPS) competencies for teaching faculty and to restructure the series for delivery on a virtual learning platform. These competencies present a roadmap for medical education curricular and professional development, performance assessment, and improvement of health care. The full report is available at https://www.aamc.org/data-reports/report/qipscompetencies.

Three outstanding projects completed by participants are presented here for your review. SACME members may wish to bring these projects to their respective department activity directors to initiate a discussion on creating similar initiatives to advance faculty learning in the AAMC QIPS competencies at their own institutions.


QIPS Teaching Project Examples

Project #1: Integrating a Quality Improvement and Patient Safety (QIPS) Curriculum into an Internal Medicine Residency

Institution

Saint Joseph Mercy Hospital-Ann Arbor, Michigan

Project Team

Afnan Ammar, MD; Ola Al-Sous, MD; Dario Manley, MD

Background/Context

We are a community-based Internal Medicine Residency program. We have 49 categorical residents (PGY1-3), 12 Transitional Year interns, and 3 chief residents. With the development of the QIPS curriculum, we are addressing these ACGME requirements:

  • Residents must receive training and experience in quality improvement processes, including an understanding of health care disparities. (Core)
  • Experiential learning is essential to developing the ability to identify and institute sustainable systems-based changes to improve patient care

Educational Need

As of July 20, 2019, only 16% of our residents were engaged in a formal QI activity. All stakeholders of the residency program, GME leaders as well as residents, identified the lack of, and need for, a formal educational curriculum in QIPS as an essential educational need.

Project Goal

The project goal is to design and implement a formal educational curriculum surrounding QIPS by June 30, 2022, resulting in at least 70% of internal medicine residents having completed a QIPS activity before they graduate from residency.

Learning Objectives

After completing this curriculum, internal medicine residents should be able to:

  • Describe their individual role as clinicians within our healthcare system (system thinking)
  • Identify and define the scope of a QIPS problem
  • Build and effectively work with a multidisciplinary improvement team
  • Solve a problem using the Model for Improvement (MFI)
  • Create a SMART goal
  • Establish a family of measures (outcome, process and balancing measures)
  • Map a process using a flowchart or block diagram
  • Apply different tools to determine root causes (use at least one of: cause-and-effect diagram, 5 whys, or driver diagram)
  • Use at least one tool for selecting a change (Pareto chart or impact-effort matrix)
  • Create a data collection plan
  • Test a change on a small scale using at least one PDSA cycle
  • Discuss and analyze data variation
  • Create and interpret a Run chart
  • Complete an A3 or project charter
  • Present the project locally and/or nationally

Educational Design

The curriculum will take a “mixed methods” approach to learning and offer a variety of training formats and opportunities, including an introductory lecture, additional didactic lectures, flipped-classroom sessions using the IHI Open School QI modules as context for the scheduled didactic lectures, an extended two-hour workshop in which residents will apply QIPS techniques, and, lastly, the design and completion of an experiential learning project in which resident teams will continue to work on their QIPS projects in the clinical setting.


Program Evaluation Measures

The following outcome and process measures will be collected:

Outcome measures

  • # of residents engaged in QIPS projects
  • # of projects with improved patient outcomes

Process measures

  • # of QIPS projects aligning with the organization
  • # of QIPS presentations or publications
  • # of residents attending each workshop
  • Learner survey after each workshop session
  • Residents’ overall evaluation survey at the end of the program

Learner Assessment

We plan to use a variety of learner assessments, including a pre- and post-course self-assessment of resident behaviors, skills, and attitudes; a pre- and post-knowledge test; and a formative assessment (which may be completed by a resident mentor).

Barriers

We have identified the following barriers:

  • COVID and its impact on in-person learning experience
  • Staff shortages due to COVID
  • Other concurrent educational needs for residents
  • The impact of transitioning to a new EMR on data reporting

Project Status

Our current status is:

  • The pre-course self-assessment and knowledge test show that there is a significant knowledge gap surrounding the fundamentals of QI
  • The educational course has been delivered as outlined in the program timeline. Some of the activities were delivered virtually
  • The program was well received by the residents, with much positive feedback, most of it in favor of in-person learning
  • As of now we have:
    • 21 QI projects (compared to 5 prior), 12 of which are inpatient (compared to 1)
    • 27 residents engaged in QI activities (compared to 9)
    • 3 projects have already been accepted at national meetings
    • 4 projects have demonstrated improved patient outcomes
  • Many of these projects have been overseen by 2 faculty members from the inpatient and outpatient settings. These faculty have backgrounds in QI and are able to provide mentorship and direct feedback in real time

Concluding Remarks

To conclude, we have recognized that a lack of basic knowledge and previous training/experience is felt to be the biggest barrier to residents’ engagement in QI. We attribute the success of our educational course to two main elements: project-based learning and faculty mentors with backgrounds in QI. We also recognized that self-directed learning can be challenging when it comes to QI, given the many competing educational needs during residency. Pending the results of the post-course evaluation, we plan to offer this course in its current format next year and will continue to monitor its long-term impact on patient outcomes.

Project #2: Contributing to a Culture of Quality and Patient Safety: Educating and Engaging Residents in the Reporting of Adverse Events

Institution

Beaumont Health

Team

Arthur L. Riba, MD, FACC

Introduction

The reporting of adverse events by Graduate Medical Education (GME) trainees is foundational for learning and applying patient safety improvement in the clinical setting. Furthermore, engaging residents in reporting the adverse events they observe and experience contributes to and validates a culture of safety and aligns with the mission of the sponsoring healthcare system to ensure high-quality and safe care. Residents at the front lines of patient care are in a unique position to identify safety hazards for the organization.

Needs Assessment

It is well documented that GME trainees rarely report adverse events resulting from or contributed to by medical care, and the reasons for this and the barriers involved have been well summarized in the literature. In addition, residents rarely receive feedback about the reports they submit and are almost never included in site investigations or disclosures to patients and their families, despite the Accreditation Council for Graduate Medical Education (ACGME) requirements and the recommendations arising from the Clinical Learning Environment Review (CLER) Pathways and the Association of American Medical Colleges (AAMC) Quality Improvement and Patient Safety (QIPS) Competencies.

A major gap exists between what is expected of resident training in QIPS principles and practices and what actually occurs in real time. As seen nationally, the volume of safety event reports submitted by residents and fellows in our system’s central reporting system is relatively low (and partly underestimated, because many are submitted anonymously), and residents rarely receive meaningful and relevant feedback.

The ACGME CLER Pathways to Excellence provides a framework for training sites to use in their continuing efforts to prepare the clinical care teams to deliver consistently safe, high-quality patient care. Further, the AAMC QIPS Competencies derived from the New and Emerging Areas in Medicine Series define interdisciplinary competencies for QIPS across the continuum of physician education and inform curriculum development in patient safety.

Training programs must also provide residents and fellows with experiential training with their faculty members (e.g., simulated or actual patient care experience) in the clinical site’s process for disclosing patient safety events to patients and families and ensure that residents and fellows are involved with faculty members in disclosing patient safety events to patients and families at the clinical site.

Project Goals

The overarching goals are to:

  • Motivate, encourage, expect, and facilitate residents to report adverse events and near misses
  • Provide insight to residents on the rationale for event reporting
  • Increase awareness of the types, themes, and categories of adverse events, medical errors, and near misses, and how to identify them in the clinical care environment when they result from medical care
  • Identify “triggers” that suggest that an adverse event might have resulted from care
  • Demonstrate how to report adverse events in the central reporting system, incorporating self-identification when reporting to ensure feedback, and validate a culture of safety
  • Describe the value of, and means for, residents and fellows engaging actively in experiential learning by participating in adverse event investigations at their respective clinical training sites and contributing to action plans to prevent recurrence and reduce healthcare-related harm

Learning Objectives and Competency-based Performance Expectations

At the conclusion of this educational and training program, learners will be able to:

  • Discuss the rationale for reporting adverse events and be knowledgeable about the common types of events and medical errors to report, as well as “triggers” in the clinical setting that signal an adverse event may have occurred.
  • Demonstrate the knowledge and skill of a practice-specific protocol for reporting safety events and hazards in the site’s central event reporting system without fear of retaliation, intimidation, or shame; and expect specific feedback on the events they report, the action plans implemented, and the impact of their reporting on patient safety improvement within the healthcare system.
  • Realize the incremental value that residents provide through their participation in Root Cause Analysis (RCA) activities and their suggestions and recommendations regarding risk reduction strategies and solutions that prevent recurrence, reduce harm, and result in system-based improvement.
  • Reinforce a culture of safety by role-modeling self-identification when reporting adverse events, providing recommendations for safety improvement, and promoting feedback on the adverse events reported.
  • Participate in site-specific investigation teams and analyses of patient safety events and RCAs to target improvement efforts, risk reduction, and error-prevention strategies.
  • Discuss the types and themes of adverse events, near misses, and triggers; report patient safety events proactively; set an example of self-identification when reporting; and advocate that feedback be provided.
  • Distinguish which reported events result in patient harm, require RCAs, and lead to effective and sustainable improvement and impact on the healthcare system’s patient safety mission.
  • Evaluate the impact of reports submitted by residents and acted upon, the effectiveness and sustainability of action plans to prevent recurrence, and lessons learned.

Educational Design, Strategies and Methods

We have developed a curriculum using mixed educational methods and formats (both virtual online and live) to enhance initial learning and retention. These include:

  • Didactic slide presentation for all training programs (available for either in-person or virtual presentation): “Why, when, how, and what to report”; promoting self-identification when reporting, the expectation of feedback, and invitations to site adverse event investigation teams.
  • Online demonstration video on how to report in our RCA reporting system.
  • Active process to promote residents’ participation in RCAs, both during and in follow-up.
  • Discussion groups, case-based examples of adverse events, and video examples of role play and simulation will be included in the didactic presentations or made available on demand for self-directed learning.
  • Breakout sessions in which residents share their experiences of being involved in an adverse event/medical error and of reporting events, with follow-up report-out to the group and discussion.
  • Pocket card, distributed to residents and fellows, listing themes of adverse events, triggers, and expectations of trainees when reporting.
  • Each training Program Director will be asked to assign a “Resident Champion” to encourage reporting by colleagues, summarize each month the RCAs reported by their respective residents, and ensure that they receive feedback.
  • The RCAs and their themes reported by residents will continue to be provided to each training Program Director on a monthly basis. The control chart of the volume of RCAs reported by residents, themes, % anonymous, and near misses will be presented to the system-wide Patient Care Improvement Committee (now a standing agenda item).
  • All patient safety nurses/RCA organizers have committed, as a quality standard, to inviting residents to site RCAs based on relevance to the respective training program; invitations are sent to the Program Directors one week before the RCA to encourage and support participation.
  • Each Service Faculty Attending will be strongly encouraged to meet with their rounding clinical team once a week, during signout/transition-of-care handoff, to review adverse events identified during the week.
  • A video of a “safe room” review of simulated adverse events will be shown during the didactic presentation. An “escape room” will be created to identify and report simulated examples of adverse events in a safe place, without fear of retaliation, blame, or embarrassment.
  • Senior residents will be trained on potential “triggers” that may signal that an adverse event occurred and will review them with the rounding teams at the start of each monthly inpatient rotation to incorporate them into clinical rounds.
  • Residents will provide patients with scripting to speak up on rounds regarding perceived adverse events, medication errors, and breakdowns in care provision.
  • Event reporting simulation/role modeling will be available on demand to programs requesting additional training; small group discussions and breakout sessions with video review and simulation will be included.
  • Develop, implement, and incorporate patient safety training into the clinical learning environment: encourage resident participation in interprofessional unit-based safety huddles/rounds to contribute to and partner in the identification, review, and reporting of adverse events; advocate unit-based Kaizens to promote and improve interprofessional event reporting; and use unit-based safety huddles to review identified adverse events.


Evaluation and Feedback

We are using a variety of learner and program assessment methods:

    • Miller’s Pyramid will be used to evaluate overall competency (knows, knows how, shows how, does) via testing and direct observation at the site of care
    • Pre- and post-tests to assess learner knowledge of the types of adverse events, identification of adverse events to report on rounds, and acknowledgment of the impact of culture, psychological safety, and speaking up on event reporting
    • Surveys of 1) satisfaction with the curriculum; 2) teaching and experiential learning; 3) competency/skill and confidence to report and identify oneself; 4) receiving feedback of event reported and being invited to participate in RCA
    • Demonstration of identification of an adverse event on rounds, response and how to report (actual, simulated with video feedback vs role-play)
    • Resident participation on RCAs now being tracked
    • Feedback by patient safety nurses and data analytics regarding the PSI impact of reports submitted by residents
    • Monthly control chart of resident RCA reporting
    • Resident-sensitive patient safety measures – impact on reduction of adverse events and prevention of recurrence based on events reported by residents
    • Learner assessment of the following by testing, survey, and direct observation:
      • Defines and differentiates adverse events and near misses
      • Demonstrates knowledge of the practice-specific protocol for reporting safety events
      • Describes common types of human error
      • Describes preventable adverse events in the just-culture framework and differentiates events that could be remedied with coaching from events that are due to system problems; identifies system factors and designs that contribute to safe environments of care
      • Identifies opportunities to engage patients and families in improving safety at both the individual and organizational levels, and elicits information from patients and families to identify patient safety hazards or barriers to effective care delivery

Program Evaluation

Our initial evaluations of the program revealed that many of our core learning objectives are being met. We are in the process of collecting additional program assessment data via standardized surveys of residents, review of RCA metrics and monthly control charts, pre- and post-knowledge tests, etc. This is a complex program to evaluate and will take time and experience.

Concluding Remarks

We need to make it easier, more relevant, more understandable, and more effective for residents to detect, identify, and report adverse events resulting from or contributed to by medical care; to motivate organizational learning around patient safety; and to enable trainees to participate actively in strategies to prevent patient harm. Knowledge of the common safety hazards, errors, and adverse events that are likely to occur, and of candidate “triggers” that may signal that an adverse event has occurred, is foundational to increased awareness, understanding, and reporting. We will not achieve highly reliable, safe care and prevention of harm until fully engaged frontline staff are empowered with an understanding of the rationale for adverse event reporting and knowledge of why, what, and how to report, are given opportunities to learn from these events and to analyze risks and vulnerabilities, and participate directly in system solutions to prevent recurrence. This systematic, comprehensive, experiential, competency-based educational approach attempts to achieve these goals and deliverables. To address these challenges, we will narrow the scope of the project and target training programs that are committed and enthusiastic about participating, have strong leadership support and protected time granted, and have strategic resources that are readily available, supported, and maintained.

Project #3: Enhancing the Quality Improvement and Patient Safety (QIPS) Curriculum for a Community-Based Transitional Year Residency Program

Institution

Wellstar Health System Kennestone Hospital, Marietta, GA

Project Team

Joanne Zhu, MD

Background/Context

We are a community-based Transitional Year Residency program at the largest healthcare system in the state of Georgia. Our program has 20 residents per year pursuing 5-6 different specialties. For the first several years, we shared a QIPS curriculum with our Internal Medicine Residency Program. Two years ago, I started a TY-specific QIPS curriculum, separate from that of the IM Residency Program, given the increased resident complement in both programs.

Educational Need

Based on the QIPS competencies introduced by Te4Q, the following additional educational items need to be added to the existing curriculum:

  • Incorporate social determinants of health into our current QIPS curriculum (gap).
  • High-value care, with the intention of improving the time and cost efficiency of care (competency).
  • Emphasize effective teamwork and interprofessional communication (competency).
  • The patient perspective in the context of patient safety (competency).

Project Goal

Knowing that the original version of the curriculum met ACGME expectations, the program sets out the following specific goals:

  • Further enhance the curriculum’s educational value and efficiency over time.
  • Customize the curriculum for a one-year duration.
  • Allow 100% participation by Transitional Year residents.

Learning Objectives

After completing this curriculum, Transitional Year residents should be able to:

  • Describe three examples of challenges we face in care delivery within the US healthcare system.
  • Utilize a fishbone diagram and flowchart.
  • Conduct PDSA cycles and data analysis.
  • Identify three common medical errors.
  • Distinguish among near misses, adverse events, and sentinel events.
  • Describe the value of a patient safety reporting system.
  • Recognize waste in care delivery and use an evidence-based approach to reduce such waste.
  • Cite social determinants of health, including cost concerns, when delivering care.
  • Identify health disparities based on quality measures.
  • Demonstrate proficiency via 360 evaluations in interdisciplinary communication with patients, families, and team members.
  • Engage patients and families for shared decision-making in care delivery.

Educational Design

The design is intended to teach QIPS and to allow residents to take real action in hospital-based QIPS initiatives over a one-year period. The curriculum will start during new intern orientation week. The exercises and actual work will span the entire intern year.

  • Didactics on the current state of US healthcare delivery and on basic concepts and methodology in the IHI Model for Improvement (3 hours).
  • Small group-based simulations including ongoing hospital QI initiatives. Patient Safety scenarios will be added to the simulation portfolio. (3 hours)
  • Pre- and post-assessment of learners and the program (beginning and end of the academic year).
  • All residents will be engaged in hospital-based QIPS projects.
  • Group-based (teaching units) metric performance data will be shared throughout the academic year.
  • 100% participation in our TY-specific Disclosure of Adverse Events Workshop (3 hours).
  • High-value care (HVC) may be incorporated into this curriculum, likely via TY-specific noon conferences.
  • Participation in hospital-led RCA meetings throughout the year, with a minimum of one meeting per resident per year.
  • Identify key quality representatives in the hospital to share real-world perspectives.
  • Create teaching on how to publish quality projects.

Program Evaluation Measures

We plan to use the rest of this academic year to develop an evaluation tool for the program and to implement it in June 2022. We expect this tool to assess whether our QI curriculum:

  • Effectively raises residents' confidence in conducting QI projects and in communicating and analyzing patient safety events.
  • Improves residents' knowledge and skill level in QIPS.
  • Has a duration conducive to learning, specifically for TY residents.
  • Adequately closes the skill and knowledge gap in QIPS for residents at their respective levels.

Learner Assessment

We plan to develop a learner assessment tool to assess learners’ individual knowledge and skill levels, as well as their respective needs in QIPS, at the beginning and the end of each academic year. "Criteria for Effective Disclosure" and the QIKAT (Quality Improvement Knowledge Application Tool) will be used during the curriculum or afterwards.

Implementation/Status

We will implement this during the next QI curriculum in June 2022 and plan to engage one or two quality officers from our hospital. We are also in the process of identifying a patient educator. We do not anticipate any difficulties in getting buy-in: our hospital has a robust QIPS program and supports all efforts made by GME in QIPS.

Barriers/Challenges

Two barriers have been identified. First, we have only limited time when all 20 residents are available to receive the QI curriculum; otherwise, the curriculum may have to be repeated two or three times so that all residents receive the education. Second, we do not have a way to collect performance data on individual residents, so reporting will need to be based on group-level metric performance.


