What to Know About Competency-Based Training

A new data-driven model ensures that emergency medicine residents have mastered the everyday physician activities that form the foundation of clinical practice.

Holly Caretta-Weyer, MD, MHPE, and a consortium of leaders within emergency medicine have used a five-year, $1.25 million grant from the American Medical Association (AMA) to develop an ecosystem of assessment and predictive learning analytics that moves the specialty toward competency-based medical education.

An entrustable professional activity (EPA) is an “everyday physician activity,” ranging from developing a differential diagnosis to resuscitating a victim of multisystem trauma. Caretta-Weyer and the team have identified 22 EPAs that serve as the basis for giving residents specific, real-time feedback on their performance. The feedback data are also used to detect trends by comparing a resident’s results with those of peers at the same point in training.

As a result, knowledge gaps are identified in less than half the time they previously took to surface, and residents can course-correct much more quickly, with coaches guiding them through the learning process.

Here’s how it works:

At the end of a shift, faculty attendings open a dedicated app developed by partners at the University of Michigan. After selecting the observed resident and EPA, faculty enter feedback on what the resident did well and at least one area for improvement. The feedback is typically short — just a sentence or two — but it is direct, specific, and timely, and faculty are coached on how to provide constructive suggestions.
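The article does not describe the app’s internals, but the workflow above implies a small structured record per observation. A minimal sketch in Python, with all field names and the entrustment scale invented for illustration (they are not the actual schema of the University of Michigan app):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical record for a single end-of-shift EPA observation.
# Field names and the entrustment scale are illustrative assumptions.
@dataclass
class EPAObservation:
    resident_id: str        # resident selected by the attending
    epa_id: int             # which of the 22 EPAs was observed
    entrustment_level: int  # e.g., 1 (direct supervision) to 4 (independent)
    strengths: str          # what the resident did well
    improvement: str        # at least one specific area to work on
    observed_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# Example entry: short, direct, specific, and timely.
obs = EPAObservation(
    resident_id="res-042",
    epa_id=7,
    entrustment_level=2,
    strengths="Prioritized the airway and communicated clearly with nursing.",
    improvement="State the working differential earlier during handoff.",
)
print(obs.epa_id, obs.improvement)
```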

According to Caretta-Weyer, a clinical associate professor of emergency medicine, the amount of feedback for emergency medicine residents at Stanford has increased tenfold, from 300 data points per year to more than 3,000.

Residents can view feedback almost immediately after their shift, and they can also access a personalized dashboard showing their performance trends over time. At Stanford, a competency committee evaluates the data for each resident, and an assigned faculty coach meets quarterly with residents to review the dashboard and map out a learning plan to address knowledge gaps and professional development opportunities.
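As a rough illustration of the trend detection the dashboard enables, a resident’s recent average rating can be compared with peers at the same point in training. Everything below (the function, rating scale, window, and threshold) is a hypothetical sketch, not the consortium’s actual analytics:

```python
from statistics import mean

# Illustrative sketch of a peer comparison a dashboard might surface.
# The rating scale, window, and threshold are assumptions for this example.
def trend_flag(resident_ratings: list[int],
               peer_ratings: list[int],
               window: int = 10,
               gap_threshold: float = 0.5) -> str:
    """Flag a potential gap when the resident's recent average rating
    trails peers at the same point in training by more than the threshold."""
    recent = resident_ratings[-window:]      # most recent observations
    gap = mean(peer_ratings) - mean(recent)
    return "review with coach" if gap > gap_threshold else "on track"

# Example: entrustment ratings on a 1-4 scale
print(trend_flag([2, 2, 2, 3, 2, 3], [3, 3, 2, 3, 4, 3, 3]))
# -> "review with coach" (recent average trails the peer average by ~0.67)
```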

The project has been implemented at six test sites around the country, including Stanford. Caretta-Weyer and the team are now translating this work into national standards for emergency medicine residency on several fronts:

A task force led by the Council of Residency Directors in Emergency Medicine (CORD) is examining results from the test sites with an eye toward operationalizing the model in an adaptable way across all emergency medicine programs.

The American Board of Emergency Medicine (ABEM) recently convened representatives from the Society for Academic Emergency Medicine (SAEM), CORD, and the AMA to explore adopting the EPAs as standards of measurement across the specialty and integrating them into Board eligibility.

In 2025, the SAEM Consensus Conference will focus on competency-based education, including the use of EPAs.

While Board certification has long been the litmus test for emergency medicine training, its pass/fail feedback arrives only after residency, and the multiple-choice questions may not always align with optimal learning and skill acquisition. In contrast, EPA-based competency training allows learning to be adjusted continually during residency, ensuring mastery of required skills well in advance of the Board exams.

The app and feedback dashboard could be used by programs of all sizes and budgets. Caretta-Weyer and the team are developing guidance on scaling the model’s other components, such as competency committee processes and coaching paradigms, to match each program’s available resources.

 

Spring 2024