The transition from undergraduate medical education (UME) to graduate medical education (GME) in the United States needs comprehensive reform, says a new report from the Undergraduate Medical Education to Graduate Medical Education Review Committee (UGRC) of the Coalition for Physician Accountability.
The 275-page report builds on preliminary findings released last April and includes a long list of stakeholder comments. According to the report, the Coalition will meet soon to discuss the final recommendations and consider next steps toward implementation.
The UGRC includes representatives of national medical organizations, medical schools, and residency programs. Among the organizations that participated in the report’s creation are the American Medical Association, the National Board of Medical Examiners, the American Osteopathic Association, the National Board of Osteopathic Medical Examiners, the Educational Commission for Foreign Medical Graduates, and the Association of American Medical Colleges.
The report identifies a list of challenges that affect the transition of medical students into residency programs and beyond. They include:
• Too much focus on finding and filling residency positions instead of “assuring learner competence and readiness for residency training”
• Inattention to assuring congruence between applicant goals and program missions
• Overreliance on licensure exam scores rather than “valid, trustworthy measures of students’ competence and clinical abilities”
• Increasing financial costs to students
• Individual and systemic biases in the UME-GME transition, as well as inequities related to international medical graduates
Seeking a Common Framework for Competence
Overall, the report calls for increased standardization of how students are evaluated in medical school and how residency programs evaluate students. Less reliance should be placed on the numerical scores of the US Medical Licensing Examination (USMLE), the report says, and more attention should be paid to the direct observation of student performance in clinical situations. In addition, the various organizations involved in the UME-GME transition process are asked to work better together.
To develop better methods of evaluating medical students and residents, UME and GME educators should jointly define and implement a common framework and set of competencies to apply to learners across the UME-GME transition, the report suggests.
While emphasizing the need for a broader student assessment framework, the report says USMLE scores should continue to be used in judging residency applicants: “Assessment information should be shared in residency applications and a postmatch learner handover. Licensing examinations should be used for their intended purpose to ensure requisite competence.”
Among the committee’s three dozen recommendations are the following:
• The Centers for Medicare and Medicaid Services should change the GME funding structure so that the initial residency period is calculated starting with the second year of postgraduate training. This change would allow residents to reconsider their career choices. Currently, if a resident decides to switch to another program or specialty after beginning training, the hospital may not receive full GME funding, so may be less likely to approve the change.
• Residency programs should improve recruitment practices to increase specialty-specific diversity of residents. Medical educators should also receive additional training regarding antiracism, avoiding bias, and ensuring equity.
• The self-reported demographic information of applicants to residency programs should be measured and shared with stakeholders, including the programs and medical schools, to promote equity. “A residency program that finds bias in its selection process could go back in real time to find qualified applicants who may have been missed, potentially improving outcomes,” the report notes.
• An interactive database of GME program and specialty track information should be created and made available to all applicants, medical schools, and residency programs at no cost to applicants. “Applicants and their advisors should be able to sort the information according to demographic and educational features that may significantly impact the likelihood of matching at a program.”
Less Than Half of Applicants Get In-Depth Reviews
The 2020 NRMP Program Director Survey found that only 49% of applications received in-depth review. In light of this, the report suggests that the application system be updated to use modern information technology, including discrete fields for key data to expedite application reviews.
Many applications are discarded because of the various filters that programs use to screen out certain candidates. The report suggests that new filters be designed to ensure that each detects meaningful differences among applicants and promotes review based on mission alignment and likelihood of success in a program. Filters should also be improved to decrease the likelihood that qualified applicants are randomly excluded.
Specialty-specific, just-in-time training for all incoming first-year residents is also suggested to support the transition from the role of student to that of a physician ready to assume increased responsibility for patient care. In addition, the report urges that adequate time be allowed between medical school graduation and residency so that new residents can relocate and find housing.
The report also calls for a standardized process in the United States for initial licensing of doctors at entrance to residency in order to streamline the process of credentialing for both residency training and continuing practice.
Osteopathic Students’ Dilemma
To promote equitable treatment of applicants regardless of licensure examination requirements, comparable exams with different scales (COMLEX-USA and USMLE) should be reported within the electronic application system in a single field, the report said.
Osteopathic students, who make up 25% of US medical students, must take the COMLEX-USA exam, but residency programs may filter them out if they don’t also take the USMLE exam. Thus, many osteopathic students take both exams, incurring extra time, cost, and stress.
The UGRC recommends creating a combined field in the electronic residency application service that normalizes the scores between the two exams. Residency programs could then filter applications based only on the single normalized score.
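The report does not specify how such normalization would work; one common statistical approach is z-score mapping, sketched below. The cohort means and standard deviations here are purely illustrative assumptions, not published statistics for either exam.

```python
# Hypothetical sketch of z-score normalization across two exam scales.
# All mean/SD values below are illustrative assumptions, not real
# COMLEX-USA or USMLE distribution parameters.

def to_z(score: float, cohort_mean: float, cohort_sd: float) -> float:
    """Convert a raw exam score to a z-score within its own cohort."""
    return (score - cohort_mean) / cohort_sd

def to_common_scale(z: float, target_mean: float = 500.0,
                    target_sd: float = 100.0) -> float:
    """Project a z-score onto a shared reporting scale."""
    return target_mean + z * target_sd

# A USMLE-style score and a COMLEX-style score, each expressed on the
# same hypothetical common scale (parameters are assumptions).
usmle_common = to_common_scale(to_z(240, 230, 20))    # -> 550.0
comlex_common = to_common_scale(to_z(580, 540, 80))   # -> 550.0
```

In this hypothetical scheme, a score half a standard deviation above its own cohort mean lands at 550 on the shared scale regardless of which exam produced it, so a program filtering on the single normalized field would treat the two applicants identically.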
This approach makes sense from the viewpoint that it would reduce the pressure on osteopathic students to take the USMLE, Bryan Carmody, MD, an outspoken critic of various current training policies, told Medscape Medical News. But it could also have serious disadvantages, he said.
For one thing, only osteopathic students can take the COMLEX-USA exam, he noted. If they don’t like their score, they can then take the USMLE test to get a higher score — an option that allopathic students don’t have. It’s not clear that they’d be prevented from doing this under the UGRC recommendation.
Second, he said, osteopathic students on average do not perform as well as allopathic students on the USMLE exam. If they take only the COMLEX-USA test, they are competing against other osteopathic students, who as a group do not score as well on standardized tests as allopathic students do. If their scores were normalized with those of USMLE test takers, they would gain an unfair advantage over students who can take only the USMLE, including international medical graduates.
Although Carmody acknowledged that osteopathic students face a harder challenge than allopathic students in matching to residency programs, he said the UGRC's approach to the licensing exams might actually penalize them further. If the scores of the two exams were normalized into a single field, residency program directors might simply discount the scores of all osteopathic students.