Research Article | Volume 10, Issue 1, Supplement, S4-S9, January 2003

Resident Recruitment

      A radiology residency program is only as good as its residents. Successful recruitment, therefore, is the necessary first step to a good program. One prerequisite to successful recruitment is institutional and program accreditation by the Accreditation Council for Graduate Medical Education (ACGME). ACGME accreditation serves a number of critical purposes, not the least of which is the demonstration of commitment to educational excellence. A program needs a way to identify and select residents who would be the best matches for the department, in a mutually beneficial sense. Once the department has established its criteria for resident selection, the resident selection committee can take advantage of modern computerized databases, using filters to identify appropriate applicants. After identifying the resident applicants it wants to train, the program submits its ranked list to the National Residency Matching Program (NRMP), which finds the best possible fit among applicants and programs. A residency program following these guidelines should prove successful in resident recruitment, which is the first step in training the capable radiologists of tomorrow.

      ACGME Institution and Program Recruitment Guidelines and Requirements

      The ACGME is a voluntary organization comprising five member organizations: the American Board of Medical Specialties, the American Hospital Association, the American Medical Association, the Association of American Medical Colleges, and the Council of Medical Specialty Societies. While the accreditation of residency programs is a voluntary process, a program must acquire and maintain accreditation to receive reimbursement from the federal government for training residents. Moreover, ACGME accreditation establishes and maintains standards that demonstrate to the public, current and potential future residents, medical specialty boards, and state licensing agencies that the given accredited program is in substantial compliance with ACGME standards and has assumed responsibility to educate its residents.
      Although there are many guidelines and standards to which a program must adhere, this article deals only with those that are relevant and important to the recruitment process. First, the institution and the program must jointly state, in writing, their policies and procedures for the identification and selection of eligible residents. Applicants are eligible for appointment as residents if they are graduates of U.S. or Canadian medical schools accredited by the Liaison Committee on Medical Education, or graduates of U.S. osteopathic medical colleges accredited by the American Osteopathic Association (1). Graduates of international medical schools are eligible for appointment if they hold a current certificate from the Educational Commission for Foreign Medical Graduates (ECFMG) or a full and unrestricted state medical license, or if they complete a Fifth Pathway program provided by a medical school accredited by the Liaison Committee on Medical Education.
      ECFMG certification requires documentation of successful completion of and graduation from a medical school listed in the World Directory of Medical Schools (2); passing scores on the United States Medical Licensing Examination (USMLE) steps 1 and 2; demonstration of English proficiency, as indicated by passage of the Test of English as a Foreign Language examination; and a passing score on the ECFMG Clinical Skills Assessment. All international medical school graduates applying for medical residency in the United States, regardless of their citizenship status, must fulfill these requirements. In addition, international medical school graduates who are neither U.S. citizens nor permanent residents must acquire the correct visa from the U.S. immigration service to participate in ACGME training programs.
      The ACGME has established specific institutional requirements to facilitate and monitor all aspects of residency education. These requisites include the existence of both institution and program education committees that must meet regularly, establish administrative policies, provide oversight of the educational objectives, monitor the adequacy of resources, and assure that an adequate evaluation system is in place.
      Several other articles in this Supplement address the ACGME guidelines and requirements in greater detail: They discuss the qualifications and responsibilities of the program director, program facilities and resources, and educational program (3); program policies regarding resident supervision, duty hours, and scholarly activity (4); and evaluation of residents, faculty members, and the program (5). New and experienced program directors are encouraged to review the ACGME institutional and program requirements, reproduced in Appendixes 1 and 2 (6), and to visit the ACGME Web site periodically for policy updates and announcements.

      Electronic Residency Application Service

      The Electronic Residency Application Service (ERAS) has standardized and simplified the residency training application process for applicants and programs. Developed by the Association of American Medical Colleges, the ERAS was pilot tested in 1995 in a number of obstetrics and gynecology residency programs, with remarkable success; by 2002, the ERAS was available to all postgraduate year 1 and 2 level residency programs in all specialties. The ERAS has four components: Web-based software for residency applicants, software for the dean's office, the program director's workstation software, and the ERAS post office. The ERAS post office provides applicants and medical schools an efficient, reliable mechanism for transmitting the various components of applications to residency programs electronically and securely. The components of a particular application are made available for download from the ERAS Web site to the residency programs selected by the applicant; program directors prompt the download by electronic request.
      Program directors can download the ERAS program director's workstation system software by accessing the ERAS Web site (www.aamc.org/audienceeras.htm) and entering the program user name and password provided by the Association of American Medical Colleges in the program director's kit. The kit also includes a detailed workstation manual, which will guide the new program director and/or residency coordinator through the setup and use of the program director's workstation.
      Before the ERAS was available, many programs used institution-specific paper applications and numerous supporting documents that were mailed separately. Because deans' letters were routinely not received until early November, interview committee members had only 10–14 days to review completed applications and identify applicants who would be offered interviews. There was little opportunity for repeated review of files or routine review by multiple reviewers. Remarkably, despite the considerable power, flexibility, and ease of use of the ERAS, many programs still do not take advantage of this service to filter, score, and sort applications electronically but instead use it only to generate paper copies of applications for manual review (7).
      The ERAS program director's software is a powerful, simple, and flexible tool for reviewing applications electronically. It offers program directors the capacity to filter or sort applications by using one or more preset or customized criteria (eg, U.S. medical school graduate, USMLE scores above or below a particular level, or Alpha Omega Alpha [AOA] medical honor society member). Program directors can also score application documents by using the data fields provided to assign scores to the applicant's transcript, dean's letter, letters of recommendation, and personal statement.
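      Although the filtering itself is performed inside the ERAS program director's workstation, the underlying idea is simple. The short Python sketch below applies comparable preset filters to applicant records held as plain dictionaries; the field names, cutoff values, and sample data are hypothetical and are not the ERAS schema.

        # Illustrative only: applying ERAS-style preset or custom filters to
        # applicant records held as plain dictionaries. Field names and cutoffs
        # are hypothetical, not the ERAS data model.
        applicants = [
            {"name": "A", "us_grad": True,  "usmle_step1": 235, "aoa": True},
            {"name": "B", "us_grad": False, "usmle_step1": 248, "aoa": False},
            {"name": "C", "us_grad": True,  "usmle_step1": 210, "aoa": False},
        ]

        filters = [
            lambda a: a["us_grad"],             # U.S. medical school graduate
            lambda a: a["usmle_step1"] >= 220,  # USMLE score at or above a chosen cutoff
        ]

        # Keep only applicants who pass every filter, then sort by score, highest first.
        shortlist = [a for a in applicants if all(f(a) for f in filters)]
        shortlist.sort(key=lambda a: a["usmle_step1"], reverse=True)
        print([a["name"] for a in shortlist])   # ['A']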
      Efforts to identify which factors in resident applications will best predict resident performance have yielded inconsistent results. Early work (8,9) suggested a correlation between scores on part 2 of the National Board of Medical Examiners examination (a USMLE equivalent) and success in residency; in contrast, Weiss et al (10) noted a complete lack of correlation between AOA membership and resident performance. Friedman (11), in a humorous but piercing commentary, likened letters of reference, including deans' letters, to products of fantasy, saying that their authors were afraid to “commit truth” in their evaluations. Wood et al (12) found that noncognitive factors, such as conscientiousness and interpersonal skills, were as important as cognitive factors in predicting resident behavior, and that standardized tools such as National Board of Medical Examiners/USMLE scores were not helpful. In an accompanying editorial, Hillman (13) observed that what is valued by faculty may differ from center to center. This observation was subsequently confirmed in separate surveys by Grantham (14) and Longmaid (15), who discovered important differences among programs in the number and type of criteria used in the radiology resident selection process.
      For the mutual benefit of the applicants and the programs, each program should establish, in writing, quantifiable criteria on the basis of which residency applicants will be selected for interviews and/or assigned a final ranking. For example, at the Beth Israel Deaconess Medical Center, Boston, Mass, each resident selection committee member separately weights each of 13 different application components according to their relative importance. The numeric values assigned each component by the various committee members are then averaged, and the results for all components are reviewed by the full committee.
      The assignment of a numeric value to each application component requires the development of scoring criteria for all components (except USMLE results, which are already reported in numeric terms). For purposes of standardization, each variable is given a value from 1 to 10. Additional scored data fields are provided for sustained extracurricular activities, sustained research, undergraduate and medical schools, and “other.”
      The ERAS automatically multiplies each data field score by its respective weighting, and sums the products by means of the “composite score” function, with 100 being a perfect score. Each of the 10 scored variables (including USMLE score) represents 2%–20% of the composite score; and with no single component dominating the composite, the result is a balanced quantitative assessment of all applications. Applications are then ranked by composite score. Interviews are offered to applicants based on their composite score ranking and on any compelling positive or negative notes made in the “Comment” section of the program director's work page.
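      For readers who prefer a concrete illustration of this weighted scoring, the minimal Python sketch below averages hypothetical committee-member weightings and computes a composite score on the 100-point scale described above. It is not the ERAS software; the component names, weights, and scores are invented for illustration, and the component set is abbreviated.

        # Minimal sketch of the weighted composite-score idea described above.
        # Component names and weights are hypothetical; this is not the ERAS itself.

        # Each committee member assigns a relative weight (percentage of the composite)
        # to every application component; the weights are then averaged.
        member_weights = [
            {"usmle": 20, "dean_letter": 15, "transcript": 15, "recommendations": 20,
             "personal_statement": 10, "research": 10, "extracurricular": 10},
            {"usmle": 15, "dean_letter": 20, "transcript": 10, "recommendations": 20,
             "personal_statement": 15, "research": 10, "extracurricular": 10},
        ]

        def average_weights(all_weights):
            """Average each component's weight across committee members."""
            components = all_weights[0].keys()
            return {c: sum(w[c] for w in all_weights) / len(all_weights) for c in components}

        def composite_score(scores, weights):
            """Each component is scored 1-10; weights are percentages summing to 100,
            so a perfect application earns a composite score of 100."""
            return sum(weights[c] * scores[c] / 10.0 for c in scores)

        weights = average_weights(member_weights)
        applicant_scores = {"usmle": 8, "dean_letter": 7, "transcript": 9,
                            "recommendations": 8, "personal_statement": 6,
                            "research": 5, "extracurricular": 7}
        print(round(composite_score(applicant_scores, weights), 1))  # 73.0

      Because the composite is recomputed from the stored component scores, changing the weighting scheme immediately reshuffles the ranking, which is the dynamic behavior noted below.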
      The standardized scoring of residency applications promotes fair and equitable assessment of all applicants' files and permits the most qualified applicant subgroup to be identified for interview offers. The scoring system described here has several advantages: (a) It works within the ERAS, enhancing administrative efficiency by eliminating the need to develop additional spreadsheets; (b) it provides more thorough applicant profiling; (c) it permits documents to be scored as they are received, so that file review is not delayed; (d) it accurately reflects the consensus of the interview committee; and (e) it permits dynamic reshuffling of the ranking by composite score if the component weighting is changed.
      Once candidates are selected for interview, they must all be considered equally in a fair and unbiased process. The members of the selection committee involved in the interview process should review the basic tenets of appropriate interviewing techniques. Two important principles are asking only questions relevant to the interviewee's candidacy and asking only for information necessary to resident selection (16). In addition, the interviewer must take great care to ask the same questions of all candidates, regardless of gender or ethnicity; interviewers must be mindful that prospective candidates may perceive impromptu personal inquiries as inappropriate and intrusive. Specifically, an interviewer should not initiate questions about age, marital status, family planning or pregnancy issues, issues of child care, religion or creed, ethnic or minority status, or physical limitations. In my experience, open-ended questions that allow candidates to demonstrate their capacity for self-reflection, maturity, and interpersonal skills are particularly useful for identifying the best candidates in a group of competitive applicants. Applicants should be given the opportunity to interact freely with radiology residents without faculty present and should be encouraged to contact the program or individual faculty members or residents with questions or interests after the interview. Most experienced program directors actively involve residents in the interview process.

      National Residency Matching Program

      Once the application and the interview process are completed, the residency programs and applicants submit their preferences, in the form of a rank-order list, to the NRMP, a private, nonprofit organization that aims to provide “an impartial venue for matching applicants' and programs' preferences for each other consistently” (17). The NRMP uses a matching algorithm to pair applicants and programs. The NRMP Web site provides a detailed description of how the process functions, summarized in the following excerpt:

         The NRMP matching algorithm uses the preferences expressed in the Rank Order Lists submitted by applicants and programs to place individuals into positions. The process starts off with an attempt to place an applicant into the program indicated as most preferred on that applicant's list. If the applicant cannot be matched to this first choice program, an attempt is then made to place the applicant into the second choice program, and so on, until the applicant obtains a tentative match, or all the applicant's choices have been exhausted (18).
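      The excerpt describes an applicant-proposing match. As a rough illustration of that general idea, the Python sketch below runs a simplified deferred-acceptance loop over hypothetical rank-order lists; it omits couples matching and the other special cases the actual NRMP algorithm handles.

        # Simplified applicant-proposing deferred acceptance, in the spirit of the
        # excerpt above. All names, rank-order lists, and quotas are hypothetical.
        applicant_rols = {                  # applicants' rank-order lists
            "ann":  ["mercy", "city"],
            "ben":  ["city", "mercy"],
            "cara": ["city"],
        }
        program_rols = {                    # programs' rank-order lists
            "mercy": ["ben", "ann"],
            "city":  ["cara", "ann", "ben"],
        }
        quotas = {"mercy": 1, "city": 1}

        def match(applicant_rols, program_rols, quotas):
            rank = {p: {a: i for i, a in enumerate(lst)} for p, lst in program_rols.items()}
            tentative = {p: [] for p in program_rols}     # tentatively matched applicants
            next_choice = {a: 0 for a in applicant_rols}  # next program to try, per applicant
            free = list(applicant_rols)
            while free:
                a = free.pop()
                choices = applicant_rols[a]
                if next_choice[a] >= len(choices):
                    continue                              # applicant's list is exhausted
                p = choices[next_choice[a]]
                next_choice[a] += 1
                if a not in rank[p]:
                    free.append(a)                        # program did not rank this applicant
                    continue
                tentative[p].append(a)
                if len(tentative[p]) > quotas[p]:
                    # The program keeps its most preferred applicants; the displaced one tries again.
                    tentative[p].sort(key=lambda x: rank[p][x])
                    free.append(tentative[p].pop())
            return {p: sorted(matched) for p, matched in tentative.items()}

        print(match(applicant_rols, program_rols, quotas))
        # {'mercy': ['ben'], 'city': ['cara']}  -- ann's list is exhausted, so she goes unmatched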
      The NRMP matches applicants into two types of positions: categorical and advanced. A program can choose to offer a categorical position if the required clinical year of training is built into the residency, making it a 5-year residency position. Most positions in radiology residency programs are in the advanced category, and a candidate enters such a position after the completion of a clinical internship year.
      Program directors should be mindful of the yearly NRMP match schedule, which is posted at the NRMP Web site (19). Through their institutional graduate medical education offices, programs commit to fill their residency positions through the NRMP matching process. Programs must submit any changes in their quotas to the NRMP by the end of January of the match year, and rank-order lists are due by mid-February. In mid-March, the NRMP posts the results of the matching on secure, access-controlled Web sites. After that, programs send contract letters to the applicants who have matched with them.

      ACGME Requirements for Diagnostic Radiology Residency Education

      The diagnostic radiology residency, according to ACGME standards (20), must include 5 years of “clinically oriented graduate medical education, of which 4 years must be in diagnostic radiology.” The clinical year must be fulfilled in a program in internal medicine, pediatrics, surgery or surgical specialties, obstetrics and gynecology, neurology, family practice, or emergency medicine, or in a transitional-year program, accredited by the ACGME or the Royal College of Physicians and Surgeons of Canada. Radiology programs offering a 5-year categorical position, in which the clinical year is part of the core residency and not a separate ACGME-accredited year, must assure the quality of clinical training in that year. In such circumstances, the clinical year should be completed within the first 24 months of training. Additional requirements include the following: (a) at least 42 of the 48 months in radiology must be in the parent or integrated institution(s); (b) the maximum training period in any subspecialty is 12 months; and (c) the minimum training period in nuclear medicine is 6 months.
      The type of clinical internship a resident completes before beginning radiology training appears to affect the radiology residency program's level of reimbursement from the Centers for Medicare and Medicaid Services (CMS; formerly the Health Care Financing Administration [HCFA]) for that resident's training. Training institutions receive CMS funding to train residents, with the amount often exceeding $100,000 per resident per year. Institutions are increasingly sensitive to the possibility of reduced reimbursement for their training of residents who do not qualify for 100% reimbursement status because of previous training choices. In a January 21, 2002, letter to all diagnostic radiology program directors, M. Paul Capp, MD, executive director of the American Board of Radiology, summarized his understanding of the prevailing CMS policy, based on an April 26, 1999, letter from Nancy-Ann Min DeParle, administrator of the Health Care Financing Administration, Department of Health and Human Services (21). Given the timeliness and importance of this issue, Dr Capp's letter is quoted verbatim here:

         If a resident takes the PGY-1 [postgraduate year 1] as a transitional year, then the following four years of Diagnostic Radiology will be reimbursed by HCFA at the 100% level, for a total of five years.

         If a resident takes a primary care internship of any type that requires three or four years of training, and that year is followed by four years of Diagnostic Radiology, then HCFA will reimburse 100% for the three or four years, followed by 50% reimbursement for the remaining training in Diagnostic Radiology.

         If the clinical year is incorporated into the five years of total training and is identified as five years of training in Diagnostic Radiology by that institution, then HCFA “most likely” will pay at the 100% level for five years. In my discussions with them in this particular case, if the individual starts off the five years in the primary care specialty for one or two months, then it's not clear what they will pay for. But one could argue that since the entire five years is designated in Diagnostic Radiology (and the green book requires five years of training), HCFA most likely will pay at the 100% level for the five years. In this situation I would suggest that training programs start the resident in Diagnostic Radiology for at least one or two months before entertaining any rotation in patient care.
      Dr Capp concluded by advising that all medical students pursuing diagnostic radiology should take a declared transitional year or a clinical year in a program where the requirement for training is at least 5 years (eg, surgery).

      Diagnostic Radiology Residency Programs in the United States

      In March 2002, more than 850 applicants were matched into diagnostic radiology year 1 positions, most of whom started their radiology training in July 2003 as postgraduate year 2 residents. As of June 2002, there were 3,799 residents in 193 ACGME-accredited diagnostic radiology residency programs (22), with 181 programs declaring a university base or affiliation and the other 12 citing no medical school affiliation (23). The minimum ACGME-required number of radiology residents per year in any one residency program is two, for a total of eight residents in a 4-year program. The maximum number of residents in a 4-year program is determined by several factors, including a minimum of “one full-time equivalent physician faculty member at the parent and integrated institutions for every resident in training in the program”; no fewer than 7,000 radiologic examinations per year per resident; and sufficient subspecialty volume and patient variety to “ensure that residents gain experience in the full range of radiologic examinations, procedures, and interpretations” (20).
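      As a back-of-the-envelope illustration of how these requirements bound program size, the short sketch below computes the maximum resident complement implied by the faculty and examination-volume rules quoted above; the institutional figures used are hypothetical.

        # Back-of-the-envelope illustration of the ACGME sizing constraints quoted
        # above; the faculty and volume figures below are hypothetical examples.
        faculty_fte = 28          # full-time-equivalent physician faculty members
        annual_exams = 180_000    # radiologic examinations per year at the institution(s)

        max_by_faculty = faculty_fte            # at least one FTE faculty member per resident
        max_by_volume = annual_exams // 7_000   # no fewer than 7,000 examinations per resident per year
        max_total_complement = min(max_by_faculty, max_by_volume)

        print(max_by_faculty, max_by_volume, max_total_complement)  # 28 25 25
        # Spread over a 4-year program, this supports roughly 25 // 4 = 6 residents per year,
        # comfortably above the ACGME minimum of 2 per year (8 residents in total).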
      If the program wishes to expand its complement of residents, it may do so at the time of its application for reaccreditation or by separate petition to the ACGME Residency Review Committee for Diagnostic Radiology. Once the program receives ACGME approval for expansion, it must then work with its own institutional graduate medical education committee to determine whether the total number of filled resident and fellowship positions in the institution is within the institutional cap established by the CMS. If the total resident and fellow complement exceeds the approved number for the institution, the institution will not be reimbursed by the CMS for the excess positions.
      Because the regulations governing federal reimbursement of direct and indirect costs of resident training are complex, a program director pursuing additional residency positions should review the financial implications with both the director of the institutional graduate medical education committee and a representative of the institution's cost and payment center. Both are likely to be familiar with current CMS rules governing reimbursement. Given the current and projected unmet demand for diagnostic radiologists, national radiology organizations such as the American Board of Radiology and the American College of Radiology are working to identify new ways to increase the number and funding of training positions in diagnostic radiology. At the same time, the ACGME requires sponsoring institutions to develop and maintain contingency plans for protecting residents currently in training, in the unlikely event that a training institution must reduce its resident complement or close its training program.

      Summary

      This article has introduced the reader to the critical components of successful recruitment of radiology residents. With particular attention to the ACGME institutional and program requirements regarding resident recruitment, and an explanation of the support systems (ERAS and NRMP) currently available to those involved in applicant review and selection, the article has sought to delineate a sensible approach to recruitment. Successful recruiters have mastered the essentials of these programs and have learned to adapt the programs to their needs. As new program directors work with their departments' resident selection committees, they will identify the factors that faculty and current residents cite as most important in the successful selection of new residents. By structuring the application review process, exploiting the power of the ERAS, and crafting a purposeful and friendly interview process, radiology residency directors can find and recruit the residents who best match their programs.

      Acknowledgements

      I thank Parker Rider-Longmaid for his patient transcription and editing of this document and for his uncanny capacity for bringing his father out of the computer darkness and Web black holes. I express my deep gratitude to Herbert Gramm, MD, former radiology program director at the New England Deaconess Hospital, Boston, Mass, and to Ferris Hall, MD, former radiology program director at the Beth Israel Hospital and Beth Israel Deaconess Medical Center, for their wisdom and mentoring. I also thank Richard Jennette, program coordinator at the Beth Israel Deaconess Medical Center, for his quiet, unflappable patience and organizational talents; indeed, a capable program coordinator is probably second only to the residents in bringing success to any program.

      References

      1. Resident eligibility and selection. ACGME institutional requirements. Available at: www.acgme.org/IRC/Ircpr702.asp. Accessed June 23, 2002.
      2. World Health Organization. World directory of medical schools. 7th ed. Geneva, Switzerland: World Health Organization, 2002.
      3. Mainiero MB. Responsibilities of the program director. Acad Radiol 2003; 10:S16-S20.
      4. Collins J. Program documents: policies and guidelines. Acad Radiol 2003; 10:S21-S23.
      5. Collins J. Internal program review. Acad Radiol 2003; 10:S44-S47.
      6. Appendix 1: ACGME program requirements for residency education in diagnostic radiology; and Appendix 2: ACGME institutional requirements. Acad Radiol 2003; 10:S102-S115.
      7. Longmaid HE III. A “how-to” user's guide to exploiting ERAS potential in resident application assessment. Acad Radiol 2000; 7:1047.
      8. Yindra KJ, Rosenfeld PS, Donnelly MB. Medical school achievements as predictors of residency performance. J Med Educ 1988; 63:356-363.
      9. Keck JW, Arnold L, Willoughby L, Calkins V. Efficacy of cognitive/noncognitive measures in predicting resident physician performance. J Med Educ 1979; 54:759-765.
      10. Weiss ST, Rosa RM, Jofe T, Munoz B. A prospective evaluation of performance during the first year of the medical residency. J Med Educ 1984; 59:967-968.
      11. Friedman RB. Sounding board. Fantasy land (editorial). N Engl J Med 1983; 308:651-653.
      12. Wood PS, Smith WL, Altmaier EM, Tarico VS, Franken EA Jr. A prospective study of cognitive and noncognitive selection criteria as predictors of resident performance. Invest Radiol 1990; 25:855-859.
      13. Hillman BJ. Residency selection: you can't always get what you want. Invest Radiol 1990; 25:761-762.
      14. Grantham JR. Radiology resident selection: results of a survey. Invest Radiol 1993; 28:99-101.
      15. Longmaid HE III. ERAS use by radiology residency directors: results of a survey. Acad Radiol 2000; 7:1047.
      16. Schultz HJ. Applicant interviewing practices. In: Program manual. 6th ed. Washington, DC: Association of Program Directors in Internal Medicine, 2002:205-207.
      17. About the NRMP. Available at: www.nrmp.org/about_nrmp/. Accessed March 13, 2002.
      18. How the match algorithm works. Available at: www.nrmp.org/res_match/algorithms.html. Accessed June 26, 2002.
      19. NRMP main match schedule for 2003. Available at: www.nrmp.org/res_match/yearly.html.
      20. Program requirements for residency education in diagnostic radiology. Available at: www.acgme.org/req/420pr701.asp. Accessed March 4, 2002.
      21. DeParle NAM. Letter to M. Paul Capp, MD, executive director of the American Board of Radiology. Available at: www.apdr.org/pdf_files/funding-clarification.pdf. Accessed October 5, 2002.
      22. Reports: number of programs by specialty. Available at: www.acgme.org/adspublic/reports/Aspecialty_prognum.asp. Accessed June 24, 2002.
      23. List of ACGME accredited programs and sponsoring institutions. Available at: www.acgme.org/adspublic. Accessed June 24, 2002.