
Managing the Competencies in Ophthalmology

Acknowledgement: This work was supported in part by an unrestricted grant from Research to Prevent Blindness, Inc., New York, NY. More detailed published versions of the material on this website can be found in the following articles:

  1. Lee AG, Carter KC. Managing the new mandate in resident education: a blueprint for translating a national mandate into local compliance. Ophthalmology 2004;111:1807-1812.
  2. Lee AG. The impact of the new competencies in ophthalmology. Surv Ophthalmol 2003;48:651-662.

Frequently asked questions

1. What are the driving forces behind the ACGME competency mandate?
There is increasing external pressure on graduate medical educators to reform and restructure the educational process. These outside forces include the general public, government regulatory agencies, public policy bodies, specialty and professional boards, and third-party payers. Each of these players has a different stake in the process and the outcome.
2. What is the ACGME?
The Accreditation Council for Graduate Medical Education (ACGME) is a private professional organization responsible for the accreditation of approximately 7,800 residency education programs, including ophthalmology. The ACGME accredits institutions to provide graduate medical education. It does not certify individuals as competent; that is the role of the individual training programs and their respective specialty boards (e.g., the American Board of Ophthalmology).
3. Why do we need the competencies?
The external stakeholders want proof of quality and evidence for confidence in the training, assessment, and certification of doctors in the U.S. Third-party payers, insurance companies, and the business world (the market) demand professional competency in practice-based and systems-based care, including knowledge and application of sound business principles, utilization review, managed care, and cost control. The "Government" and the "Public" expect accountability, responsibility, and proof of quality from physicians and their educators. Moreover, the certifying bodies and professional organizations (e.g., the Council of Medical Specialty Societies, the ACGME) and boards (e.g., the American Board of Medical Specialties, the American Board of Ophthalmology) are responsible for protecting the "public." The patients are the primary constituency of the credentialing organizations. David Leach MD has summarized the need for the competencies by emphasizing that public trust and confidence in the profession of medicine "depends not just on my (individual) competency but our (specialty) competency". There has been a developing consensus that the resident education process must be redefined and, in a sense, completely reinvented to meet the new demands. (3)
4. What is the rationale for these six competencies?
The idea of "competencies" is not a new one and the concept has evolved over time with contributions from numerous different sources. (5-18) The Council on Graduate Medical Education (COGME) summary resource paper included competencies in domains that are very similar to the final ACGME competencies. Likewise, the Pew Health Professions Commission enumerated twenty-one competencies with significant overlap with the ACGME competencies (Table 1). Other specialty organizations in medicine including the Federated Council for Internal Medicine (Task Force on the Internal Medicine Residency Curriculum), National Council of State Boards of Nursing, the American Association of Colleges of Osteopathic Medicine, and the American Academy of Family Physicians have made specific recommendations and developed curricula for their own set of competencies. Halpern et al. synthesized nine major reports on these physician competencies (15) and summarized the results in ten curriculum domains (Table 2). *
5. What are the six competencies?

In the late 1990s, the ACGME developed the six competencies (16-17) based upon an extensive review process including published curricula, reports, educational literature, surveys, and interviews; consultations with a representative cross section of the key external stakeholders (e.g., government, public and private foundations, corporate chief executives, presidents of universities, residency review committees, and focus groups of residents, program directors, practicing physicians, allied health personnel, and the public); and consensus building over time with the internal stakeholders (e.g., physicians, educators). The ACGME and the American Board of Medical Specialties (ABMS) adopted the same six competencies, listed below (Table 3):

  • Patient care
  • Medical knowledge
  • Practice-based learning and improvement
  • Interpersonal and communication skills
  • Professionalism
  • Systems-based practice
6. Where is surgical competence on the list?
To this list, the American Board of Ophthalmology (ABO) has added surgical competence. The precise subcompetencies and specifics for the surgical competency have not yet been published.
7. What is the timeline for the ACGME competency mandate?

The ACGME proposed that the competencies be implemented in three phases:

  • Phase 1: Identification of general competencies applicable to doctors across all of the specialties in medicine
  • Phase 2: Development of the measurement tools (the toolbox) to assess the six competencies
  • Phase 3: Construction of a support network for graduate medical education using these measurements to improve the process.
8. What tools will be necessary in the "tool box" for measuring the competencies?
New tools will make up the "toolbox" for assessing the ACGME competencies. Teaching and assessing these competencies will require psychometrically valid, reliable, and easy-to-use tools. The traditional model of residency education has relied most heavily on three basic tools: 1) multiple choice written examinations, 2) oral examinations, and 3) qualitative global ratings by rotation preceptors. These three tools alone are probably not sufficient to adequately measure the new competencies. The ACGME Outcome Project, in a joint initiative with the ABMS, described the "toolbox" of assessment methods in detail on its web site in 2000, and the reader is referred there for further details. (16-17)
9. What are the general principles for evaluating the tools?
The multiple measurement tools have to be evaluated and tested "in the field" for validity, reliability, feasibility, and generalizability. (19-20) The ACGME "toolbox," however, contains many educational instruments that have yet to be proven in busy residency education programs, and few have been tested in ophthalmology programs.
10. What could be in the toolbox?

Although written and oral examinations are the mainstays of knowledge assessment, ophthalmologists should be aware of alternative tools for evaluating residents. I have briefly summarized the contents of the toolbox available to educators, with their associated strengths and weaknesses (Table 4).

Written examinations

The easiest and currently the most utilized measurement tool is the written examination. Written exams can cover a broad fund of knowledge from a core curriculum. Residents currently take the ophthalmology in-service examination (the Ophthalmic Knowledge Assessment Program, OKAP), and graduates sitting for board certification take the written qualifying examination of the American Board of Ophthalmology (ABO). The results of the ABO written exam and the OKAP may be used as one benchmark of a program's effectiveness in one competency, medical knowledge.

The advantages of written testing are familiarity for tester and testee, the ability to test large numbers of residents simultaneously, standardization of the testing format, ease of grading, objectivity, reproducibility, and psychometric validity. Multiple choice exams can be scored quantitatively, and the scores can be benchmarked internally and externally.

Short and long answer tests, including essay formats, are more difficult to score than multiple choice tests and are more time intensive. The advantages over multiple choice tests are that more information is obtained in the examinee's own words, more in-depth probing of decision making and fund of knowledge is possible, and there is less "rewarding" of guessing. In the multiple choice format, the answer is by necessity contained in the test question; short answer or essay formats force the examinee to create an answer rather than simply select one.

All of the written test formats have difficulty measuring qualitative aspects of the patient-physician interaction, communication and interpersonal skills, and professionalism. In addition, there may not be a correlation between medical knowledge about surgery and actual surgical skill. Scott et al. found that the American Board of Surgery In-Training Examination measured knowledge but did not correlate with technical skill or operative performance. (21)

Oral examinations

Oral examinations have intrinsic advantages over written tests. (17) Currently, many programs employ intramural oral examinations at the conclusion of specific rotations. The American Board of Ophthalmology uses standardized case-based or patient management problem vignettes in its oral examination. The advantages of the oral format include "face to face" interaction with the ability to probe the learner's thought process, problem solving ability, and fund of knowledge; to gauge active participation in decision making; and to integrate visual aids and tester-provided information in real time. Case vignettes including history, photos, and test results can be used as props for the testing. Checklists are often used in conjunction with oral testing to ensure completion of specific responses.

The disadvantages of oral tests, however, are that they are time consuming, require many personnel to administer, and are difficult to standardize and grade. Case specificity, meaning that performance on one case may or may not predict performance on different cases, is a limiting factor in case-based oral testing. Oral examiners require formal training in test procedure to deter the inappropriate use of body language such as "cueing" or intra-test positive or negative feedback. Although an oral examination may allow assessment of communication skills and deeper probing of an examinee's fund of knowledge and application of information, one potential disadvantage is that it draws from the same fundamental core of information (i.e., medical knowledge) as the written exam.

Standardized and simulated examinations

Simulated examinations can use cadavers, anatomical models or mannequins, computer-based or virtual reality patient management problems, role playing, or other simulations to create a testing situation. The advantages of simulated exercises are the ease and convenience of models or computers. The "real life" advantages of a "hands on" standardized patient encounter and the interaction with a live patient, however, are lost in simulations. (17, 22-33)

Simulations can also test the decision making process itself, outside of medical content or context. Strategic Management Simulation is one such tool that has been employed to measure different components of decision making and function (e.g., speed of response, task orientation, initiative, breadth of approach to problems, planning capacity, and strategy) in response to complex task assignments. Satish et al. tested the simulator and found results similar to traditional faculty ratings of residents. (23) The disadvantages of simulations are that they are time consuming to develop and grade.

Standardized clinical exams provide reproducible menus for testing and assessment that can be organized as multi-station encounters. Standardized examinations can include history and physical exam technique; communication and interpersonal skills; and the ability to summarize and interpret findings, perform tests or procedures, and formulate a differential diagnosis and plan. Standardized patients, whether real patients or healthy but simulated ones, can test for technique, appropriateness of interaction, completeness of specific tasks, and expected behaviors. The standardized patient method is particularly useful for assessing specific manual skills or techniques and has high accuracy (up to 90%), reproducibility, and scoring validity. (26)

Objective structured clinical examination (OSCE)

Several forms of an Objective Structured Clinical Examination (OSCE) exist and have been tested and validated in many specialties. (17, 27-31) Checklists can be employed to document and score specific behaviors and tasks. The use of checklists in this setting reduces grader variability and can produce better inter-rater reliability.

Up to 14 to 18 test items (and up to 6 hours of testing) may be required, however, to achieve reliable and reproducible results. (19) The disadvantages of standardized clinical examinations, including the OSCE, are the development and implementation costs, preparation and testing time, physical space and logistical issues, and personnel and evaluator training requirements. Training and maintaining adequate numbers of standardized or simulated patients can be difficult and costly. (26) Colliver et al. measured the reliability and efficiency of components of clinical competence using standardized patients and estimated that over 100 cases (range 45 to 170) would be needed to assess certain competencies with sufficient reliability (0.80) and validity. (24) This high case requirement could be a significant limiting factor in generating valid indicators of resident performance.
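
The arithmetic behind such case-number estimates can be illustrated with the Spearman-Brown prophecy formula, which projects how many cases are needed to reach a target composite reliability given the reliability of a single case. The following Python sketch is illustrative only; the single-case reliability value is an assumption for the example, not a figure taken from Colliver's study.

    def cases_needed(single_case_reliability, target_reliability):
        """Spearman-Brown prophecy formula: estimate the number of cases
        needed to reach a target composite reliability, given the
        reliability of a single case."""
        r, target = single_case_reliability, target_reliability
        return target * (1 - r) / (r * (1 - target))

    # An assumed single-case reliability of 0.04 implies roughly 96 cases
    # to reach the 0.80 level discussed above.
    print(round(cases_needed(0.04, 0.80)))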

Chart review, chart stimulated recall, chart audit

Chart review and chart stimulated recall methods are auditing techniques that can be used to assess "real world" decision making. (17) Retrospective chart audits can compare abstracted information to pre-existing protocols in an explicit review process, including differential diagnoses, diagnostic testing, interpretation of results, and treatment. Chart reviews can measure "real world" medical documentation skills for completeness, problem solving, and cost effectiveness. The American Board of Ophthalmology currently employs a self-selected chart review process for specific diagnoses as part of its recertification process. Sample sizes of between 8 and 10 records are probably sufficient for reliability testing of a diagnosis or procedure, and one study reported that 6-8 random office chart selections might be a sufficient sample to evaluate patient care. (17)

In chart stimulated recall, the charts are selected to be representative for content. The physician examiner can use the chart data to probe the learner's knowledge base, decision making, application and interpretation of "real patient" test results, and the rationale for testing and treatment options. Reliability scores of between 0.65 and 0.88 can be obtained with this technique. (17) The disadvantages include patient confidentiality issues, the time the process consumes, the extensive training required of the evaluator, and scoring that can be difficult to quantify or standardize across patients. In addition, the examinee's record has no external benchmark or "gold standard" for the accuracy of the physical examination findings or the true outcome of the patient.

Case presentations, case-based learning

Case presentations (e.g., grand rounds) provide information about organizational and communication skills, synthesis and presentation of data, and problem solving. (17, 35) Most programs require residents to present at intramural conferences and grand rounds, but few programs assess their performance or provide feedback to presenters. Case presentations can be time consuming, are difficult to standardize and grade, and may not be representative of "real time" patient care abilities. Case-based or problem-based learning and interactive grand rounds can provide information about individual resident communication and interpersonal skills, self-directed learning, and problem-solving ability. (34) Case specificity, however, is a limiting factor in using grand rounds as a gauge of resident competence.

Evidence-based medicine and literature reviews

"Journal clubs" can provide information about evidence-based medicine, literature review skills, epidemiology and statistics knowledge base, and application of practice based and systems based learning. On-line or virtual "journal clubs" can allow multiple residents to review selected articles as well as provide interactive feedback and built-in assessment. Most residency programs have some form of journal club but few record performance results or provide feedback on the results. Measurement of the application of these skills should be incorporated across other tools.

Qualitative reviews and global ratings

Qualitative reviews and global ratings are the most commonly used tools in residency programs. The advantages include widespread acceptance of and familiarity with the technique and the ability to assess interpersonal and communication skills, professionalism, and aspects of patient care and systems-based practice. (17, 35-40) Global ratings, however, judge general abilities rather than the specific behaviors addressed in a checklist format. Supervisor ratings are generally retrospective over a period of time (e.g., one rotation). Global ratings are easy to use, can be used to provide feedback, and can be semi-quantitative, but they are highly subjective. The disadvantages of rating forms are significant. Graders often avoid the high and low ends of the scale, leading to grade inflation, and evaluators tend to be more accurate, but less lenient, in their ratings when the evaluation provides formative rather than summative feedback. (41) This problem becomes more apparent when a "face to face" discussion of the evaluation score is required and when the score is used for a summative, "high-stakes," "go/no-go" promotion decision. There is often dissociation between written comments, especially negative comments, and actual scores. Inter-rater and intra-rater subjectivity and bias, especially if graders are untrained, can lead to "easy graders" and "hard graders." Variability of resident clinical experience, limited numbers of direct observations, "luck of the draw," and difficulty with reproducibility and reliability significantly limit the utility of qualitative evaluations in measuring the competencies.

Global evaluation forms can be used to assess a number of competencies simultaneously. The caveat, however, is overreliance upon this single tool. Overly onerous, time-consuming, or lengthy global evaluation forms that simply tack the new competencies onto an existing rotation evaluation form as a "laundry list" probably will not be sufficient to meet the need. This type of "super-form" may place an additional and probably unrealistic burden on rotation supervisors to subjectively assess "too much in too little time" in such a high-stakes encounter. In addition, rotations are often designed to meet clinical service needs rather than educational needs, and measurement of service content may not have construct validity. (43-44) Documenting the reliability and validity of these forms is critical from a credentialing, and possibly legal, standpoint if they are to be used in a summative, high-stakes decision like promotion. A decision not to graduate a resident based on the results of these "super-forms," especially if they are collected or completed in a cursory or haphazard fashion, is not likely to be accepted as adequate. Conversely, reliance on attendings with little opportunity for, or documentation of, direct observation of a resident may result in the graduation of residents with substantial deficiencies. It would be difficult to develop a checklist that could reliably assess intangible competencies such as ethical standards or interactions with colleagues and support staff.

One variation on the qualitative review method is known as the 360-degree evaluation. (17) The "360" includes multiple evaluations from multiple individuals (e.g., nurses, allied health professionals, peers, supervisors, patients, patient families) within the sphere of operation of the person being evaluated, in distinction to the typical supervisor or preceptor qualitative review. Usually a survey or questionnaire is used to collect data on performance in areas that are not amenable to written testing (e.g., teamwork, communication, management style, decision making). Reliability for a "360" may be as high as 0.90. (17) Patient surveys may assist in the evaluation of general qualitative aspects of personal interaction, communication skills, courtesy, interest, and empathy; reliability may also be as high as 0.90 for a properly performed institutional patient survey. These patient surveys may, however, require multiple responses (20 to 40) to be reliable and are time intensive to collect and report. (17)
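
As a concrete illustration of how multi-source "360" data might be pooled, the Python sketch below averages hypothetical ratings within each rater group before combining across groups, so that a numerically large group (e.g., many patient surveys) does not dominate the composite. The rater groups, rating scale, and equal weighting are assumptions for illustration, not a prescribed method.

    from statistics import mean

    # Hypothetical 360-degree ratings (1-5 scale) for one resident.
    ratings = {
        "faculty":  {"communication": [4, 5, 4], "professionalism": [5, 4]},
        "nurses":   {"communication": [3, 4],    "professionalism": [4, 4]},
        "patients": {"communication": [5, 4, 5], "professionalism": [5, 5]},
    }

    def composite(ratings, competency):
        """Average within each rater group first, then across groups."""
        return mean(mean(group[competency]) for group in ratings.values())

    print(round(composite(ratings, "communication"), 2))  # e.g., 4.17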

Direct observation

Direct observation in an apprenticeship model is the basis for most global evaluations in ophthalmology residency (e.g., rotations). (42) Unless faculty members are willing and able to supervise residents through all stages of patient encounters, it may be difficult to judge a resident's skills in these areas accurately. Hauer described one method of direct observation, the clinical evaluation exercise, which documents in a standardized manner a single observed student history and exam during a real patient encounter from start to finish. (42) Video (e.g., closed circuit television) or audio-taped review of surgical cases or patient encounters provides the advantages of direct observation with less time constraint and without the artificial environment created by a supervised exam. Specific skills can be examined and reviewed in a non-judgmental and non-time-pressured environment. The disadvantages include cost, training of personnel, the artificial environment created by recording, equipment requirements, and confidentiality issues.

Portfolio

A portfolio can collate and document evidence of resident learning and achievement. (17) Attendance records of resident participation in educational activities can easily be incorporated into the portfolio. Attendance records, however, do not measure ability, motivation, or actual participation. The portfolio might include any of the following:

  • A written learning plan
  • Lists of critical or sentinel patient care events (positive or negative)
  • Written and oral test scores
  • Records of continuing medical education
  • Logs of number and type of procedures, surgical operations, or individual patient encounters
  • Documentation of completion of independent learning
  • Global ratings results
  • Video or audio-tape recordings
  • Patient letters of advocacy or commendation
  • Examples of ethical dilemmas solved
  • Counseling sessions with patients.

Surgical logs are required by many programs and the Residency Review Committee. They may be electronic, and computerized systems can aid in the collection and maintenance of databases. Logs, however, can be inaccurate, incomplete, or even falsified. Surgical logs, like attendance records, only provide quantitative information on "how many procedures" rather than qualitative assessment data on "how well were the procedures done." (46) Regular review of the surgical and medical logs and the learner portfolio can ensure appropriate content, amount, and timing of learning. (46-48) The portfolio can include formative and summative data and is one of the most useful tools for collecting, in a quantitative manner, data on competencies that are difficult to measure through other means, such as practice-based learning and improvement and professionalism.
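
To make the quantitative-versus-qualitative limitation concrete, the Python sketch below shows what a minimal electronic surgical log record might look like; the field names and schema are hypothetical, not a Residency Review Committee specification. A tally over such records answers "how many procedures," not "how well were the procedures done."

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class SurgicalLogEntry:
        """Illustrative electronic surgical log record (hypothetical schema)."""
        resident: str
        procedure: str              # e.g., "phacoemulsification"
        role: str                   # "primary surgeon" or "assistant"
        performed_on: date
        supervising_attending: str
        notes: str = ""             # optional qualitative comment

    log = [
        SurgicalLogEntry("Resident A", "phacoemulsification",
                         "primary surgeon", date(2003, 5, 1), "Attending B"),
    ]

    # Quantitative only: counts procedures but says nothing about quality.
    phaco_count = sum(1 for entry in log
                      if entry.procedure == "phacoemulsification")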

 

11. Why do we need an implementation matrix for the competencies?
The implementation of the competencies requires a stepwise re-engineering of the resident education process. The ACGME has proposed the following concepts for the assessment of any instrument or approach in the toolbox. (17) The tools should provide valid and reliable data; should be feasible; should apply to the assessment circumstances; and should provide valuable, new, and useful information that facilitates teaching and learning. There should be consistency between curriculum/program objectives and the assessment, and there should be a representative sampling of the educational domains of interest. Multi-dimensional assessment approaches/instruments, multiple observations to improve reliability and precision, and multiple observers/raters should be employed. Performance should be assessed according to pre-specified standards or criteria, and the assessment must be fair.
11a. What specific steps are required by the ACGME?

The specific steps in the implementation plan include the following (16-17):

Step 1: Programs will be asked to develop specific curricula and written goals and objectives for teaching the competencies.

Step 2: Programs should develop and use an assortment of different evaluation tools ("the tool box") to assess resident competence in the various areas.

Step 3: After using the tools for a period of time (a testing phase), programs will need to develop means to document the reliability and validity of the assessment tools. There will be a need to use progressively better tools (and discard poorer tools) and to provide evidence of the use of evaluation data in assessing the competencies.

Step 4: Programs will be required to link educational outcomes with overall program improvement, including assessments that address aspects of the program involving stakeholders other than the residents (e.g., patient surveys, employers, graduates). (16,17)

11b. How could ophthalmology proceed?
The paradigm shift to "competency-based residency training" is here and must be implemented. Folberg emphasized that "refusal is not an option," and he proposed a series of steps to prevent organizational anxiety by developing skills; to prevent apathy by providing incentives; to prevent frustration by providing resources; and to prevent "business as usual" and maturation arrest by reinventing the educational process. (48)
11c. What is the University of Iowa Department of Ophthalmology plan?

Step 1: Blueprint for curriculum development, learning objectives, and organizational goals ("Define what needs to be tested") (33)

The curriculum, objectives, and assessment strategy should be defined at the onset. We have to move away from the traditional curriculum that is based upon "learning what teachers choose to teach" to the new competency-based curriculum based upon "learning what they need to know to take care of patients". (52) The purpose of the process, specific learning objectives and goals, and a timeline have to be documented in an advance blueprint before implementation and "construction" of the tools begins.

This blueprint should include a test matrix of the "best tools" for content and possibly even benchmarks for individual programs to follow. (53) The new curriculum will have to work within the existing time and scheduling constraints. The competencies should be integrated and "piggy-backed" onto the existing structure rather than added as a new curriculum to an already over-crowded and time-limited 36-month schedule. Existing curriculum materials and rotation schedules should be adapted to meet the needs of the competencies rather than creating overlapping and duplicative material. Lee et al previously described some guidelines for designing an optimum resident rotation schedule in ophthalmology, and the competencies need to be incorporated into the rotation schedule design. (43) Lee and Koch surveyed residents at one training program (Baylor College of Medicine), identified clinically important skills as rated by residents, and correlated them with the level of training received. (44) There were significant discrepancies among the curriculum requirements, the level of training, and the subjective rating of clinical importance for several items, including managed care and practice management. Thus, curriculum design should take into account the clinical value and prioritization of content, a formal needs assessment, and the requirements of the learners in order to maximize learning within a fixed training period. (46)

A novel solution to the time limit imposed by the 36-month training period is to develop flexibility in the current "rotation-based" format. Although the 36-month training period would remain the same, residents might rotate more or less quickly through the core rotations based upon performance. (54)

There needs to be explicit "buy-in" to the concept and defined accountability before proceeding. Participants, including both educators and learners, need to know the background, historical perspective, driving forces, rationale, and motivation for the reinvention process. There needs to be a clear mandate for change and "buy-in" to the concept of reform from the top of the academic leadership hierarchy, from deans and chairmen down to individual faculty and learners. Complete acceptance of the blueprint must be established before proceeding. Departments and their faculties must assume "ownership" of and accountability for the process at the individual resident level. Teaching and learning should become the responsibility of the individual learner, the faculty, the program director, and the department. Most importantly, the burden should not fall completely on the program director.

Step 2: Implementing the blueprint, developing and using the tools

Don’t reinvent the wheel. Many organizations have already devoted considerable time and effort to the development of non-subspecialty specific curricula that could be modified for use by ophthalmology training programs. There is no need for duplication of effort and many of the competencies are generalizable across disciplines. Many medical schools and some residencies have already adapted some tools (e.g. OSCE) for assessment for medical students and residents. (28,29,38) In addition, program requirements for some of the competencies cross subspecialty areas and can be met on an institutional basis.

Existing curricula should be modified to reflect the competencies. Lee and Koch developed one curriculum for ophthalmology (the Resident Progress Notebook) that was limited to a "laundry list" format. (44) A separate surgical curriculum was based upon more specific learning objectives (the Surgical Resident Progress Notebook). The Association of University Professors of Ophthalmology also developed a broad-based ophthalmology curriculum. These curricula in ophthalmology require modification to meet the specific requirements of the competencies. A clearinghouse should be established to promote sharing of information, expert advice, and collaborative discussion among educators. The ACGME has already suggested "best methods" for evaluating the competencies and selected specific tools from the toolbox, ranking the desirability of the various testing methods for each competency. (17)

Match the right tool to the right job.

A program matrix should match preferred testing tools to specific competencies. Using one tool for every competency (such as a super global rating form), over-relying upon a single observer (such as the program director), using a single "high stakes" encounter (such as the written qualifying exam), or using unreliable, invalid, or "wrong tools" for the job are not valid means of meeting the mandate.

Using models from other specialties will save time and energy. One sample matrix for meeting the seven competencies is below:

  • Patient care: OSCE and patient surveys
  • Medical knowledge: Written and oral exams
  • Practice-based learning: Record review, chart audit, portfolios
  • Interpersonal skills: OSCE, direct observation, patient surveys
  • Professionalism: OSCE and 360-degree global ratings
  • Systems-based practice: 360-degree global ratings
  • Surgery: Surgical OSCE, video review, and portfolio
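
For programs that track this mapping electronically, the matrix can be represented as a simple lookup table. The Python sketch below mirrors the sample pairings above; it is illustrative only, not a prescribed standard.

    # The sample competency-to-tool matrix above, as a lookup table.
    assessment_matrix = {
        "Patient care":            ["OSCE", "patient surveys"],
        "Medical knowledge":       ["written exams", "oral exams"],
        "Practice-based learning": ["record review", "chart audit", "portfolios"],
        "Interpersonal skills":    ["OSCE", "direct observation", "patient surveys"],
        "Professionalism":         ["OSCE", "360-degree global ratings"],
        "Systems-based practice":  ["360-degree global ratings"],
        "Surgery":                 ["surgical OSCE", "video review", "portfolio"],
    }

    for competency, tools in assessment_matrix.items():
        print(f"{competency}: {', '.join(tools)}")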

Use high tech tools ("enabling technologies").

The use of computer technology to assist in the delivery, documentation, and analysis of information will save time and money. (48-51) Programs in ophthalmology are already having difficulty meeting the challenges of an expanding curriculum within a fixed 36-month training period. The body of knowledge in medical and surgical ophthalmology continues to increase, but the training period remains 36 months, and the lecture time available during that restricted period is precious. The use of technologies such as the internet, which residents can access at any time or on their own time (weekends, nights), could free programs of the burden of forcing more information into an overcrowded weekly lecture schedule. One such site (www.gme-core.org) provides excellent Web-based educational modules on the GME core curriculum.

Educate the educators (Develop a core teaching faculty).

Subspecialty faculty members are currently well trained in the "science of medicine" but not the "science of education," including testing and evaluation. Training focused on education will be essential for the residency directors and faculty members who will be expected to implement the competencies. These individuals need to organize, meet, share information, and "share the tools." There will be a need to train faculty how to teach ("teach the teacher") and how to evaluate ("evaluate the evaluators"). McAvoy et al. described the training program for assessors used in the United Kingdom by the General Medical Council. (47) The program includes learning the basic principles and tools for assessment, supervised practice in assessment methods during a peer review visit, use of actors simulating third-party interviewees and trained doctors role playing the assessee, and specific training in report writing and legal issues. Developing and implementing the competencies for residents implies a parallel process of developing faculty competencies in testing and assessment, and faculty will have to be taught how to perform the testing.

Dissemination and sharing of information is critical and stimulates collaboration between groups working toward solving similar problems. (55-60) Although individual program needs will require modification of the specific tools in the toolbox, it will be important that tools that work be shared among all the program directors. A cadre of dedicated educators will be necessary to carry the process over from year to year. The turnover rate for program directors, including those in ophthalmology, is quite high, with a relatively short tenure of only 2-3 years. The typical program director is often among the most junior members of a faculty, usually begins the job without formal training in education or evaluation, is often uncompensated for the extra time commitment, is asked to do "more and more with less and less," and is often given "requirements without resources." These trends will have to be reversed if the competency effort is to succeed.

Step 3: Testing the tools

Keep the tools sharp and working.

Modification, testing of reliability and validity, and improvement in the tools themselves will be an ongoing process. Part of the ACGME mandate is not just the development of tools but proof that they actually work by linking results with evidence that they improve the educational process. (16,17) We will, in a sense, need tools to measure the efficacy of the tools. The tools will have to be evaluated for reliability, validity, and feasibility.

Step 4: Linking the tools to improvements in the educational process

Development of new tools and discarding old tools

The tools that work, and those that do not, should be evaluated in the context of the original blueprint. The traditional model has relied too heavily on two tools with significant disadvantages: the written multiple choice exam and the qualitative global preceptor review. These are familiar but insufficient tools, and new tools will have to be developed. In the new model, "paper and pencil" testing and evaluation forms will have to give way to innovative assessment methods that infer competence by more closely resembling "real world" patient encounters. These new tools should replace outdated, inefficient, or unwieldy tools. The cycle of assessment and measurement with re-assessment and re-measurement should be repeated over time, leading to a self-sustaining process of improvement. (61-62)

Summary
Ophthalmology, like all the medical specialties, will have to address the issues surrounding implementation of the new competencies in residency education. A coordinated, thoughtful, standardized effort will be required to meet the challenge. This effort will require the combined input of the members of all the national ophthalmology organizations, including the American Board of Ophthalmology, the American Academy of Ophthalmology, and the chairmen and program directors in ophthalmology (the Association of University Professors of Ophthalmology). Collaboration and communication will reduce duplication of effort and ensure that the process is of high quality and fair.
References
  1. O’Neil EH and the Pew Health Professions Commission. Recreating Health Professional Practice for a New Century. The fourth report of the Pew Health Professions Commission. San Francisco, CA: Pew Health Professions Commission, 1998.
  2. Jacobs MO, Mott PD. Physician characteristics and training emphasis considered desirable by leaders of HMOs. J Med Educ 1987;62:725-731.
  3. Leach D. The ACGME competencies. Substance or form? J Am Coll Surg 2001;192:396-398.
  4. Council on Graduate Medical Education. Summary of Resource Paper. Preparing Learners for Practice in a Managed Care Environment. Rockville, MD: Health Resources and Services Administration, 1997. (www.cogme.gov/resource.htm) (Last viewed 6/5/02)
  5. American Academy of Family Physicians. Recommended Curriculum Guidelines for Family Practice Residents. Care Management. Leawood, KS: AAFP, 1996. http://www.aafp.org/edu/guide (Last viewed 6/5/02)
  6. Association of American Medical Colleges (AAMC). Contemporary Issues in Medicine-Medical Informatics and Population Health: Report II of the Medical School Objectives Project. Acad Med 1999;74:130-141.
  7. Federated Council for Internal Medicine. Task Force on the Internal Medicine Residency Curriculum. Graduate Medical Education in Internal Medicine. A resource guide to curriculum development. Philadelphia, PA: FCIM, 1997.
  8. Informatics Panel and the Population Health Perspective Panel. Contemporary issues in medicine. Medical informatics and population health: Report II of the medical school objectives project. Acad Med 1999;74:130-141.
  9. Klessig JM, Wolfsthal SD, Levine MA, et al. A pilot survey study to define quality in residency education. Acad Med 2000;75:71-73.
  10. Parboosingh J. Credentialing physicians: Challenges for continuing medical education. J Cont Med Educat Health Profess 2000;20:188-190.
  11. Reisdorff EJ, Hayes OW, Carlson DJ, Walker GL. Assessing the new general competencies for resident education. A model from an emergency medicine program. Acad Med 2001;76:753-757.
  12. Trunkey D, Botney R. Assessing competency: A tale of two professions. J Am Coll Surg 2001;192:385-395.
  13. Undergraduate Medical Education for the 21st Century. A national demonstration of curriculum innovations to keep pace with a changing health care environment. Health Resources and Services Administration and American Association of Colleges of Osteopathic Medicine. www.aaocom.org
  14. Yedidia MJ, Gillespie CC, Moore GT. Specific clinical competencies for managing care. JAMA 2000;9:1093-1098.
  15. Halpern R, Lee MY, Boulter PR, Phillips RR. A synthesis of nine major reports on physicians’ competencies for the emerging practice environment. Acad Med 2001;76:606-615.
  16. Accreditation Council for Graduate Medical Education (ACGME) General competencies. www.acgme.org. 2002. (Last viewed 6/5/02 )
  17. Accreditation Council for Graduate Medical Education (ACGME) Outcome project. Toolbox of assessment methods. A product of the joint initiative of the ACGME outcome project of the ACGME and ABMS. www.acgme.org, Version 1.1, 2000. (Last viewed 6/5/02 )
  18. Reinhardt UE. Academic medicine’s financial accountability and responsibility. JAMA 2000;284:1136-1138.
  19. Newble DI, Swanson DB. Psychometric characteristics of the objective structured clinical examination. Med Educ 1988;22:325-334.
  20. Newble DI, Hoare J, Elmslie RG. The validity and reliability of a new examination of the clinical competence of medical students. Med Educ 1981;15:46-52.
  21. Scott DJ, Valentine RJ, Bergen PC, et al. Evaluating surgical competence with the American Board of Surgery In-Training Examination, skill testing, and intraoperative assessment. Surgery 2000;128:613-622.
  22. Vu NV, Marcy MM, Colliver JA, et al. Standardized (simulated) patients’ accuracy in recording clinical performance check-list items. Med Educ 1992;26:99-104.
  23. Satish U, Streufert S, Marshall R, et al. Strategic management simulations is a novel way to measure resident competencies. Am J Surg 2001;181:557-561.
  24. Colliver JA, Vu NV, Markwell SJ, et al. Reliability and efficiency of components of clinical competence assessed: five performance-based examinations using standardized patients. Med Educ 1991;25:303-310.
  25. Stillman P, Swanson D. Ensuring the clinical competence of medical school graduates through standardized patients. Arch Intern Med 1987;147:1049-1052.
  26. Tamblyn RM, Klass DJ, Schnabl GK, Kopelow ML. The accuracy of standardized patient presentation. Med Educ 1991;25:100-109.
  27. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J 1975;1:447-451.
  28. Jain SS, DeLisa JA, Campagnolo DI. Methods used in the evaluation of clinical competency of physical medicine and rehabilitation residents. Am J Phys Med Rehabil 1994;73:234-239.
  29. Jain SS, Nadler S, Eyles M, et al. Development of an objective structured clinical examination (OSCE) for physical medicine and rehabilitation residents. Am J Phys Med Rehabil 1997;76:102-106.
  30. Matsell DG, Hus NMW. Reliability and validity of the objective structured clinical examination in paediatrics. Med Educ 1991;25:293-299.
  31. Petrusa RT, Blackwell TA, Ainsworth MA. Reliability and validity of an objective structured clinical examination for assessing the clinical performance of residents. Arch Intern Med 1990;150:573-577.
  32. Winkel CP, Reznick RK, Cohen R, Taylor B. Reliability and construct validity of a structured technical skills assessment form. Am J Surg 1994;167:423-27.
  33. Newble D, Dawson B, Dauphinee D, et al. Guidelines for assessing clinical competence. Teach Learn Med 1994;3:213-220.
  34. Schmidt HG, van der Molen HT. Self-reported competency ratings of graduates of a problem-based curriculum. Acad Med 2001;76:466-468.
  35. Gray J. Global ratings scales in residency education. Acad Med 1996;71:S55-63.
  36. Headrick LA, Neuhauser D, Schwab P, et al. Continuous quality improvement and the education of the generalist physician. Acad Med 1995;70:S104-109.
  37. Matthews DA, Feinstein AR. A new instrument for patients’ ratings of physician performance in the hospital setting. J Gen Intern Med 1989;4:14-22.
  38. Mavis BE, Cole BL, Hoppe RB. A survey of student assessment in U.S. Medical Schools: The balance of breadth versus fidelity. Teach Learn in Med 2001;13:74-79.
  39. Noel G, Herbers JE, Caplow M, et al. How well do Internal Medicine faculty members evaluate the clinical skills of residents. Ann Intern Med 1992;117:757-65.
  40. Stritter FT, Bland CJ, Youngblood PL. Determining essential faculty competencies. Teach Learn Med 1991;3:232-238.
  41. Quattlebaum TG. Techniques for evaluating residents and residency programs. Pediatrics 1996;98:1277-1283.
  42. Hauer KE. Enhancing feedback to students using the mini-CEX. Acad Med 2000;75:524.
  43. Lee AG, Pelton RW, Koch DD. Designing an optimum rotation schedule for an ophthalmology residency. Acad Med 1997;72:564.
  44. Lee AG, Koch DD. The Baylor Survey of ophthalmology residency graduates: Rationale for reform in residency training. Ophthalm Surg Lasers 1998;29:649-657.
  45. Adler LM, Ware JE, Enelow AJ. Changes in medical interviewing style after instruction with two closed circuit television techniques. J Med Educ 1970;45:21-28.
  46. Lee AG, Koch DD. A standardized curriculum and evaluation system for an ophthalmology residency. Acad Med 1998;73:447-448.
  47. McAvoy PA, McCrorie P, Jolly B, et al. Training the assessors for the General Medical Council’s performance procedures. Med Educ 2001;35(Suppl 1):29-35.
  48. Folberg R, Antonioli DA, Alexander CB. Competency-based residency training in pathology. Challenges and opportunities. Hum Pathol 2002;33:3-6.
  49. Gilbert S, Davidson JS. Using the world-wide web to obtain feedback on the quality of surgical residency training. Am J Surg 2000;179:74-75.
  50. Steckler A, Farel A, Bontempi JB, et al. Can health professionals learn qualitative evaluation methods on the World Wide Web? A case example. Health Educ Research 2001;16:735-745.
  51. Civetta JM, Morejon OV, Krton OC, et al. Beyond requirements. Residency management through the internet. Arch Surg 2001;136:412-417.
  52. Yip H-K, Smales RJ, Newsome PRH, et al. Competency-based education in a clinical course in conservative dentistry. Br Dental J 2001;191:517-522.
  53. Lee AG. Benchmarks for ranking and improving ophthalmology residency programs. Arch Ophthalmol 2001;119:1395-6.
  54. Long DM. Competency-based residency training: the next advance in graduate medical education. Acad Med 2000;75:1178-1183.
  55. Murray E, Gruppen L, Catton P, et al. The accountability of clinical education: its definition and assessment. Med Educ 2000;34:871-879.
  56. Mulvey HJ, Ogle-Jewett EA, Cheng TL, Johnson RL. Future of Pediatric Education II (FOPE II) Project, American Academy of Pediatrics. Pediatric Residency Education. Pediatrics 2000;106:323-329.
  57. Cantor JC, Baker LC, Hughes RG. Preparedness for practice. Young physicians views of their professional education. JAMA 1993;270:1035-1040.
  58. Ilott I, Bunch G. Competencies of basic surgical trainees. Ann Roy Coll Surg Engl 1998;80:14-16.
  59. Martin M, Vashisht B, Frezza F, et al. Competency-based instruction in critical invasive skills improves both resident performance and patient safety. Surgery 1998;124:313-317.
  60. Fowell SL, Southgate LJ, Blight JG. Evaluating assessment: the missing link? Med Educ 1999;33:276-281.
  61. Waddell DL. Measurement issues in promoting continued competence. J Cont Educ Nurs 2001;32:102-138.
  62. Lee AG. New mandate in resident education: reply. Ophthalmology 2005;112:1324.
  63. Golnik KC, Lee AG, Carter K. Assessment of ophthalmology resident on-call performance. Ophthalmology 2005;112:1242-1246.

Method of literature search

A Medline (PubMed) literature search was performed using the search terms "competency-based education" and "competencies" (2296 articles). The search was limited to the years 1996-2002 (1096 articles) and further limited to "human", "abstract", and "English-language" articles (774 articles). Letters to the editor were included only if they added significant new information. Pre-1996 articles were included if they added historical context or additional valuable information. Abstracts were reviewed, and selected papers (62 articles) were obtained from the initial literature search based upon applicability to residency education in ophthalmology. Web sites and non-Medline citations were obtained from selected bibliographies.
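
For readers who wish to approximate this strategy, the search corresponds roughly to the following PubMed query. This is a reconstruction in PubMed field-tag syntax, not the exact string used for the original search.

    ("competency-based education" OR competencies)
        AND 1996:2002[dp] AND English[lang] AND humans[mh] AND hasabstract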

Table 1: Pew Commission 21 competencies for the 21st century
  • Embrace a personal ethic of social responsibility and service
  • Exhibit ethical behavior in all professional activities
  • Provide evidence-based, clinically competent care
  • Incorporate the multiple determinants of health in clinical care
  • Apply knowledge of the new sciences
  • Demonstrate critical thinking, reflection, and problem-solving skills
  • Understand the role of primary care
  • Rigorously practice preventive health care
  • Integrate population-based care and services into practice
  • Improve access to health care for those with unmet health needs
  • Practice relationship-centered care with individuals and families
  • Provide culturally sensitive care to a diverse society
  • Partner with communities in health care decisions
  • Use communication and information technology effectively and appropriately
  • Work in interdisciplinary teams
  • Ensure care that balances individual, professional, system, and societal needs
  • Practice leadership
  • Take responsibility for quality of care and health outcomes at all levels
  • Contribute to continuous improvement of the health care system
  • Advocate for public policy that promotes and protects the health of the public
  • Continue to learn and help others learn
Table 2: The ten curriculum domains
  • Health care system overview
  • Population-based care
  • Quality measurement and improvement
  • Medical management
  • Preventive care
  • Physician-patient communication
  • Ethics
  • Teamwork and collaboration
  • Information management and technology
  • Practice management
Table 3: Summary of the six ACGME competencies
  • Patient care
    • Provide patient care that is compassionate, appropriate, and effective for the treatment of health problems and promotion of health
    • Communicate effectively and demonstrate caring and respectful behaviors when interacting with patients and families
    • Gather essential and accurate information about their patients
    • Make informed decisions about diagnostic and therapeutic interventions based on patient information and preferences, up-to-date scientific evidence, and clinical judgment
    • Develop and carry out patient management plans
    • Counsel and educate patients and their families
    • Use information technology to support patient care decisions and patient education
    • Perform competently all medical and invasive procedures considered essential for the area of practice
    • Provide health care services aimed at preventing health problems or maintaining health
    • Work with health care professionals, including those from other disciplines, to provide patient-focused care
  • Medical knowledge
    • Demonstrate knowledge about established and evolving biomedical, clinical, and cognate (e.g. epidemiological and social-behavioral) sciences and the application of this knowledge to patient care
    • Demonstrate an investigatory and analytic thinking approach to clinical situations
    • Know and apply the basic and clinically supportive sciences which are appropriate to their discipline
  • Practice-based learning and improvement
    • Investigate and evaluate their patient care practices, appraise and assimilate scientific evidence, and improve patient care practices
    • Analyze practice experience and perform practice-based improvement activities using a systematic methodology
    • Locate, appraise, and assimilate evidence from scientific studies related to their patients’ health problems
    • Obtain and use information about their own population of patients and the larger population from which their patients are drawn
    • Apply knowledge of study designs and statistical methods to the appraisal of clinical studies and other information on diagnostic and therapeutic effectiveness
    • Use information technology to manage information, access on-line medical information, and support their own education
    • Facilitate the learning of students and other health care professionals
  • Interpersonal and communication skills
    • Demonstrate interpersonal and communication skills that result in effective information exchange and teaming with patients, their families and professional associates
    • Create and sustain a therapeutic and ethically sound relationship with patients
    • Use effective listening skills and elicit and provide information using effective nonverbal, explanatory, questioning and writing skills
    • Work effectively with others as a member or leader of a health care team or other professional group
  • Professionalism
    • Demonstrate a commitment to carrying out professional responsibilities, adherence to ethical principles, and sensitivity to a diverse patient population
    • Demonstrate respect, compassion, and integrity; a responsiveness to the needs of patients and society that supersedes self-interest; accountability to patients, society, and the profession; and a commitment to excellence and on-going professional development
    • Demonstrate a commitment to ethical principles pertaining to provision or withholding of clinical care, confidentiality of patient information, informed consent, and business practices
    • Demonstrate sensitivity and responsiveness to patients’ culture, age, gender, and disabilities
  • Systems-based practice
    • Demonstrate an awareness of and responsiveness to the larger context and system of health care and the ability to effectively call on system resources to provide care that is of optimal value
    • Understand how their patient care and other professional practices affect other health care professionals, the health care organization, and the larger society and how these elements of the system affect their own practice
    • Know how types of medical practice and delivery systems differ from one another, including methods of controlling health care costs and allocating resources
    • Practice cost-effective health care and resource allocation that does not compromise quality of care
    • Advocate for quality patient care and assist patients in dealing with system complexities
    • Know how to partner with health care managers and health care providers to assess, coordinate, and improve health care and know how these activities can affect system performance
Table 4: Existing models of testing and evaluation (the toolbox)

1. Written examinations

  • Written testing (including proctored, open book, or Web-based exams)
  • Objective multiple choice question written testing
  • In-training and in-service evaluations
  • Certification examinations
  • Short answer written testing
  • Short and long answer essay testing

2. Patient management questions (case based learning)

  • Written
  • Oral
  • Computer based (web-based)
  • Videotaped encounters
  • Strategic management simulations
  • Key feature questions
  • Structured and unstructured cases
  • Case presentations (grand rounds)

3. Observed case management of simulated or standardized patients

  • In-house live examinations
  • In-house practice examinations
  • Closed circuit television and videotaped encounters
  • Checklist evaluation of performance
  • Standardized clinical examinations (objective structured clinical examination, OSCE)
  • Direct observation of real patients (apprenticeship model)

4. Chart audit; review of procedure, operative, or case logs

5. Chart stimulated recall (CSR)

6. Qualitative reviews: supervisor, patient, and peer review

  • Global ratings evaluation
  • 360 degree evaluations (allied health, faculty, staff)
  • Patient review or surveys

7. Portfolio (including logs)

Appendix 1: Brief descriptions of major professional, accrediting, and certifying organizations involved in the competencies

Accreditation Council for Graduate Medical Education (ACGME): Private professional organization responsible for the accreditation of approximately 7,800 residency education programs, including ophthalmology.

The American Board of Medical Specialties (ABMS): Organization of 24 approved medical specialty boards that coordinates the activities of its Member Boards and provides information to others concerning issues involving specialization and certification of medical specialists.

American Board of Ophthalmology (ABO): Independent, non-profit organization responsible for certifying ophthalmologists (eye physicians and surgeons) in the United States. The ABO was the first American Board established to certify medical specialists and is one of 24 specialty Boards recognized by the ABMS and the American Medical Association (AMA).

The Council on Graduate Medical Education (COGME): Authorized by Congress in 1986 to provide an ongoing assessment of physician workforce trends, training issues, and financing policies, and to recommend appropriate federal and private sector efforts to address identified needs. Appointed members (n=17) include representatives of practicing primary care physicians, national and specialty physician organizations, international medical graduates, medical student and house staff associations, schools of medicine and osteopathy, public and private teaching hospitals, health insurers, business, labor, and the federal government.

The Council of Medical Specialty Societies (CMSS): A non-profit, independent forum of 21 medical specialty societies for the discussion by medical specialists of issues of national interest and mutual concern. Membership is limited to those U.S. medical specialty societies that represent diplomates certified by a Board recognized by the ABMS.


last updated: 09-21-2007