J Acad Ophthalmol 2014; 07(01): e026-e030
DOI: 10.1055/s-0034-1396411
Original Article
Thieme Medical Publishers, 333 Seventh Avenue, New York, NY 10001, USA.

Development of an Objective Structured Clinical Examination to Assess Medical Student Competence in the Ocular Examination

Daniel W. Knoch
1   Department of Ophthalmology and Visual Sciences, University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin

Address for correspondence

Daniel W. Knoch, MD
Department of Ophthalmology and Visual Sciences
University of Wisconsin School of Medicine and Public Health
2870 University Avenue, Suite 102, Madison, WI 53705

Publication History

Publication Date:
23 December 2014 (online)

 

Abstract

Purpose Competent performance of the ocular examination is an essential skill for every graduating medical student, and assessing that competence is equally essential. Unfortunately, there is a paucity of tools for assessing clinical competence in ophthalmology. The Objective Structured Clinical Examination (OSCE) format has been shown to be a useful tool for assessing clinical competence in other academic settings, and it was hypothesized that the format could also be used to assess competence in performing the ocular examination. This study presents a novel use of the OSCE format to assess third and fourth year medical students' competence in obtaining a history of headache and performing the ocular examination.

Methods An observational design was used to assess the competence of third and fourth year medical students in taking a brief history for a chief complaint of headache and performing the ocular examination. The ophthalmology OSCE was administered after a one-week ophthalmology clerkship at the University of Wisconsin-Madison School of Medicine and Public Health. Standardized patients were trained by ophthalmology staff prior to the OSCE and were not dilated for the direct ophthalmoscope portion of the examination.

Results Students were graded on their performance in obtaining a history of headache and performing the ocular examination. The ocular examination included assessment of ocular motility and pupils, confrontation testing to detect a visual field defect, and use of the direct ophthalmoscope to match a standardized patient's optic nerve to one of four photos in the room. Competence was assessed with a checklist and reported as pass, marginal, or fail; failing students were remediated by an ophthalmologist. A total of 384 students took the examination from 2008 to 2012. Overall, 84% received a passing score, 11% a marginal score, and 5% a failing score.

Conclusion Establishing and assessing competence continue to be a major focus of medical student education. The OSCE format provides a standardized testing platform for assessing competence in performing the ocular examination, and the use of standardized patients provides a more natural format than paper-based or other methods of assessment.



The Objective Structured Clinical Examination (OSCE) is an established, reliable, and effective way to evaluate competence in medical education. First introduced in 1975 as a series of rotating stations, it has since been used in many diverse educational environments, including the evaluation of medical students and residents.[1] [2] The OSCE format is also used in the Clinical Skills (CS) portion of the USMLE Step 2 examination.

Evaluation of medical students' competence in ophthalmology is usually performed with paper-based scenarios. While written case scenarios can assess knowledge, they are poor at evaluating clinical competence.[3] Correct use of the direct ophthalmoscope and performance of an ocular examination are essential skills for every graduating medical student and can be crucial in conditions such as papilledema.[4] Paper-based scenarios cannot assess competence in performing the ocular examination. Web-based[5] and simulator-based[6] evaluations of clinical skills do provide reproducible, skills-based assessment, but they lack the natural setting of a standardized patient encounter. Standardized patients, moreover, provide an accurate and reproducible method for evaluating medical students' clinical skills.[7] The purpose of this report is to describe the development of a novel and robust ophthalmology OSCE used to evaluate third and fourth year medical students' competence in obtaining a history of headache and performing the ocular examination after a one-week ophthalmology clerkship.

Methods

Until May 2013, the University of Wisconsin-Madison School of Medicine and Public Health required a one-week ophthalmology clerkship. It was part of a six-week Neurosciences Clerkship that also included rotations in Neurology, Neurosurgery, Neuroradiology, and Rehabilitation Medicine. Students could take the Neurosciences Clerkship in either their third or fourth year of medical school.

Prior to 2008, evaluation of medical student performance on the ophthalmology clerkship included faculty evaluation of students in clinic, faculty evaluation of student performance in small group sessions and case presentations, a multiple choice test, and a paper-based OSCE on papilledema. While these measures provided a good evaluation of a student's knowledge, it was felt that they did not provide a good measure of a student's clinical skills, and competent use of the direct ophthalmoscope and performance of the ocular examination are essential skills for every graduating medical student. Therefore, in 2008, an ophthalmology standardized patient-based OSCE replaced the paper-based papilledema OSCE. The ophthalmology OSCE is part of a three-station Neurosciences OSCE; Neurology, Rehabilitation Medicine, Neurosurgery, and Ophthalmology alternate in providing stations.

Development of the ophthalmology OSCE began with careful delineation of the ophthalmology history and physical examination skills that every graduating medical student should be able to perform. A case of headache was chosen because it is a problem commonly seen by primary care providers and represents a situation in which examination of the ocular fundus is a natural part of the physical examination. The author then taught standardized patients how to present the history of headache and how to act during the physical examination, including how to feign a predetermined visual field defect. A photo of the right optic nerve of each standardized patient was taken, and the author examined each standardized patient to ensure that there was no large discrepancy in the difficulty of examining the optic nerves between standardized patients. The standardized patients were not dilated for the OSCE, to more fully simulate a primary care clinical encounter.

Students are given 10 minutes to complete the ophthalmology OSCE. A door scenario ([Fig. 1]) is posted on the closed door of the standardized patient encounter room, and the student reads it before entering. The student then performs a history and ocular examination. As [Fig. 1] shows, the student is also asked to match the standardized patient's right optic nerve to one of four photos in the room; [Fig. 2] shows a representative photo.

Fig. 1 Door scenario used during the OSCE. Students read this before entering the room.

Fig. 2 Representative photo of a standardized patient's optic nerve. The students match the standardized patient's optic nerve to one of four photos in the room.

Students are graded on a checklist ([Figs. 3] [4] [5] [6]), with their overall performance reported as pass, marginal, or fail. The checklist emphasizes history taking, confrontation visual field testing, pupillary testing, use of the direct ophthalmoscope, and communication skills. Students may miss up to three items on the history checklist and still pass that portion. On the physical examination checklist, students receive a pass if they miss no more than one item, a marginal score if they miss two items, and a failing score if they miss three or more items. Students receive an overall failing grade if they fail either the history or the physical examination portion.
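
The scoring rules above amount to a simple decision procedure. The following minimal sketch restates them in Python; the function and variable names are hypothetical and not part of the OSCE materials, but the thresholds mirror the rubric described above.

    def grade_osce(history_missed: int, physical_missed: int) -> str:
        """Illustrative restatement of the OSCE grading rubric.

        history_missed:  items missed on the history checklist
        physical_missed: items missed on the physical examination checklist
        """
        # History portion: up to three missed items still pass.
        history_failed = history_missed > 3

        # Physical examination portion: one miss or fewer passes,
        # two is marginal, three or more fails.
        if physical_missed <= 1:
            physical_grade = "pass"
        elif physical_missed == 2:
            physical_grade = "marginal"
        else:
            physical_grade = "fail"

        # Failing either portion fails the OSCE overall.
        if history_failed or physical_grade == "fail":
            return "fail"
        return physical_grade

    # Example: two missed history items and one missed physical
    # examination item still earn an overall pass.
    assert grade_osce(history_missed=2, physical_missed=1) == "pass"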

Fig. 3 History skills checklist.

Fig. 4 Physical examination checklist.

Fig. 5 Overall performance checklist. This is used to assess the overall quality of the students' physical examination skills.

Fig. 6 Overall communication checklist. This is used to assess the overall quality of the students' history taking skills.

The OSCE is video recorded and graded remotely by the author and one other faculty member. When the OSCE was first implemented, the author reviewed 10 of the other faculty member's students to ensure inter-rater reliability, and the author also reviews the video of any student who fails. Students who fail are remediated by a separate faculty member; remediation involves performing the ocular examination with that faculty member, including examination of the faculty member's optic nerve.



Results

All medical students were able to complete the scheduled tasks within the allotted 10 minutes. As shown in [Table 1], 384 students took the OSCE between 2008 and 2012; the table lists the pass, marginal, and fail rates for each academic year. Overall, 84% of the students passed, 11% received a marginal score, and 5% failed. Nearly all students passed the history portion of the OSCE, and all students who failed the OSCE were successfully remediated.

Table 1 OSCE results, 2008–2012

Academic year                    Pass          Marginal    Fail
2008–2009 (N = 114 students)     110 (96.5%)   4 (3.5%)    0 (0%)
2009–2010 (N = 75 students)      65 (87%)      8 (10%)     2 (3%)
2010–2011 (N = 98 students)      79 (81%)      12 (12%)    7 (7%)
2011–2012 (N = 97 students)      69 (71%)      18 (19%)    10 (10%)
Overall (N = 384 students)       323 (84%)     42 (11%)    19 (5%)
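
As a quick check on the aggregation in Table 1, the overall row follows directly from the per-year counts. The brief Python sketch below reproduces it; the counts are transcribed from Table 1, and the variable names are hypothetical.

    # Per-year (pass, marginal, fail) counts transcribed from Table 1.
    years = {
        "2008-2009": (110, 4, 0),
        "2009-2010": (65, 8, 2),
        "2010-2011": (79, 12, 7),
        "2011-2012": (69, 18, 10),
    }

    # Column-wise totals: 323 pass, 42 marginal, 19 fail; 384 overall.
    totals = [sum(column) for column in zip(*years.values())]
    n = sum(totals)

    for label, count in zip(("pass", "marginal", "fail"), totals):
        print(f"{label}: {count}/{n} = {count / n:.0%}")
    # pass: 323/384 = 84%, marginal: 42/384 = 11%, fail: 19/384 = 5%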



Discussion

Background

The OSCE format has been used to evaluate clinical skills in many diverse academic settings.[1] [2] To the author's knowledge, this is the first paper to describe a robust ophthalmology OSCE for assessing competence in performing the ocular examination in the United States. The OSCE format is also used in the Clinical Skills (CS) portion of the USMLE Step 2 examination, which presents clinical scenarios (e.g., neurological cases) in which competent use of the direct ophthalmoscope could be needed. The ophthalmology standardized patient-based OSCE has been in use at the University of Wisconsin-Madison School of Medicine and Public Health since 2008 and provides an evaluation of clinical competence that paper-based tests cannot.



Future Directions

One of the most positive aspects of the OSCE format is its flexibility. For example, changing the number or complexity of the photos used can increase or decrease the difficulty of the examination. In addition, it has been shown that standardized patients can be trained to complete the checklist accurately themselves, rather than relying on a separate grader,[8] which would greatly decrease the work needed to grade the OSCE.



Other Applications

A robust OSCE such as this also has value beyond the clerkship, because ophthalmology education continues to compete with other educational activities at medical schools across the nation.[9] [10] An ophthalmology OSCE is easily adapted to school-wide graduation skills examinations; for example, this OSCE was adapted for the Year-End Professional Skills Assessment (YEPSA) that all medical students at the University of Wisconsin-Madison School of Medicine and Public Health take at the end of their third year. Trends could be followed should the amount of ophthalmology education increase or decrease, and if performance proved deficient, the evaluation data could be used to argue for more ophthalmology education.



Final Thoughts

In 2007, the Association of University Professors in Ophthalmology Medical Student Educators Task Force developed a core ophthalmology curriculum for medical students (available at http://www.aupomse.org), which the American Academy of Ophthalmology endorsed in 2008. Core examination skills include evaluation of pupils, ocular motility, confrontation visual fields, and funduscopy. The ophthalmology OSCE provides a reliable educational tool for evaluating competence in these essential clinical skills.


References

1. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. BMJ 1975;1(5955):447-451.
2. Jain SS, Nadler S, Eyles M, Kirshblum S, DeLisa JA, Smith A. Development of an objective structured clinical examination (OSCE) for physical medicine and rehabilitation residents. Am J Phys Med Rehabil 1997;76(2):102-106.
3. Jones TV, Gerrity MS, Earp J. Written case simulations: do they predict physicians' behavior? J Clin Epidemiol 1990;43(8):805-815.
4. Benbassat J, Polak BCP, Javitt JC. Objectives of teaching direct ophthalmoscopy to medical students. Acta Ophthalmol (Copenh) 2012;90(6):503-507.
5. Asman P, Lindén C. Internet-based assessment of medical students' ophthalmoscopy skills. Acta Ophthalmol (Copenh) 2010;88(8):854-857.
6. Mottow-Lippa L, Boker JR. Simulator assessment of funduscopic skills in three consecutive medical school classes. J Acad Ophthalmol 2009;2(1):1-5.
7. Tamblyn RM, Klass DJ, Schnabl GK, Kopelow ML. The accuracy of standardized patient presentation. Med Educ 1991;25(2):100-109.
8. Vu NV, Marcy MM, Colliver JA, Verhulst SJ, Travis TA, Barrows HS. Standardized (simulated) patients' accuracy in recording clinical performance check-list items. Med Educ 1992;26(2):99-104.
9. Mottow-Lippa L. Restoring ophthalmology to the mainstream medical school curriculum. J Acad Ophthalmol 2008;1(2):62-64.
10. Mottow-Lippa L. Ophthalmology in the medical school curriculum: reestablishing our value and effecting change. Ophthalmology 2009;116(7):1235-1236, 1236.e1.
