CC BY-NC-ND 4.0 · Journal of Academic Ophthalmology 2021; 13(02): e270-e276
DOI: 10.1055/s-0041-1736438
Research Article

Smartphone Compatible versus Conventional Ophthalmoscope: A Randomized Crossover Educational Trial

Mark Xu, Daisy Liu, Jason Kwok, Wilma Hopman, Isabella Irrcher, Stephanie Baxter

Author affiliation: Kingston Health Sciences Centre-Kingston General Hospital Research Institute, Kingston, Ontario, Canada
Funding This work was funded by a Physicians' Services Incorporated (PSI) Resident Research Grant (Grant #R18-21) to M.X.
 

Abstract

Objective The aim of the study is to compare performance and ease-of-use (EOU) of optic disk assessment using a smartphone direct ophthalmoscope attachment (D-EYE) to the gold standard direct ophthalmoscope (DO).

Design This was a prospective, randomized, crossover educational trial.

Participants First-year medical students inexperienced in ophthalmoscopy.

Methods Optic disks of standardized and volunteer patients were examined using the D-EYE and a conventional DO. Optic disk identification, EOU ratings of the devices, self-reported confidence level in their examination with the devices, and estimation of vertical cup-to-disk ratio (VCDR) were compared. Analyses included Chi-square tests, independent samples t-tests, correlations, and multivariable linear regression.

Results Forty-four medical students voluntarily participated in the study. Students using the DO required more attempts (3.57 vs. 2.69, p = 0.010) and time (197.00 vs. 168.02 seconds, p = 0.043) to match the patient's fundus to the correct photograph. Overall EOU between the devices (6.40 vs. 4.79, p < 0.001) and overall confidence in examination (5.65 vs. 4.49, p = 0.003) were greater when using the D-EYE. There were no statistically significant differences in accuracy of VCDR estimations between the two ophthalmoscopes.

Conclusion Smartphone ophthalmoscopy could offer additional learning opportunities in medical education and may be considered in clinical practice by non-specialist physicians given its greater EOU and increased success in visualizing the optic disk.


Introduction

Direct ophthalmoscopy is a critical clinical skill, performed as part of the routine physical examination, that allows visualization of the fundus.[1] [2] While ophthalmoscopy allows for the diagnosis and screening of diseases including glaucoma, hypertension, and diabetes, it is often performed poorly and with minimal confidence by medical students and physicians.[2] [3] [4] [5] [6] [7] The decline in confidence and use of ophthalmoscopy is likely due to the marginalization of ophthalmic medical education and the intrinsic difficulty of this examination technique.[7] [8] [9] [10] [11] [12] However, a competent fundus examination, whether performed as part of a routine clinical evaluation or in a busy emergency department or inpatient ward, remains a duty of care for non-specialist physicians.[13]

In contrast to the widespread decline of direct ophthalmoscopy, a concurrent increase in the use of smartphone-adapted ophthalmoscopy has been documented.[14] [15] [16] [17] [18] [19] [20] [21] [22] [23] [24] [25] In particular, high-quality imaging of the anterior and posterior segments of the eye can be acquired with the phone's camera when used in conjunction with ophthalmic lenses.[14] [15] [16] [17] [18] [19] [20] [21] [22] [23] [24] [25] Smartphone images yield results clinically comparable to the standard of digital retinography and/or posterior segment biomicroscopy.[26] Significant agreement exists between smartphone ophthalmoscopy and standard slit lamp biomicroscopy when trained ophthalmologists grade diabetic retinopathy and optic nerve cup-to-disk ratios.[18] [19] As a result, the clinical use of smartphones may carry significant value, given that standard ophthalmoscopy is a difficult skill to master and is often performed poorly and with low confidence.

The smartphone direct ophthalmoscope attachment (D-EYE) provides a funduscopic view and captures images and video of the posterior segment in dilated and undilated eyes.[27] It has been studied in a medical student population for ease-of-use as well as for accuracy and quality of fundus examinations.[28] [29] [30] However, the full clinical and educational value of the D-EYE is not clear. The objectives of this study were to compare the ease-of-use and confidence of ophthalmoscopy-inexperienced medical students in using the D-EYE and the direct ophthalmoscope (DO) in volunteer and standardized patients through fundus photo matching and evaluation of the optic nerve.

Methods

Study Design and Overview

This prospective, randomized, crossover study was conducted at Queen's University in Kingston, Ontario, Canada and was approved by the Queen's University and Affiliated Hospitals Health Sciences Research Ethics Board (TRAQ 6024503). All students provided written informed consent prior to participation.



Study Population and Inclusion/Exclusion Criteria

In total, 44 first year Queen's University medical students inexperienced in ophthalmoscopy were enrolled in this study. Students with a self-declared expertise in ophthalmoscopy (direct ophthalmoscopy or D-EYE) were excluded to minimize proficiency bias.

At our institution, a clinical skills session introducing the use of the DO occurs in the fall of first year, with further skills training occurring later in the second year of medical training. Data were collected from two separate first-year cohorts of students (November 2017 and 2018).

Data Collection and Instruments

The D-EYE generates images using co-axial illumination and a beam splitter in conjunction with the smartphone.[31] It operates with the D-EYE app to capture images and videos that can be stored and played back, a function students were permitted to use during the study.

Prior to randomization, students participated in a 15-minute didactic lecture on the use of both ophthalmoscopes and were given basic instruction on normal disk anatomy and estimation of the vertical cup-to-disk ratio (VCDR). A 30-minute practice session followed, with guidance from senior medical students and residents trained in the use of both devices. Students were randomized (www.randomizer.org) to which ophthalmoscope they used first.
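For illustration only, a minimal sketch of how a balanced starting-device allocation for a two-period crossover could be generated is shown below; the study itself used www.randomizer.org, and the participant numbering here is hypothetical.

```python
# Illustrative sketch only (the study used www.randomizer.org): allocate a
# balanced starting device for a two-period crossover design.
import random

def assign_starting_device(n_participants: int, seed: int = 2017) -> dict:
    """Return a hypothetical {participant_id: first_device} allocation."""
    rng = random.Random(seed)
    half = n_participants // 2
    starts = ["D-EYE"] * half + ["DO"] * (n_participants - half)
    rng.shuffle(starts)
    return {pid: device for pid, device in enumerate(starts, start=1)}

if __name__ == "__main__":
    for pid, first in assign_starting_device(44).items():
        second = "DO" if first == "D-EYE" else "D-EYE"
        print(f"Student {pid:02d}: {first} first, then {second}")
```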

Cohort 1

The first student cohort (n = 26) examined four standardized patients situated in four identical objective structured clinical examination (OSCE)-style examination lanes. Students began in a given room and, after completing the task, moved to the next designated room. Students were allowed 10 minutes to examine both eyes of one standardized patient with their assigned ophthalmoscope (either D-EYE or DO), then crossed over to the next phase of the study to examine the eyes of a different standardized patient using the second ophthalmoscope. In total, students completed 52 fundus examinations comprising 104 eyes that were used in the analyses. A research assistant was available in each room to address technical or logistic difficulties and to facilitate study execution, but was directed to refrain from providing additional instruction during the study. Two of the standardized patients had their left eye dilated and two had their right eye dilated; this was determined at random using www.randomizer.org.

Following each examination, students were asked to match the live patient's optic nerve to their fundus photograph that appeared amongst a randomly generated nine-photo collage, as previously described.[32] In addition, students were asked to rate the ease-of-use (EOU) of the instrument and confidence level with their examination, and to document the VCDR of each eye in the data collection form ([Fig. 1]). Students were allowed unlimited attempts to match the fundus photo with their examination until a correct match was made. Within the 10-minute time limit, students were also allowed to re-examine the patient as necessary.

Fig. 1 Data collection form 2018 study cohort. The data collection form for the 2019 study cohort was identical with the exception of an added abnormal/normal disk section. Students were asked to choose from one of four options: unable to assess/did not see disk, unsure, normal or abnormal in response to the prompt “was this an overall normal or abnormal disk.”


Cohort 2

The second student cohort followed the same study design with the exception of four additional volunteer patients specifically chosen due to the presence of optic nerve pathology (increased cup-to-disk ratios [n = 2], optic disk drusen [n = 1], disk pallor from previous non-arteritic ischemic optic neuropathy [NAION; n = 1]). Students were aware that some patients had abnormal optic nerve pathology but were unaware of which patients or how many eyes were abnormal. We added this evaluation to the second cohort due to our perceived clinical importance of an overall assessment of optic nerve health by trainees, and eventual generalists, over the accurate determination of the vertical cup-to-disk ratio.

As in cohort 1, each volunteer patient was randomized to have either the right or left eye dilated. Students examined a total of two volunteer patients and two standardized patients (eight eye examinations in total: four dilated eyes, four nondilated eyes). In total, students in cohort 2 examined 144 eyes that were used in the analyses. Across both cohorts, 124 individual examinations were performed, comprising 248 eyes ([Fig. 2]).

Fig. 2 Study flow diagram. *Each medical student participant was randomized to start in the first room using either the D-EYE or DO, and then moved through subsequent rooms in an OSCE style progression designed to alternate left eye/right eye dilation and device used. In each room the medical student examined both eyes of the patient. DO, direct ophthalmoscope; OSCE, objective structured clinical examination.


Outcomes

The primary outcome of our study was the difference between the medical student's recorded estimation of VCDR as compared with a reference standard (established by an independent ophthalmologist using slit-lamp biomicroscopy and corroborated with disk photographs). Secondary outcomes included the number of attempts and time required to correctly match the disk photograph to one of nine fundus photos, EOU rating, and confidence level with the examination method.

An additional outcome assessed in the second cohort was whether students could correctly determine if the optic nerve was grossly normal or abnormal.



Sample Size Determination

To detect a difference of 0.1 in VCDR estimation between the two devices, using a standard deviation of 0.14, power of 80%, and statistical significance of p = 0.05 (based on a previous study with a similar scoring system),[12] a minimum of 31 students were required. Two cohorts were recruited to ensure that an adequate number of participants were included in the study.
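The reported minimum of 31 students is consistent with the standard normal-approximation formula for comparing two group means; a sketch of that arithmetic is shown below, with the caveat that the exact calculation method used in the study is not specified, so this reconstruction is an assumption.

```python
# Sketch of the sample-size arithmetic using the normal approximation for a
# two-group comparison of means; illustrative only.
from math import ceil
from scipy.stats import norm

delta = 0.10            # detectable difference in VCDR estimation
sd = 0.14               # assumed standard deviation
alpha, power = 0.05, 0.80

z_alpha = norm.ppf(1 - alpha / 2)   # ~1.96 for a two-sided test
z_beta = norm.ppf(power)            # ~0.84

n_per_group = 2 * ((z_alpha + z_beta) * sd / delta) ** 2
print(ceil(n_per_group))            # -> 31, matching the reported minimum
```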



Statistics

Data were collected in an Excel file designed for the study and imported into IBM SPSS (IBM SPSS Statistics for Windows, Version 25.0, Armonk, NY, 2018) for statistical analysis. The two cohorts were initially compared based on characteristics (age and sex) and outcomes; as there were no significant differences between the cohorts, the data from both cohorts were pooled to include 44 students (124 examinations in total and 248 eyes). The Shapiro-Wilk test was used to assess the underlying distribution of the continuous data. Chi-square tests (Pearson or Fisher's exact test, as appropriate) were used to compare frequencies for the categorical data, including guessing, self-reported optic disk identification, and abnormal/normal disk identification. Independent samples t-tests or Mann-Whitney U-tests were used to compare the continuous data (right and left eye EOU, overall device EOU, and confidence in examination). Spearman correlations were used to assess associations between the continuous data. The fundus matching data (number of attempts and time in seconds) were also assessed using a multivariable linear regression model to control for eye, dilation, and repeated student assessments while examining assessment method. Differences were considered statistically significant if p < 0.05, and no adjustments were made for multiple comparisons.
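As a hedged illustration, the same families of tests can be run outside SPSS; the sketch below uses scipy with placeholder arrays for the continuous outcomes, while the 2×2 guessing table uses the right-eye counts reported in Table 3.

```python
# Illustrative sketch of the test types described above, using scipy rather
# than SPSS (which the study used). The EOU arrays are hypothetical
# placeholders; the guessing counts are taken from Table 3 (right eye).
import numpy as np
from scipy import stats

# Continuous outcome (e.g., overall device EOU): check normality, then use a
# t-test or Mann-Whitney U test as appropriate.
eou_deye = np.array([6, 7, 5, 8, 6, 7, 8, 5])   # placeholder ratings
eou_do = np.array([5, 4, 6, 5, 4, 5, 6, 4])

normal = (stats.shapiro(eou_deye).pvalue > 0.05
          and stats.shapiro(eou_do).pvalue > 0.05)
if normal:
    stat, p = stats.ttest_ind(eou_deye, eou_do)
else:
    stat, p = stats.mannwhitneyu(eou_deye, eou_do)

# Categorical outcome: Pearson chi-square on a 2x2 table of counts.
# Right-eye guessing from Table 3: D-EYE 20/62, DO 32/62.
guessing = np.array([[20, 42],
                     [32, 30]])
chi2, p_chi, dof, _ = stats.chi2_contingency(guessing, correction=False)
print(round(chi2, 3), round(p_chi, 3))  # ~4.769, ~0.029, as in Table 3

# Association between continuous measures: Spearman correlation.
rho, p_rho = stats.spearmanr(eou_deye, eou_do)
```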



Results

In total, 44 first year medical students (26 in cohort 1 and 18 in cohort 2) participated in the study. Student demographics are summarized in [Table 1].

Table 1 Baseline characteristics of study students

| Characteristics | Cohort 1 (n = 26) | Cohort 2 (n = 18) | Total (n = 44) |
|---|---|---|---|
| Age, mean ± SD (range) | 23.1 ± 3.7 (20–34) | 22.49 ± 2.26 (20–29) | 22.74 ± 2.97 (20–34) |
| Gender: male, n (%) | 12 (46.2%) | 9 (50%) | 21 (47.7%) |
| Gender: female, n (%) | 14 (53.8%) | 9 (50%) | 23 (52.3%) |

VCDR Estimations

VCDR estimations, the primary outcome of our study, were not significantly different between the D-EYE and the DO in either the right (p = 0.132) or the left (p = 0.055) eye. Students were able to assign a value to the optic disk more often with the D-EYE versus the DO ([Table 2]).

Table 2 Pooled vertical cup-to-disk ratio results

| Examined eye | Ophthalmoscope | Number of measurements | VCDR mean difference (SD) | p-Value |
|---|---|---|---|---|
| Right | D-EYE | 44 | −0.159 (0.27) | 0.132 |
| Right | DO | 32 | −0.072 (0.21) | |
| Left | D-EYE | 41 | −0.090 (0.24) | 0.055 |
| Left | DO | 33 | 0.020 (0.25) | |

Abbreviations: D-EYE, Digital Eye Smartphone Attachment; DO, direct ophthalmoscope; SD, standard deviation; VCDR, vertical cup-to-disk ratio.


Note: Means were calculated from measurements where students had selected a VCDR value versus choosing “unsure or did not see disk” from the data collection form ([Fig. 1]). The number of measurements in each group therefore reflect how many examinations were completed where a VCDR could be determined. VCDR mean difference was determined by subtracting the students' estimate of VCDR from the correct reference VCDR.




Ease-of-Use and Optic Disk Identification

As shown in [Table 3], students reported seeing the right and left optic disks of the patient more often when using the D-EYE. Guessing, defined a priori as the student selecting "unable to assess/did not see the disk" and/or "EOU ≤4" in the data collection form ([Fig. 1]), also occurred less often when students used the D-EYE. Finally, reported ease-of-use (EOU) when examining the right eye and left eye, overall device EOU (the student's general impression of the device on a Likert scale), and overall confidence in examination were all greater when using the D-EYE ([Tables 3] and [4]).

Table 3 Ease of use and guessing

Right eye:

| Parameter | D-EYE (N = 62) | DO (N = 62) | Chi-square (p-Value) |
|---|---|---|---|
| Guessing, % (N) | 32.3 (N = 20) | 51.6 (N = 32) | 4.769 (0.029) |
| EOU 5+, % (N) | 75.8 (N = 47) | 61.3 (N = 38) | 3.030 (0.082) |
| Device EOU 1–8, mean score ± SD | 5.94 ± 1.697 | 5.13 ± 1.465 | p = 0.005 |

Left eye:

| Parameter | D-EYE (N = 62) | DO (N = 62) | Chi-square (p-Value) |
|---|---|---|---|
| Guessing, % (N) | 35.5 (N = 22) | 54.8 (N = 34) | 4.689 (0.030) |
| EOU 5+, % (N) | 87.1 (N = 54) | 59.7 (N = 37) | 11.933 (0.001) |
| Device EOU 1–8, mean score ± SD | 5.95 ± 1.520 | 5.15 ± 1.304 | p = 0.002 |

Abbreviations: D-EYE, digital eye smartphone attachment; DO, direct ophthalmoscope; SD, standard deviation.


Note: Pooled study outcomes comparing DO with D-EYE use, N = 248, each eye N = 124. We did not find any age or gender differences; however, right eye and left eye VCDR performance, right and left eye EOU, overall EOU, and confidence levels were significantly correlated (p < 0.01). Guessing was defined as choosing "unable to assess/did not see disk" and/or EOU ≤4 ([Fig. 1]).


Table 4 Overall device ease of use and self-reported confidence in exam

| Parameter | D-EYE | DO | p-Value |
|---|---|---|---|
| Overall EOU, mean score ± SD | 6.40 ± 2.30 | 4.79 ± 2.21 | <0.001 |
| Confidence 1–10, mean score ± SD | 5.65 ± 2.23 | 4.49 ± 1.96 | 0.003 |

Abbreviations: D-EYE, digital eye smartphone attachment; DO, direct ophthalmoscope; SD, standard deviation.


Note: Pooled study outcomes comparing DO with D-EYE use, N = 248, each eye N = 124.




Fundus Matching

Students using the DO required more attempts and more time to match the patient's fundus to the correct photograph ([Table 5]). Multivariable linear regression controlling for eye, dilation, and repeated student measurements showed that on average, one additional attempt (B = 0.879, p = 0.009), and 30 more seconds (B = 29.222, p = 0.041) were needed to correctly match the fundus using the DO.
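A hedged sketch of how such a model could be specified outside SPSS is shown below, using a mixed linear model with a random intercept per student; the column names, file name, and exact model specification are illustrative assumptions rather than the study's actual SPSS analysis.

```python
# Hedged sketch: regress matching performance on device, eye, and dilation
# with a random intercept per student to account for repeated assessments.
# The study used SPSS; the long-format column names and file name here are
# hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Expected columns: student_id, device ("D-EYE"/"DO"), eye ("right"/"left"),
# dilated (0/1), attempts, time_seconds
df = pd.read_csv("fundus_matching_long.csv")  # placeholder file name

attempts_model = smf.mixedlm(
    "attempts ~ C(device, Treatment(reference='D-EYE')) + C(eye) + dilated",
    data=df,
    groups=df["student_id"],
).fit()
print(attempts_model.summary())
# The device coefficient corresponds to the reported ~0.9 extra attempts with
# the DO; refitting with time_seconds as the outcome gives the ~29-second
# difference.
```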

Table 5 Optic disk matching

| Parameter | D-EYE | DO | p-Value |
|---|---|---|---|
| Number of attempts, mean ± SD | 2.69 ± 2.46 (248[a]) | 3.57 ± 2.87 | 0.01 |
| Time in seconds, mean ± SD | 168.02 ± 98.15 (246[a]) | 197.00 ± 123.56 | 0.043 |

Abbreviations: D-EYE, Digital Eye Smartphone Attachment; DO, Direct Ophthalmoscope; SD, standard deviation.


a Measurements represent a pooled dataset from cohort 1 and 2.




Abnormal/Normal Disk Identification

In cohort 2, the correct identification of an abnormal or a normal disk using either the DO or D-EYE was determined. Those who chose unsure or did not see the disk were not included in the analysis. Significantly more correct identifications were made with the D-EYE when examining the right eye (38.9 vs. 16.7%; χ²(1, N = 72) = 4.431, p = 0.035). A similar but non-significant trend was seen for the left eye (44.4 vs. 30.6%; χ²(1, N = 72) = 1.481, p = 0.224).
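For transparency, the reported chi-square values can be reproduced from the percentages and N = 72, assuming 36 included examinations per device for each eye; the counts below are back-calculated from the percentages, so this reconstruction is an assumption rather than raw study data.

```python
# Back-calculated counts: 38.9%/16.7% of an assumed 36 examinations per device
# (right eye) and 44.4%/30.6% of 36 (left eye). Reconstruction is illustrative.
from scipy.stats import chi2_contingency

right_eye = [[14, 22],   # D-EYE: correct, incorrect
             [6, 30]]    # DO: correct, incorrect
left_eye = [[16, 20],    # D-EYE
            [11, 25]]    # DO

for label, table in (("right", right_eye), ("left", left_eye)):
    chi2, p, dof, _ = chi2_contingency(table, correction=False)
    print(f"{label} eye: chi2({dof}, N=72) = {chi2:.3f}, p = {p:.3f}")
# Right eye reproduces the reported 4.431 (p = 0.035); left eye reproduces
# 1.481 (reported p = 0.224).
```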



Discussion

Ophthalmic medical education, and in particular ophthalmoscopy competency, has experienced a steady and concerning marginalization within the medical school curriculum.[2] [4] Traditional ophthalmoscopy is a difficult yet critical skill that is often poorly performed, and users consistently report low confidence in its use.[3] Newer smartphone-adapted models may be an important modernization to help practitioners achieve a competent fundus examination. In this study we found that undergraduate medical students naïve to ophthalmoscopy preferred, and felt more confident using, a smartphone attachment over the DO to visualize the optic disk and identify ophthalmic pathology. They also took less time and made fewer errors with the D-EYE when asked to match a disk photo to a live patient in the examination chair.

While previous studies have shown similar results, no other study has compared the D-EYE device with the DO with respect to the number of attempts and time taken to correctly match a fundus examination to a color photograph panel.[28] [29] [30] This is a unique outcome measure, as it requires a given student to visualize the patient's optic nerve in enough detail to select the identical nerve out of a randomly generated series of images. Optic disk identification facilitated through an online fundus photograph matching program is objective, measurable, and has clinical applicability.[32] [33]

We also found several statistically significant and clinically relevant advantages of the D-EYE device. First, medical students were significantly more successful at visualizing the optic disk using the D-EYE smartphone ophthalmoscope, despite having had more experience with the DO in the form of a previous clinical skills workshop, and despite having had more opportunity to practice with the DO during the study's practice session owing to the number of available devices. Second, even after controlling for left versus right eye and dilation, our results showed that students needed one more attempt, and a full 30 seconds longer on average, to correctly match the fundus image using the DO compared with the D-EYE. These findings are not only statistically significant but also clinically relevant. Fewer attempts to make a correct selection imply that students were more easily able to see enough fundus detail to match disk appearance, vessels, and other optic nerve head features. Lastly, less time to gain an adequate view to correctly match the optic disk image may translate into a more efficient posterior eye examination, saving valuable minutes in clinic and sparing the patient the discomfort of a prolonged examination. This time-saving aspect of smartphone ophthalmoscopy is especially desirable in our current practice climate, where time pressures make it unlikely that non-specialist clinicians will spend an extra few moments struggling with a difficult-to-perform clinical skill.

It should be noted that while the DO has changed minimally over the past several decades, the D-EYE requires an updated attachment as smartphone technology advances ($109 USD for the model-specific housing in addition to the $435 USD for the device itself at the time of publication), introducing a potential financial limitation of the device.

Our study has limitations that warrant mention. First, our initial cohort resulted in a sample size too small to derive meaningful inferences, and we therefore elected to recruit a second cohort. Second, VCDR estimation, the primary outcome in our study, did not differ significantly between devices. However, the clinical utility of this measure in inexperienced trainees such as first-year medical students, or even non-specialist physicians, is questionable. We initially chose VCDR estimation as our primary outcome because of its convenient numeric grading system and its use in similar studies.[19] [25] Arguably, a more meaningful measure is the identification of a grossly normal versus abnormal disk, which we included in the second cohort and which showed a statistically significant difference favoring the D-EYE. Enabling the physician to appropriately monitor or refer a patient following funduscopic examination based on gross pathology is arguably more clinically relevant than the estimation of a vertical cup-to-disk ratio. In light of this finding, and the better performance of ophthalmoscopy-inexperienced first-year medical students with the D-EYE, future studies are planned to explore the ease and adoptability of smartphone ophthalmoscopy in emergency and rural settings, where a competent funduscopic examination by a non-specialist physician may save a patient's vision. Even though students were more successful with the D-EYE in identifying abnormal nerves, overall performance with either modality was poor, emphasizing the difficulty of ophthalmoscopy regardless of the method used and the need for user-friendly instruments to aid in the interpretation of the optic nerve. Providing medical students with opportunities to practice ophthalmoscopy using a variety of optic disk examination methods may be advantageous to ensure that they gain familiarity with the different devices and technologies available to them in future practice.



Conclusion

With the advent of smartphone ophthalmoscopy, it is feasible that fundus examinations could be performed proficiently and with confidence by general and emergency practitioners, and this should be tested. The added exportability of videos and images with the smartphone-adapted ophthalmoscope would also allow for remote consultations. With these advantages in mind, a future for this technology in medical education, clinical use, and tele-ophthalmology could be considered.



Conflict of Interest

R.C., M.X., D.L., J.K., W.H., I.I., and S.B. report grants from Physicians' Services Incorporated during the conduct of the study.

  • References

  • 1 Benbassat J, Polak BC, Javitt JC. Objectives of teaching direct ophthalmoscopy to medical students. Acta Ophthalmol 2012; 90 (06) 503-507
  • 2 Mottow-Lippa L. Ophthalmology in the medical school curriculum: reestablishing our value and effecting change. Ophthalmology 2009; 116 (07) 1235-1236 , 1236.e1
  • 3 Gupta RR, Lam WC. Medical students' self-confidence in performing direct ophthalmoscopy in clinical training. Can J Ophthalmol 2006; 41 (02) 169-174
  • 4 Lippa LM, Boker J, Duke A, Amin A. A novel 3-year longitudinal pilot study of medical students' acquisition and retention of screening eye examination skills. Ophthalmology 2006; 113 (01) 133-139
  • 5 Megbelayin EO, Asana EU, Nkanga GD. et al. Evaluation of competence of medical students in performing direct ophthalmoscopy. Niger J Ophthalmol 2014; 22 (02) 73-77
  • 6 Basilious A, Cheng J, Buys YM. Comparison of glaucoma knowledge and referral practices among family physicians with ophthalmologists' expectations. Can J Ophthalmol 2015; 50 (03) 202-208
  • 7 Mackay DD, Garza PS, Bruce BB, Newman NJ, Biousse V. The demise of direct ophthalmoscopy: a modern clinical challenge. Neurol Clin Pract 2015; 5 (02) 150-157
  • 8 Noble J, Somal K, Gill HS, Lam WC. An analysis of undergraduate ophthalmology training in Canada. Can J Ophthalmol 2009; 44 (05) 513-518
  • 9 Shah M, Knoch D, Waxman E. The state of ophthalmology medical student education in the United States and Canada, 2012 through 2013. Ophthalmology 2014; 121 (06) 1160-1163
  • 10 Fan JC, Sherwin T, McGhee CN. Teaching of ophthalmology in undergraduate curricula: a survey of Australasian and Asian medical schools. Clin Exp Ophthalmol 2007; 35 (04) 310-317
  • 11 Baylis O, Murray PI, Dayan M. Undergraduate ophthalmology education—a survey of UK medical schools. Med Teach 2011; 33 (06) 468-471
  • 12 McComiskie JE, Greer RM, Gole GA. Panoptic versus conventional ophthalmoscope. Clin Exp Ophthalmol 2004; 32 (03) 238-242
  • 13 Clarkson JG. Training in ophthalmology is critical for all physicians. Arch Ophthalmol 2003; 121 (09) 1327-1327
  • 14 Dalay S, Umar F, Saeed S. Fundoscopy: a reflection upon medical training?. Clin Teach 2013; 10 (02) 103-106
  • 15 Statista. Number of smartphone users worldwide from 2014 to 2020 (in billions). ; 2017. Accessed November 1, 2019 at: https://www.statista.com/statistics/330695/number-of-smartphone-users-worldwide/
  • 16 Myung D, Jais A, He L, Blumenkranz MS, Chang RT. 3D printed smartphone indirect lens adapter for rapid, high quality retinal imaging. J Mob Technol Med 2014; 3 (01) 9-15
  • 17 Bastawrous A, Giardini ME, Jordan S. Peek: Portable Eye Examination Kit. The Smartphone Ophthalmoscope. Invest Ophthalmol Vis Sci 2014; 55 (13) 1612
  • 18 Russo A, Morescalchi F, Costagliola C, Delcassi L, Semeraro F. Comparison of smartphone ophthalmoscopy with slit-lamp biomicroscopy for grading diabetic retinopathy. Am J Ophthalmol 2015; 159 (02) 360-364.e1
  • 19 Russo A, Mapham W, Turano R. et al. Comparison of smartphone ophthalmoscopy with slit-lamp biomicroscopy for grading vertical cup-to-disc ratio. J Glaucoma 2016; 25 (09) e777-e781
  • 20 Shanmugam MP, Mishra DK, Madhukumar R, Ramanjulu R, Reddy SY, Rodrigues G. Fundus imaging with a mobile phone: a review of techniques. Indian J Ophthalmol 2014; 62 (09) 960-962
  • 21 Colicchia G, Wiesner H. Looking into the eye with a smartphone. Phys Teach 2015; 53 (02) 106-108
  • 22 Kulendran M, Lim M, Laws G. et al. Surgical smartphone applications across different platforms: their evolution, uses, and users. Surg Innov 2014; 21 (04) 427-440
  • 23 Dyaberi R, Bajantri YB, Khatib ZI, Hedge S, Khanna V. Smartphone indirect ophthalmoscopy: for screening, and documentation of the ocular fundus. J Vis 2015; 1 (01) 13
  • 24 Bastawrous A. Smartphone fundoscopy. Ophthalmology 2012; 119 (02) 432-433.e2 , author reply 433
  • 25 Dao D, Shah N, Tamhankar M. et al. Smartphone Ophthalmoscopy (D-EYE System) for Detection of Optic Nerve Pathology and Cup-to-Disk Ratio in an Outpatient Clinical Setting. Poster session presented at: The Association for Research in Vision and Ophthalmology 2017 Annual Meeting; May 7–11, 2017; Baltimore, MD
  • 26 Vilela MA, Valença FM, Barreto PK, Amaral CE, Pellanda LC. Agreement between retinal images obtained via smartphones and images obtained with retinal cameras or fundoscopic exams—systematic review and meta-analysis. Clin Ophthalmol 2018; 12: 2581-2589
  • 27 D-EYE Smartphone-Based Retinal Imaging System. Accessed November 1, 2019 at: https://www.D-EYEcare.com/en_US/product#features
  • 28 Mamtora S, Sandinha MT, Ajith A, Song A, Steel DHW. Smart phone ophthalmoscopy: a potential replacement for the direct ophthalmoscope. Eye (Lond) 2018; 32 (11) 1766-1771
  • 29 Kim Y, Chao DL. Comparison of smartphone ophthalmoscopy vs conventional direct ophthalmoscopy as a teaching tool for medical students: the COSMOS study. Clin Ophthalmol 2019; 13: 391-401
  • 30 Wu AR, Fouzdar-Jain S, Suh DW. Comparison study of funduscopic examination using a smartphone-based digital ophthalmoscope and the direct ophthalmoscope. J Pediatr Ophthalmol Strabismus 2018; 55 (03) 201-206
  • 31 Russo A, Morescalchi F, Costagliola C, Delcassi L, Semeraro F. A novel device to exploit the smartphone camera for fundus photography. J Ophthalmol 2015; 2015: 823139
  • 32 Bénard-Séguin É, Kwok J, Liao W, Baxter S. Use of a fundus photograph matching program in imparting proficiency in ophthalmoscopy. Can J Ophthalmol 2018; 53 (05) 480-485
  • 33 Kwok J, Liao W, Baxter S. Evaluation of an online peer fundus photograph matching program in teaching direct ophthalmoscopy to medical students. Can J Ophthalmol 2017; 52 (05) 441-446

Address for correspondence

Stephanie Baxter, MD, FRCSC
Department of Ophthalmology, Kingston Health Sciences Centre-Hotel Dieu Hospital Site and Queen's University
166 Brock Street, Kingston, ON, K7L 5G2

Publication History

Received: 30 January 2021

Accepted: 30 July 2021

Article published online:
25 December 2021

© 2021. The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonDerivative-NonCommercial License, permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/)

Thieme Medical Publishers, Inc.
333 Seventh Avenue, 18th Floor, New York, NY 10001, USA

