CC BY-NC-ND 4.0 · Journal of Academic Ophthalmology 2019; 11(02): e24-e29
DOI: 10.1055/s-0039-1694041
Research Article

A Pilot Study on the Effects of Physician Gaze on Patient Satisfaction in the Setting of Electronic Health Records

Michael T. Ou,1 Hannah Kleiman,2 Sachin Kalarn,2 Ahmadreza Moradi,3 Shweta Shukla,2 Madalyn Danielson,2 Mona Kaleem,2 Michael Boland,4 Alan L. Robin,4,5,6 Osamah J. Saeedi2

1   Johns Hopkins University School of Medicine, Baltimore, Maryland
2   Department of Ophthalmology and Visual Sciences, University of Maryland School of Medicine, Baltimore, Maryland
3   Department of Medicine, Icahn School of Medicine at Mount Sinai, New York
4   Wilmer Eye Institute, Johns Hopkins University School of Medicine, Baltimore, Maryland
5   Department of Ophthalmology and Visual Sciences, Kellogg Institute, University of Michigan, Ann Arbor, Michigan
6   Department of International Health, Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland

Address for correspondence

Osamah Saeedi, MD, MS
Department of Ophthalmology and Visual Sciences, University of Maryland School of Medicine
419 West Redwood Street, Suite 470, Baltimore, MD 21201

Publication History

17 December 2018

24 June 2019

Publication Date: 13 August 2019 (online)

Abstract

This study aims to determine the amount of time ophthalmologists using electronic health records (EHRs) spend looking at the patient and how that time correlates with patient satisfaction. This prospective cohort study examined 67 patients seeking care at two different ophthalmology clinics. Entire office visits were video recorded, and each video was graded for the amount of time the physician spent gazing at the patient, the computer, paper medical records, or other areas. Videos were also graded for the amount of time the physician spent examining the patient and speaking during each visit. A patient satisfaction survey was administered at the end of each office encounter. Time of physician gaze toward the patient was correlated with satisfaction outcome measures. Ophthalmologists spent 28.0 ± 21.2% of the visit looking at the computer. Overall, patient satisfaction levels were very high (4.8 ± 0.5 on a five-point Likert's scale). Ophthalmologists spent the same amount of time looking at patients who were extremely satisfied (28.8 ± 16.7%) as at those who were not extremely satisfied (28.8 ± 15.9%). Ophthalmologists using EHRs spend nearly one-third of each patient visit looking at the computer, yet patient satisfaction levels are very high. The amount of time that the ophthalmologist gazes at the patient or the computer did not appear to affect patient satisfaction in this study. Further research still needs to be performed regarding the effects of EHRs on the patient experience. Physicians should continue to be sensitive to their patients' needs and approach the use of EHRs in patient encounters on an individual basis.


Introduction

Perhaps one of the oldest and most fundamental aspects of care is the patient–provider relationship. In fact, the quality of the patient–provider relationship may be directly related to the health of patients.[1] With increasing demands on providers, the ability to form and maintain such bonds may be affected.

Communication, comprising both verbal and nonverbal components, is fundamental to this relationship, as strong communication skills may be strongly associated with perceptions of medical competence.[2] Within nonverbal communication, clinician gaze may be a significant predictor of patient satisfaction, especially among female physicians.[3]

Electronic systems may disrupt this patient–provider relationship.[4] The progressive use of technology has changed the landscape of health care, with one of the most widely felt implementations being electronic health records (EHRs). Incentives and mandates in the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009[5] [6] have led to the adoption of EHRs as an essential technology for most U.S. medical practices, including ophthalmology practices.[7] While this technology has the potential to increase efficiency and decrease the cost of medical documentation and patient information retrieval,[6] EHR use has been shown to affect the visual, verbal, and postural connection between internal medicine clinicians and patients.[8] It has also been shown that patients believed exam room computers decreased the amount of interpersonal contact with internal medicine physicians.[9]

However, ophthalmology-specific investigations are necessary, as these clinicians may use unique ophthalmology-specific EHR content, such as fundus drawings, that other fields of medicine may not. A 2018 survey of American Academy of Ophthalmology members found that 72% of practices surveyed had implemented EHRs, but many ophthalmologists perceived a decrease in patients seen per day and a need for increased EHR usability.[7] These perceptions are warranted, as several studies of EHR efficiency for ophthalmologists show that EHR documentation is slower than paper charting.[10] [11] [12] However, the impact of EHRs on the patient–provider relationship needs further investigation. Given the uniqueness of ophthalmology practice, the aim of this study was to determine the amount of time ophthalmologists spend looking at the patient and whether this had any effect on patient satisfaction.

Methods

This prospective study was reviewed and approved by the University of Maryland Institutional Review Board and adhered to the Declaration of Helsinki as well as the Health Insurance Portability and Accountability Act (HIPAA). A total of 67 adult patient encounters were recorded among 67 different patients visiting seven different ophthalmologists. Of these 67 patients, 49 were seen in an outpatient setting at the University of Maryland Department of Ophthalmology by six ophthalmologists who used an EHR for documentation (Epic Systems, Verona, WI), while 18 were seen by an unaffiliated glaucoma specialist in an outpatient, private practice setting that used only paper charting. Ophthalmologists using the EHR still viewed some tests on paper, such as visual field testing results, while the ophthalmologist using paper charts had access to a computer in the room for reference and patient education. Videos were collected over a 3-month period. Each examination room had similar settings regarding location of the EHR, physician placement, and patient seating.

Among the seven ophthalmologists, four subspecialized in glaucoma, and the other three subspecialized in retina, neuro-ophthalmology, and oculoplastics, respectively. Participating ophthalmologists who consented to have their patient encounters recorded were informed that the study was to evaluate EHR use but were not told specifically how their behavior was being assessed. After the completion of data collection, all physicians were debriefed with detailed study goals and results.

Patients were identified prior to the physician–patient encounter and must have been seen previously by a physician in the practice, to reduce intervisit differences. Eligible patients were fully briefed on the study and enrolled if they provided informed consent. Patient encounters were excluded if the visit was scheduled for a procedure or a postoperative examination. Patients who withdrew, for whom there was an incomplete video recording, or for whom the treating physician did not consent to recording were also excluded.

Video Recording and Analysis

Patient encounters were recorded using a GoPro Hero 3 (GoPro Inc., San Mateo, CA). The camera was placed in an inconspicuous location; when possible, it was placed toward the right-hand side of the patient, as far behind the patient chair as possible while still maintaining a clear view of both the patient and physician. An encounter consisted of the entire time the physician and patient were both in the examination room. A research team member was present during the entirety of each recorded encounter and was able to stop recording at the request of the physician or patient. Recording was paused if either the physician or patient left the room, and the elapsed time was excluded from the total encounter time and from analysis.

A grading methodology was adapted from Montague and Asan.[13] Each encounter recording was graded for five physician behaviors, based on where the physician was looking at each second of the encounter. These categories were labeled "computer," "patient," "chart," "other," and "physician examining patient," and were mutually exclusive. "Physician examining patient" was given its own category because we deemed it to be a significantly different action than simply looking at the patient; it referred to any time the physician was performing an examination or instilling eye drops. The behavior "physician gaze other" included the physician gazing at anything besides the patient, computer, or paper charts. A sixth behavior, "physician talking," was also graded and could occur concurrently with any of the five gaze behaviors. "Physician talking" referred to any time the physician was speaking, with the exception of isolated affirmation words, such as "yes" or "okay," not associated with any other speech.

Three research team members were trained to grade videos second by second using a two-pass system. The first pass graded only the physician's point of gaze, whether at the computer, the patient, paper charts, during an examination, or none of the above. During the second pass, graders coded only whether the physician was speaking. Each grader was trained and certified by the primary investigator after grading a standard set of videos. Interrater reliability was assessed, and Cronbach's α was greater than 0.95 for each grader.
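To make the coding scheme concrete, the following sketch (not the study's actual software; the data layout, category labels, and function name are illustrative assumptions) shows how per-category gaze percentages and the overlapping talking percentage could be computed from a second-by-second coding of a single encounter.

```python
# Hypothetical sketch of the second-by-second, two-pass coding described above.
# Pass 1 assigns each second exactly one gaze code; pass 2 independently flags
# each second as talking or not (talking may overlap any gaze code).
from collections import Counter

GAZE_CODES = ("computer", "patient", "chart", "other", "exam")

def encounter_percentages(gaze_by_second, talking_by_second):
    """Return the percentage of the encounter spent in each gaze category,
    plus the percentage of seconds during which the physician was talking."""
    total = len(gaze_by_second)
    counts = Counter(gaze_by_second)
    pct = {code: 100.0 * counts.get(code, 0) / total for code in GAZE_CODES}
    pct["talking"] = 100.0 * sum(talking_by_second) / total
    return pct

# Toy 10-second encounter: 3 s at the computer, 4 s at the patient,
# 2 s examining, 1 s elsewhere; talking during 6 of the 10 seconds.
codes = ["computer"] * 3 + ["patient"] * 4 + ["exam"] * 2 + ["other"]
talk = [1, 1, 0, 1, 1, 1, 0, 0, 0, 1]
print(encounter_percentages(codes, talk))
# -> {'computer': 30.0, 'patient': 40.0, 'chart': 0.0, 'other': 10.0, 'exam': 20.0, 'talking': 60.0}
```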



Satisfaction Survey

Patients completed a satisfaction survey immediately following their recorded encounter with their ophthalmologists. They completed the surveys in the same private room as the clinical encounter, after the physician had left the room. The 10 questions listed in [Table 1] were used in a prior study on patient satisfaction after implementation of EHRs in a glaucoma practice.[14] Patients were asked to rate each statement on a five-point Likert's scale as follows: (1) strongly disagree, (2) disagree, (3) neutral, (4) agree, and (5) strongly agree. Of these questions, four were adapted from a validated survey.

Table 1 Survey questions[14]

Question | Mean ± SD
Q1: I am satisfied with the quality of care I received today. | 4.76 ± 0.46
Q2: My clinic visit was handled efficiently and smoothly. | 4.52 ± 0.77
Q3: I can talk to my doctor easily when he or she uses a computer. | 4.52 ± 0.71
Q4: My doctor is able to maintain good personal contact with me while using the computer. | 4.56 ± 0.61
Q5: My doctor seems comfortable with using the electronic medical record system. | 4.58 ± 0.64
Q6: My doctor directly mentioned the computer or electronic medical record during our conversation. | 4.18 ± 1.06
Q7: My visits are more efficient because the doctor uses an electronic medical record system. | 4.32 ± 0.87
Q8: I am comfortable with the idea of my doctor using a computer to track information about me. | 4.57 ± 0.75
Q9: Seeing my medical information in the form of charts or graphs would help me better understand my medical issues. | 4.19 ± 1.09
Q10: I prefer electronic medical record to paper charts. | 3.73 ± 1.21

Abbreviation: SD, standard deviation.
Note: survey choices: 1, strongly disagree; 2, disagree; 3, neutral; 4, agree; 5, strongly agree.




Statistical Analysis

For each question on the five-point Likert's scale, mean and median scores were calculated. Because satisfaction levels were high, "not extremely satisfied" was defined as not strongly agreeing (≤ 4 on the five-point scale), while "extremely satisfied" was defined as strongly agreeing (5 on the five-point scale). Level of satisfaction was based on question 1 of the survey.

Characteristics were compared between patients with a satisfaction score of 5 (extremely satisfied) and patients with a satisfaction score ≤ 4 ([Table 2]), as well as between the clinic using EHRs and the clinic using only paper charts. Numerical and categorical variables were compared among subgroups, with p-values calculated using the t-test and Chi-squared test, respectively. All p-values were nominal. All statistical analyses were performed using Stata software version 14.0 (StataCorp LP, College Station, TX).
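As an illustration of this subgroup comparison (a minimal sketch, not the Stata code used in the study; the variable names and example values below are hypothetical), question 1 responses are dichotomized into extremely satisfied (5) versus not extremely satisfied (≤ 4), and a t-test is applied to a numerical variable and a Chi-squared test to a categorical one.

```python
# Hypothetical sketch of the subgroup comparison described above,
# using SciPy in place of the Stata commands actually used in the study.
import numpy as np
from scipy import stats

# Q1 scores (1-5 Likert) and example variables for a made-up set of encounters.
q1 = np.array([5, 5, 4, 5, 3, 5, 4, 5, 5, 4])
pct_gaze_computer = np.array([22.0, 31.5, 40.2, 18.7, 55.0, 26.3, 38.9, 12.4, 29.8, 44.1])
sex = np.array(["M", "F", "F", "M", "F", "M", "F", "M", "F", "M"])

extremely_satisfied = q1 == 5  # 5 = strongly agree; <= 4 = not extremely satisfied

# Numerical variable: two-sample t-test between satisfaction subgroups.
t_stat, p_num = stats.ttest_ind(pct_gaze_computer[extremely_satisfied],
                                pct_gaze_computer[~extremely_satisfied])

# Categorical variable: Chi-squared test on the satisfaction-by-sex table.
table = np.array([[np.sum((sex == s) & extremely_satisfied),
                   np.sum((sex == s) & ~extremely_satisfied)] for s in ("M", "F")])
chi2, p_cat, dof, _ = stats.chi2_contingency(table)

print(f"t-test p = {p_num:.3f}, chi-squared p = {p_cat:.3f}")
```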

Table 2 Patient satisfaction

Characteristic | Extremely satisfied (n = 52) | Not extremely satisfied (n = 15) | p-Value
Physician gaze at computer (% of visit) | 25.8 ± 21.2 | 35.7 ± 19.9 | 0.11
Physician gaze at patient (% of visit) | 28.8 ± 16.7 | 28.8 ± 15.9 | 0.99
Physician gaze at chart (% of visit) | 13.4 ± 12.2 | 8.3 ± 9.9 | 0.14
Physician gaze at other (% of visit) | 13.9 ± 6.8 | 11.1 ± 3.7 | 0.13
Physician examining patient (% of visit) | 19.2 ± 11.6 | 16.3 ± 8.6 | 0.38
Physician talking (% of visit) | 59.2 ± 15.2 | 57.5 ± 14.4 | 0.69

Note: values are expressed as mean ± standard deviation. No comparisons were statistically significant (all p > 0.05).




Results

A total of 80 patients consented to this study. Seven visits that were either postoperative or procedure visits were excluded. One patient withdrew from the study, three patients were excluded because their visits were only partially recorded due to technical issues, and two patients who consented were not recorded at the request of the physician. The average patient age was 62 years, with 47.8% male and 52.2% female ([Table 3]). Of the ophthalmologists, five were female and two were male. There was one provider each of Caucasian, African American, South Asian, East Asian, and European background, and two providers were of Middle Eastern background.

Table 3 Baseline characteristics of participants

Characteristic | Value
Age in y, mean ± SD | 61.83 ± 16.46
Race, n (%) |
 White | 31 (46.3)
 Black | 35 (52.2)
 Other | 1 (1.5)
Sex, n (%) |
 Men | 32 (47.8)
 Women | 35 (52.2)
EHR, n (%) | 50 (74.6)
Encounters by physician subspecialty, n (%) |
 Oculoplastics specialist (one specialist) | 14 (20.9)
 Vitreous and retinal diseases specialist (one specialist) | 11 (16.4)
 Glaucoma specialists (four specialists) | 36 (53.7)
 Neuro-ophthalmology specialist (one specialist) | 6 (9.0)
 Total encounters | 67

Abbreviations: EHR, electronic health record; SD, standard deviation.


[Table 4] outlines the total time in minutes as well as the percentage of time for each category of physician gaze. The mean encounter lasted 16.4 ± 8.3 minutes, of which 28% on average was spent looking at the computer and 29% looking at the patient. About 18.5% of each visit consisted of time during which the physician examined the patient. On average, the physician spoke to the patient during 58.8% of the encounter.

Table 4 Physician gaze

Category | Mean total time ± SD (min) | Mean percentage of encounter time ± SD (%)
Total encounter | 16.4 ± 8.3 | N/A
Physician gaze at computer | 4.8 ± 5.3 | 28.0 ± 21.2
Physician gaze at patient | 4.7 ± 3.5 | 28.8 ± 16.4
Physician gaze at chart | 2.0 ± 2.4 | 12.6 ± 11.9
Physician gaze at other | 2.1 ± 1.5 | 13.3 ± 6.3
Physician examining patient | 2.4 ± 1.9 | 18.5 ± 11.0
Physician talking | 9.2 ± 4.5 | 58.8 ± 14.9

Abbreviations: N/A, not available; SD, standard deviation.


Patients were generally highly satisfied with the quality of care they received, giving an average score of 4.8 ± 0.5 on a five-point scale ([Table 1]). There were no statistical differences in satisfaction with respect to age, race, or sex. Furthermore, there were no statistical differences in satisfaction based on the various categories of physician gaze ([Table 2]).


#

Discussion

This is the first study, to our knowledge, to quantify physician gaze using detailed video recording and to compare the effect of physician gaze on patient satisfaction in an ophthalmology practice. We found that physicians spent roughly equal amounts of time looking at the computer and looking at the patient, each accounting for almost one-third of the clinic visit. However, patient satisfaction levels were high despite physicians gazing at the computer screen for a large portion of each visit. Differences in the various categories of physician gaze did not have a significant impact on patient satisfaction.

We initially hypothesized that a greater percentage of time spent looking directly at the patient would lead to higher levels of patient satisfaction. Previous studies showed that an increase in screen gazing was inversely related to emotional responses and socioemotional exchange, possibly diminishing patient-centered practice.[15] However, our study found that patients had high levels of satisfaction despite physicians spending the largest portion of their time looking toward the computer screen. The relatively high level of satisfaction measured using this survey instrument was the same as in the prior study conducted by Pandit and Boland, who found no change in patients' perspectives on quality and efficiency at 2 weeks and 6 months after EHR implementation. In fact, many patients even agreed that EHRs made their visit more efficient.[14]

The percentages of time physicians gazed toward the patient and the EHR were also similar to those in a previous study of primary care providers by Montague et al, in which physicians spent 30.7% of the visit looking at the EHR and 46.5% looking at the patient.[13] Ophthalmologists in our study spent a comparable amount of time looking at the EHR (27.5%) and, when gaze at the patient is combined with time spent examining the patient, looking at the patient (47.3%). In the study by Montague et al, increased time spent looking at the EHR took away from time looking toward the patient, which they hypothesized would be associated with decreased patient satisfaction. However, it was also proposed that EHRs could provide an opportunity to better engage patients in shared interactions and open avenues for the mutual exchange of information.[13]

Because of the high levels of satisfaction observed at baseline, we sought post hoc to differentiate between the highest satisfaction level and moderately high satisfaction levels. We therefore stratified participants into those who were extremely satisfied (5 on the Likert's scale) and those who were not extremely satisfied (≤ 4 on the Likert's scale), reasoning that patients who had any hesitation about their clinical visit would not have chosen the highest satisfaction rating. However, even with this categorization, the percentages of time the physician spent gazing at the computer, gazing at the patient, and talking were not significantly different between groups.

Several reasons may explain why physician gaze may not have had an impact on patient satisfaction. Patients in ophthalmology clinics often have visual impairments that decrease the value of direct face-to-face contact. Verbal communication may therefore play a larger role in effective communication, as intentions, feelings, and attitudes may all be discerned from semantic and phonetic cues.[16] Our study showed that physicians spent more than half (58.8 ± 14.9%) of clinic visits speaking to the patient, an action that can be concurrent with any type of physician gaze. In ophthalmology clinics, considerable verbal communication, particularly while looking at the computer, may compensate for the time physicians spend gazing at the screen. As such, our results are particularly important to the ophthalmic community, as studies in other areas of medicine, such as primary care, may not be applicable. Further research into which factors most strongly determine patient satisfaction would be worthwhile.

We only included established patients of the practice and excluded new patients; the majority of patients recorded had previously been seen by the same physician. An established patient–physician relationship may also have skewed satisfaction toward higher levels, as patients naturally tend to seek care from physicians they trust.

Patients may also expect EHR use and perceive EHRs as an indication of the quality of care received.[16] Technological advances within the past two decades have created a tendency toward digitalization within multiple industries, including medicine. Patients may have come to expect the use of technology within health care, and previous studies have shown a generally positive perception of EHR use.[16] Our study found that the majority of patients were comfortable with their physician using an EHR (4.6 ± 0.8, Likert's scale), believed that their visit was more efficient because of EHRs (4.3 ± 0.9, Likert's scale), and better understood their medical conditions by seeing medical information in charts and graphs (4.2 ± 1.0, Likert's scale).

Strengths of this study are its prospective assessment of patient satisfaction and a reproducible methodology for determining physician gaze in each interaction. There were also some notable limitations. As with any prospective study, sampling bias may have played a role in the high satisfaction scores obtained; those who are more trusting and supportive of providers may have a higher likelihood of participating in the study. Furthermore, some of the findings may be specific to individual practitioners, practices, and patient populations. In particular, glaucoma providers composed a significant proportion of the physicians included in the study. Our limited sample size, as well as our use of providers from different institutions, may also reduce the generalizability of our findings. Physicians were not informed of the video assessment procedures or study goals until the conclusion of the study; nonetheless, Hawthorne effects may have altered physicians' behavior regardless of their knowledge of study goals. Clinics using other types of EHRs may produce differing results, as no two clinics or EHR systems are alike, and we also did not account for the time patients spent waiting to be seen, a factor that may contribute to patient satisfaction. Prior review of the patient chart, whether inside or outside the examination room, may also alter patient interactions as well as patient perspectives; it would be worth investigating the effects of such chart review on patient satisfaction in future studies.

Ultimately, our work suggests that the transition to EHRs in an ophthalmology practice may change the physician–patient interaction but may not affect patient satisfaction. Further research still needs to be performed regarding the effects of EHRs on the patient experience. Physicians should continue to be sensitive to their patients' needs and approach the use of EHRs in patient encounters on an individual basis.



Conflict of Interest

M.B. reports personal fees from Heidelberg Engineering, outside the submitted work.

Acknowledgments

The authors would like to thank Dr. Matthew Shulman for his help and suggestions. Dr. Saeedi is funded through an NIH Career Development Award K23EY025014.

  • References

  • 1 Beach MC, Keruly J, Moore RD. Is the quality of the patient-provider relationship associated with better adherence and health outcomes for patients with HIV? J Gen Intern Med 2006; 21 (06) 661-665
  • 2 Buller MK, Buller DB. Physicians' communication style and patient satisfaction. J Health Soc Behav 1987; 28 (04) 375-388
  • 3 Mast MS, Hall JA, Klöckner C, Choi E. Physician gender affects how physician nonverbal behavior is related to patient satisfaction. Med Care 2008; 46 (12) 1212-1218
  • 4 Noordman J, Verhaak P, van Beljouw I, van Dulmen S. Consulting room computers and their effect on general practitioner-patient communication. Fam Pract 2010; 27 (06) 644-651
  • 5 Blumenthal D, Tavenner M. The “meaningful use” regulation for electronic health records. N Engl J Med 2010; 363 (06) 501-504
  • 6 Buntin MB, Jain SH, Blumenthal D. Health information technology: laying the infrastructure for national health reform. Health Aff (Millwood) 2010; 29 (06) 1214-1219
  • 7 Lim MC, Boland MV, McCannel CA, et al. Adoption of electronic health records and perceptions of financial and clinical outcomes among ophthalmologists in the United States. JAMA Ophthalmol 2018; 136 (02) 164-170
  • 8 Frankel R, Altschuler A, George S, et al. Effects of exam-room computing on clinician-patient communication: a longitudinal qualitative study. J Gen Intern Med 2005; 20 (08) 677-682
  • 9 Rouf E, Whittle J, Lu N, Schwartz MD. Computers in the exam room: differences in physician-patient interaction may be due to physician experience. J Gen Intern Med 2007; 22 (01) 43-48
  • 10 Chiang MF, Read-Brown S, Tu DC, et al. Evaluation of electronic health record implementation in ophthalmology at an academic medical center (an American Ophthalmological Society thesis). Trans Am Ophthalmol Soc 2013; 111: 70-92
  • 11 Chan P, Thyparampil PJ, Chiang MF. Accuracy and speed of electronic health record versus paper-based ophthalmic documentation strategies. Am J Ophthalmol 2013; 156 (01) 165-172.e2
  • 12 Redd TK, Read-Brown S, Choi D, Yackel TR, Tu DC, Chiang MF. Electronic health record impact on productivity and efficiency in an academic pediatric ophthalmology practice. J AAPOS 2014; 18 (06) 584-589
  • 13 Montague E, Asan O. Dynamic modeling of patient and physician eye gaze to understand the effects of electronic health records on doctor-patient communication and attention. Int J Med Inform 2014; 83 (03) 225-234
  • 14 Pandit RR, Boland MV. The impact of an electronic health record transition on a glaucoma subspecialty practice. Ophthalmology 2013; 120 (04) 753-760
  • 15 Margalit RS, Roter D, Dunevant MA, Larson S, Reis S. Electronic medical record use and physician-patient communication: an observational study of Israeli primary care encounters. Patient Educ Couns 2006; 61 (01) 134-141
  • 16 Lee WW, Alkureishi MA, Ukabiala O, et al. Patient perceptions of electronic medical record use by faculty and resident physicians: a mixed methods study. J Gen Intern Med 2016; 31 (11) 1315-1322
