J Am Acad Audiol 2020; 31(01): 030-039
DOI: 10.3766/jaaa.18049

Visual Reliance During Speech Recognition in Cochlear Implant Users and Candidates

Aaron C. Moberly, Kara J. Vasil, Christin Ray
Department of Otolaryngology – Head & Neck Surgery, The Ohio State University, Columbus, OH

Publication Date: 25 May 2020 (online)

Abstract

Background:

Adults with cochlear implants (CIs) are believed to rely more heavily on visual cues during speech recognition tasks than their normal-hearing peers. However, the relationship between auditory and visual reliance during audiovisual (AV) speech recognition is unclear and may depend on an individual’s auditory proficiency, duration of hearing loss (HL), age, and other factors.

Purpose:

The primary purpose of this study was to examine whether visual reliance during AV speech recognition depends on auditory function for adult CI candidates (CICs) and adult experienced CI users (ECIs).

Study Sample:

Participants included 44 ECIs and 23 CICs. All participants were postlingually deafened and had met clinical candidacy requirements for cochlear implantation.

Data Collection and Analysis:

Participants completed City University of New York (CUNY) sentence recognition testing. Three separate lists of twelve sentences each were presented: the first in the auditory-only (A-only) condition, the second in the visual-only (V-only) condition, and the third in the combined AV condition. Each participant's "visual enhancement" (VE) and "auditory enhancement" (AE) were computed (i.e., the benefit to AV speech recognition of adding visual or auditory information, respectively, relative to what could potentially be gained). Relative reliance on VE versus AE was also computed as the VE/AE ratio.
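
The exact scoring formulas are not given in the abstract; the following is an assumed illustration using the conventional normalization of Sumby and Pollack (1954), in which enhancement is expressed relative to the room available for improvement, with A, V, and AV denoting percent-correct scores in the A-only, V-only, and AV conditions:

VE = (AV - A) / (100 - A), AE = (AV - V) / (100 - V), VE/AE ratio = VE / AE

For example, under these assumed formulas, a listener scoring 40% A-only, 20% V-only, and 70% AV would have VE = 0.50, AE = 0.63, and a VE/AE ratio of approximately 0.80, indicating somewhat greater auditory than visual enhancement.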

Results:

The VE/AE ratio was inversely predicted by A-only performance. Visual reliance did not differ significantly between ECIs and CICs. Duration of HL and age did not account for additional variance in the VE/AE ratio.

Conclusions:

A shift toward visual reliance may be driven by poor auditory performance in both ECIs and CICs. The restoration of auditory input through a CI does not necessarily facilitate a shift back toward auditory reliance. Findings suggest that individual listeners with HL may rely on both auditory and visual information during AV speech recognition, to degrees that vary with their own performance and experience, in order to optimize communication in real-world listening situations.

This work was supported by the American Otological Society Clinician-Scientist Award and by National Institutes of Health/National Institute on Deafness and Other Communication Disorders (NIH/NIDCD) Career Development Award 5K23DC015539-02 to Aaron Moberly. Research reported in this paper received IRB approval from The Ohio State University.


A.C.M. receives grant funding support from Cochlear Americas for an unrelated investigator-initiated research study.


Data from this manuscript were presented at AAA 2018, the annual conference of the American Academy of Audiology, Nashville, TN, April 18–21, 2018.


 
  • REFERENCES

  • Altieri NA, Pisoni DB, Townsend JT. 2011; Some normative data on lip-reading skills. J Acoust Soc Am 130 (01) 1-4
  • Boothroyd A, Hanin L, Hnath T. 1985a. CUNY Laser Videodisk of Everyday Sentences. New York, NY: Speech and Hearing Sciences Research Center, City University of New York
  • Boothroyd A, Hanin L, Hnath T. 1985b. A Sentence Test of Speech Perception: Reliability, Set Equivalence, and Short-Term Learning. Internal Report RCI 10. New York, NY: Speech and Hearing Sciences Research Center, City University of New York
  • Desai S, Stickney G, Zeng FG. 2008; Auditory-visual speech perception in normal-hearing and cochlear-implant listeners. J Acoust Soc Am 123 (01) 428-440
  • Dorman MF, Liss J, Wang S, Berisha V, Ludwig C, Natale SC. 2016; Experiments on auditory-visual perception of sentences by users of unilateral, bimodal, and bilateral cochlear implants. J Speech Lang Hear Res 59 (06) 1505-1519
  • Folstein MF, Folstein SE, McHugh PR. 1975; “Mini-mental state”: a practical method for grading the cognitive state of patients for the clinician. J Psychiatr Res 12 (03) 189-198
  • Giraud AL, Lee HJ. 2007; Predicting cochlear implant outcome from brain organisation in the deaf. Restor Neurol Neurosci 25 (3–4) 381-390
  • Giraud AL, Truy E, Frackowiak R. 2001; Imaging plasticity in cochlear implant patients. Audiol Neurotol 6 (06) 381-393
  • Goh WD, Pisoni DB, Kirk KI, Remez RE. 2001; Audio-visual perception of sinewave speech in an adult cochlear implant user: a case study. Ear Hear 22 (05) 412
  • Grant KW, Seitz PF. 1998; Measures of auditory–visual integration in nonsense syllables and sentences. J Acoust Soc Am 104 (04) 2438-2450
  • Grant KW, Walden BE, Seitz PF. 1998; Auditory-visual speech recognition by hearing-impaired subjects: consonant recognition, sentence recognition, and auditory-visual integration. J Acoust Soc Am 103: 2677-2690
  • Hay-McCutcheon MJ, Pisoni DB, Kirk KI. 2005; Audiovisual speech perception in elderly cochlear implant recipients. Laryngoscope 115 (10) 1887-1894
  • Kaiser AR, Kirk KI, Lachs L, Pisoni DB. 2003; Talker and lexical effects on audiovisual word recognition by adults with cochlear implants. J Speech Lang Hear Res 46 (02) 390-404
  • Leybaert J, LaSasso CJ. 2010; Cued speech for enhancing speech perception and first language development of children with cochlear implants. Trends Amplif 14 (02) 96-112
  • Moradi S, Lidestam B, Rönnberg J. 2016; Comparison of gated audiovisual speech identification in elderly hearing aid users and elderly normal-hearing individuals: effects of adding visual cues to auditory speech stimuli. Trends Hear 20: 2331216516653355
  • Nilsson M, Soli SD, Sullivan JA. 1994; Development of the Hearing in Noise Test for the measurement of speech reception thresholds in quiet and in noise. J Acoust Soc Am 95 (02) 1085-1099
  • Nittrouer S, Burton LT. 2005; The role of early language experience in the development of speech perception and phonological processing abilities: evidence from 5-year-olds with histories of otitis media with effusion and low socioeconomic status. J Commun Disord 38 (01) 29-63
  • Rabinowitz WM, Eddington DK, Delhorne LA, Cuneo PA. 1992; Relations among different measures of speech reception in subjects using a cochlear implant. J Acoust Soc Am 92 (04) 1869-1881
  • Rouger J, Fraysse B, Deguine O, Barone P. 2008; McGurk effects in cochlear-implanted deaf subjects. Brain Res 1188: 87-99
  • Rouger J, Lagleyre S, Fraysse B, Deneve S, Deguine O, Barone P. 2007; Evidence that cochlear-implanted deaf patients are better multisensory integrators. Proc Natl Acad Sci USA 104 (17) 7295-7300
  • Schreitmüller S, Frenken M, Bentz L, Ortmann M, Walger M, Meister H. 2018; Validating a method to assess lipreading, audiovisual gain, and integration during speech reception with cochlear-implanted and normal-hearing subjects using a talking head. Ear Hear 39 (03) 503-516
  • Sommers MS, Tye-Murray N, Spehar B. 2005; Auditory-visual speech perception and auditory-visual enhancement in normal-hearing younger and older adults. Ear Hear 26 (03) 263-275
  • Spahr AJ, Dorman MF, Litvak LM, Van Wie S, Gifford RH, Loizou PC, Loiselle LM, Oakes T, Cook S. 2012; Development and validation of the AzBio sentence lists. Ear Hear 33 (01) 112
  • Stevenson RA, Sheffield SW, Butera IM, Gifford RH, Wallace MT. 2017; Multisensory integration in cochlear implant recipients. Ear Hear 38 (05) 521-538
  • Strelnikov K, Rouger J, Barone P, Deguine O. 2009; Role of speechreading in audiovisual interactions during the recovery of speech comprehension in deaf adults with cochlear implants. Scand J Psychol 50 (05) 437-444
  • Strelnikov K, Rouger J, Demonet JF, Lagleyre S, Fraysse B, Deguine O, Barone P. 2013; Visual activity predicts auditory recovery from deafness after adult cochlear implantation. Brain 136 (12) 3682-3695
  • Sumby WH, Pollack I. 1954; Visual contributions to speech intelligibility in noise. J Acoust Soc Am 26: 212-215
  • Tremblay C, Champoux F, Lepore F, Théoret H. 2010; Audiovisual fusion and cochlear implant proficiency. Restor Neurol Neurosci 28 (02) 283-291
  • Tye-Murray N, Sommers MS, Spehar B. 2007; Audiovisual integration and lipreading abilities of older adults with normal and impaired hearing. Ear Hear 28 (05) 656-668
  • Tye-Murray N, Sommers M, Spehar B, Myerson J, Hale S. 2010; Aging, audiovisual integration, and the principle of inverse effectiveness. Ear Hear 31 (05) 636
  • Wilkinson GS, Robertson GJ. 2006. Wide Range Achievement Test (WRAT4). Lutz, FL: Psychological Assessment Resources