J Am Acad Audiol 2021; 32(08): 521-527
DOI: 10.1055/s-0041-1731699
Research Article

Effects of the Configuration of Hearing Loss on Consonant Perception between Simulated Bimodal and Electric Acoustic Stimulation Hearing

Yang-Soo Yoon
1   Department of Communication Sciences and Disorders, Baylor University, Waco, Texas

George Whitaker
2   Division of Otolaryngology, Baylor Scott & White Medical Center, Temple, Texas

Yune S. Lee
3   Department of Speech, Language, and Hearing, School of Behavioral and Brain Sciences, Callier Center for Communication Disorders, The University of Texas at Dallas, Richardson, Texas

Abstract

Background Cochlear implant technology allows acoustic and electric stimulation to be combined across ears (bimodal) or within the same ear (electric acoustic stimulation [EAS]). The mechanisms used to integrate speech acoustics may differ between bimodal and EAS hearing, and the configuration of hearing loss may be an important factor in that integration. It is therefore important to differentiate the effects of different configurations of hearing loss on bimodal or EAS benefit in speech perception (the difference in performance between combined acoustic and electric stimulation and the better stimulation alone).

Purpose Using acoustic simulation, we determined how consonant recognition was affected by different configurations of hearing loss in bimodal and EAS hearing.

Research Design A mixed design was used with one between-subject variable (simulated bimodal group vs. simulated EAS group) and one within-subject variable (acoustic stimulation alone, electric stimulation alone, and combined acoustic and electric stimulations).

Study Sample Twenty adult subjects (10 for each group) with normal hearing were recruited.

Data Collection and Analysis Consonant perception was measured unilaterally or bilaterally in quiet. For the acoustic stimulation, four simulations of hearing loss were created by band-pass filtering consonants with a fixed lower cutoff frequency of 100 Hz and one of four upper cutoff frequencies: 250, 500, 750, or 1,000 Hz. For the electric stimulation, an eight-channel noise vocoder was used to generate a typical spectral mismatch, with fixed input (200–7,000 Hz) and output (1,000–7,000 Hz) frequency ranges. The effects of simulated hearing loss on consonant recognition were compared between the two groups.
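The residual-hearing simulation described above (band-pass filtering between a fixed 100 Hz lower edge and one of four upper edges) can be sketched in code. This is a minimal illustrative example, not the authors' actual processing chain; the sampling rate, the single-biquad filter design, and the 100–500 Hz example band are assumptions for demonstration only.

```python
import math

def biquad_bandpass(x, fs, low, high):
    """Filter samples x through a single band-pass biquad (RBJ-style
    coefficients) whose passband spans roughly [low, high] Hz."""
    f0 = math.sqrt(low * high)          # geometric-mean center frequency
    q = f0 / (high - low)               # Q derived from passband width
    w0 = 2.0 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2.0 * q)
    a0 = 1.0 + alpha                    # normalize all coefficients by a0
    b = [alpha / a0, 0.0, -alpha / a0]
    a = [1.0, -2.0 * math.cos(w0) / a0, (1.0 - alpha) / a0]
    y, xprev, yprev = [], [0.0, 0.0], [0.0, 0.0]
    for s in x:
        out = (b[0] * s + b[1] * xprev[0] + b[2] * xprev[1]
               - a[1] * yprev[0] - a[2] * yprev[1])
        xprev = [s, xprev[0]]
        yprev = [out, yprev[0]]
        y.append(out)
    return y

def rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))

fs = 16000
t = [n / fs for n in range(fs)]  # one second of samples
# A 300 Hz tone falls inside the simulated residual-hearing band
# (100-500 Hz); a 2 kHz tone falls well outside it.
in_band = biquad_bandpass([math.sin(2 * math.pi * 300 * ti) for ti in t],
                          fs, 100, 500)
out_band = biquad_bandpass([math.sin(2 * math.pi * 2000 * ti) for ti in t],
                           fs, 100, 500)
print(rms(in_band) > 2 * rms(out_band))  # prints True
```

In the study's design, changing the upper cutoff (250, 500, 750, or 1,000 Hz) widens or narrows the low-frequency band that survives filtering, which is what varies the simulated configuration of hearing loss.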

Results Significant bimodal and EAS benefits occurred regardless of the configurations of hearing loss and hearing technology (bimodal vs. EAS). Place information was better transmitted in EAS hearing than in bimodal hearing.

Conclusion These results suggest that configurations of hearing loss are not a significant factor for integrating consonant information between acoustic and electric stimulations. The results also suggest that mechanisms used to integrate consonant information may be similar between bimodal and EAS hearing.

Disclaimer

Any mention of a product, service, or procedure in the Journal of the American Academy of Audiology does not constitute an endorsement of the product, service, or procedure by the American Academy of Audiology.




Publication History

Received: 20 January 2021

Accepted: 20 May 2021

Article published online: 29 December 2021

© 2021. American Academy of Audiology. This article is published by Thieme.

Thieme Medical Publishers, Inc.
333 Seventh Avenue, 18th Floor, New York, NY 10001, USA

 