Ruth Litovsky, PhD – Slide of the Week


Title: Cortical speech processing in post-lingually deaf adult cochlear implant users, as revealed by fNIRS

Legend: Results for the ROIs showing group mean differences. (a) ROIs (blue ellipses) with higher activation levels in CI users than in NH listeners when listening to auditory speech stimuli. (b) Comparison of activation levels in the three ROIs between CI users (blue) and NH listeners (red). Error bars are SEM. (c) Pearson correlations between the activation levels in the three ROIs of CI users and their speech test scores. Brain activation levels (concentration changes of HbO) are in units of mMol.

Citation: Zhou X, Seghouane AK, Shah A, Innes-Brown H, Cross W, Litovsky R, McKay CM. (2018). Cortical Speech Processing in Postlingually Deaf Adult Cochlear Implant Users, as Revealed by Functional Near-Infrared Spectroscopy. Trends in Hearing, 22:2331216518786850.

Abstract: Study Motivation – To investigate the feasibility of using functional near-infrared spectroscopy (fNIRS) to image cortical activity in cochlear implant (CI) users and normal-hearing (NH) adults, using either visual speech or auditory speech. Results – Responses to visual speech in the left superior temporal gyrus/sulcus of CI users were negatively correlated with auditory speech understanding. This result suggests that increased cross-modal activity in the auditory cortex is predictive of poor auditory speech understanding. In three other regions, CI users showed significantly greater mean activation levels in response to auditory speech than NH listeners (Fig. 7). These cortical activation levels were significantly negatively correlated with CI users’ auditory speech understanding. Conclusion – fNIRS successfully revealed activation patterns in CI users associated with their auditory speech understanding.
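
To make the correlation analysis described above and shown in panel (c) concrete, the sketch below relates per-subject ROI activation (HbO concentration change) to speech test scores with a Pearson correlation. It is a minimal illustration only: the array values, variable names, and the NumPy/SciPy workflow are hypothetical assumptions, not the authors' actual data or analysis pipeline.

# Minimal, hypothetical sketch: Pearson correlation between ROI activation
# (HbO concentration change) and auditory speech understanding scores.
# All values below are placeholders, not data from the study.
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-subject mean HbO change for one ROI (mMol)
roi_hbo_change = np.array([0.012, 0.025, 0.018, 0.031, 0.022, 0.027, 0.015])

# Hypothetical speech test scores (percent correct) for the same subjects
speech_scores = np.array([82.0, 55.0, 74.0, 48.0, 63.0, 51.0, 79.0])

# A negative r would mirror the reported pattern: higher activation
# accompanying poorer auditory speech understanding in CI users.
r, p = pearsonr(roi_hbo_change, speech_scores)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")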

About the Lab: Research in the Litovsky lab focuses on the ability of children and adults to hear in complex auditory environments (e.g., classrooms, restaurants, playgrounds, and “cocktail parties”). We are typically faced with the challenge of interpreting sounds as they reach the ears, learning to ignore echoes and other irrelevant, distracting signals. The brain has specialized circuits that compute sound location and separate important sounds from background noise. We study the contribution of binaural hearing and the limitations experienced by people with hearing loss. In particular, we focus on a unique population of people who are deaf and use cochlear implants. In addition to studying spatial hearing, such as sound localization skills and speech understanding in noise, we are interested in cognitive development and its implications for auditory processing. Another aspect of our research focuses on the effect that challenging listening situations have on listening effort and on the amount of cognitive load exerted during psychophysical testing. Finally, our reverse engineering approaches focus on possible ways in which bilateral cochlear implants can be improved by synchronizing the devices across the ears and by using novel strategies for stimulating electrodes in the cochlea to restore near-normal binaural hearing.
