Research in our lab focuses on the neurobiological basis of speech comprehension. Common themes include the interplay between acoustic and cognitive processing, how listeners cope with reduced sensory detail (for example, when listening in a noisy restaurant, or as a result of hearing impairment), and the linguistic and sensory cues that help listeners predict incoming signals.

Below are some of our ongoing research projects. For all of these, we use behavioral measures and human brain imaging (including structural and functional MRI, optical imaging, and MEG/EEG).

Listening effort and the neural consequences of acoustic challenge

We frequently listen to speech that is acoustically degraded by background noise, foreign accents, or hearing loss. In these situations, our brains must make sense of an acoustic signal that is less detailed, and thus less certain. How do our brains cope with this kind of degraded sensory input? What are the long-term consequences of hearing impairment for neural organization?

One way we have studied the effect of hearing ability on the brain is to examine the structure and function of auditory brain regions in listeners over the age of 60, who frequently have some degree of hearing loss. We find that individual differences in hearing ability are correlated with both the pattern of brain activity during speech comprehension and the volume of gray matter in auditory cortex. These results suggest that hearing impairment is associated with both functional and structural brain changes, which may in turn influence other aspects of language processing.

Additional studies are investigating the degree to which the acoustic clarity of speech affects our ability to remember what we have heard. Our prediction is that when speech is more difficult to hear, understanding it requires greater reliance on cognitive processes, which in turn impairs memory. However, this challenge may be reduced for speech that is highly predictable (as in a short story) or in listeners with greater cognitive resources.

Age-related changes in speech comprehension

Figure. Top: Brain regions that show an increased response to syntactically complex spoken sentences in both young and older adults. Bottom: Regions in which this syntax-related activity differs as a function of age; note that older adults show increased activity in numerous regions of frontal and prefrontal cortex outside the core syntax network. (From Peelle et al., 2010, Cerebral Cortex)

Our brains change substantially over our lifetimes. How do we maintain high levels of functioning despite these changes? One of our research interests is the neural systems that support successful aging in the context of spoken language. Speech comprehension is a particularly interesting case because it relies on both sensory and cognitive systems, both of which change with age.

We find that when listening to spoken sentences, older adults rely on many of the same brain regions as young adults. However, older adults also tend to recruit additional regions not used by young adults, especially in frontal cortex. One goal of our ongoing work is to better specify the additional cognitive processes involved, and to determine whether they support acoustic processing, linguistic processing, or some combination of the two.