A Close Link Between "Hearing" and "Speaking"

Investigating the Mechanisms of Speech Perception

Spoken language is a fundamental means of human communication. Understanding spoken sentences is normally effortless for us, yet the brain mechanisms underlying speech perception remain unclear. We are trying to reveal these mechanisms by combining diverse methods such as speech motor analysis, psychophysics, and functional brain imaging. Our findings suggest that, in terms of brain function, a listener refers to his or her own articulatory gestures (e.g., movements of the tongue and lips) when perceiving speech signals.

Future Implications

As automatic speech recognition technologies advance, the remaining challenges have become clearer: machines are far less robust to speaker variability and environmental factors than human listeners are. A better understanding of the human mechanisms of speech perception may provide clues for improving these technologies.
This knowledge will also contribute to other areas of research, such as foreign language acquisition and communication disorders (e.g., aphasia).
Beyond these phonetic aspects, our research interests also include the brain mechanisms underlying the understanding of emotion and nuance in speech signals.

Auditory Illusion and Brain

Hearing a word repeated over and over causes an auditory illusion: a string of repeated "banana" may seem to transform into other words, such as "banana → hana → nappa → banana." Using functional magnetic resonance imaging (fMRI), we analyzed brain activity at the moments of these perceptual changes.
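
As a rough illustration of what analyzing brain activity "at the timing of the perceptual changes" can mean in practice, the sketch below treats listener-reported transformation moments as events in a general linear model, using the nilearn library. The file name, event onsets, and scan parameters are hypothetical, and this is a simplified outline rather than our actual analysis pipeline.

    import pandas as pd
    from nilearn.glm.first_level import FirstLevelModel

    # Hypothetical input: one fMRI run and the moments (in seconds) at which
    # the listener reported that the repeated word changed into another word.
    fmri_run = "sub-01_task-repetition_bold.nii.gz"
    events = pd.DataFrame({
        "onset": [12.4, 31.0, 55.7, 78.2],      # reported transformation times
        "duration": [0.0, 0.0, 0.0, 0.0],       # modeled as brief impulses
        "trial_type": ["transformation"] * 4,
    })

    # Fit an event-related model and extract a map of activity
    # time-locked to the perceptual changes.
    model = FirstLevelModel(t_r=2.0, hrf_model="spm")
    model.fit(fmri_run, events=events)
    z_map = model.compute_contrast("transformation", output_type="z_score")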

The activated areas included Broca’s area and the insular cortex in the left hemisphere. These areas are associated with speech production and speech motor control. Activation in Broca’s area was stronger in listeners who reported more frequent word-form transformations.

These findings suggest that the brain functions for speech production are also involved in speech perception.

Vowel Perception and Production

We asked participants to pronounce vowel sequences and analyzed the relationship between their articulatory gestures and the resulting acoustic signals. The same amount of articulatory movement can produce different amounts of acoustic change depending on the phonetic context.
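
To make this nonlinearity concrete, here is a purely illustrative numerical sketch (not our measurement data or articulatory model): a single made-up articulatory parameter is mapped to the first formant frequency (F1) through a sigmoid curve, so equally sized articulatory steps produce very different acoustic changes depending on where they occur.

    import numpy as np

    # Hypothetical, simplified articulation-to-acoustics mapping: a single
    # articulatory parameter x (say, a normalized tongue-body position, 0..1)
    # is mapped to a first-formant frequency F1 via a sigmoid curve.
    def f1_hz(x):
        return 300.0 + 500.0 / (1.0 + np.exp(-10.0 * (x - 0.5)))

    positions = np.linspace(0.0, 1.0, 6)   # equally spaced articulatory positions
    f1_change = np.diff(f1_hz(positions))  # acoustic change per 0.2 articulatory step

    for i, d in enumerate(f1_change):
        print(f"step {i + 1}: same 0.2 articulatory change -> F1 change of {d:6.1f} Hz")
    # Near the ends of the range the step changes F1 by only about 20 Hz;
    # near the middle the same-sized step changes F1 by more than 200 Hz.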

Next, we tested the participants’ vowel perception. They had difficulty discriminating vowel pairs that corresponded to small articulatory movements.
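
Discrimination difficulty of this kind is commonly quantified with the signal-detection sensitivity index d'. The sketch below shows the standard calculation; the hit and false-alarm rates are invented for illustration and are not our experimental data.

    from statistics import NormalDist

    def d_prime(hit_rate, false_alarm_rate):
        """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate)."""
        z = NormalDist().inv_cdf
        return z(hit_rate) - z(false_alarm_rate)

    # Invented illustrative rates for two vowel pairs: one tied to a large
    # articulatory movement, one tied to a small articulatory movement.
    print(d_prime(0.90, 0.10))   # large movement: d' ~ 2.56 (easy to discriminate)
    print(d_prime(0.60, 0.40))   # small movement: d' ~ 0.51 (hard to discriminate)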

These findings suggest that vowel perception reflects the characteristics of articulatory gestures.
