If your ears can't hear what someone is saying, try listening with your skin.
Sensations on the skin can help people understand speech, according to a study in the journal Nature.
The study builds on decades of research showing that the brain often uses visual information to augment hearing. That's why people in a noisy room are more likely to understand someone if they can see the speaker's lips.
"From our brain's point of view, we can hear with our eyes," says Bryan Gick, a professor of phonetics at the University of British Columbia in Vancouver. But he and his colleague Donald Derrick wanted to see whether hearing could also be influenced by our sense of touch.
'Pa' Versus 'Ba' Versus 'Da'
So they tested people's ability to distinguish sounds like "pa," which require a burst of air from the mouth, from sounds like "ba," which don't.
Volunteers listened to a recording of those sounds in an environment with a lot of background noise. Sometimes, the sounds were accompanied by a tiny puff of air against their hand or neck.
It turned out people were better able to identify "pa" sounds when they came with a puff of air. Moreover, when sounds like "da" were accompanied by a puff, people got confused and often thought they'd actually heard a "pa" sound.
That's probably because the brain knows which sounds are supposed to come with a puff and incorporates this into our perception of what we've heard, Gick says. The fact that people did this automatically, he says, suggests that our brains are wired from birth to incorporate information from other senses when we listen.
"From my point of view, we're whole-body perceiving machines," Gick says. "We just take all of the information that comes at us in our environment and merge it into a percept of something that happened in the world."
The Question Is Where Integration Happens
That view is becoming widely accepted among brain scientists, says David Ostry, a professor of psychology at McGill University in Montreal.
Ostry says these days the big scientific question isn't whether our brains routinely integrate sensory information, but how.
"It's up for grabs where within the brain this kind of integration is happening," he says.
One possibility is that it happens in areas of the brain that process sensory information, Ostry says. But he notes it's also possible that integration takes place in the motor cortex, which controls our muscles.
Researchers say what they learn about how other senses influence hearing could help people with hearing loss, as well as people such as commercial airline pilots, who often have to decipher speech in a noisy environment.