If you’ve ever clenched up at the sound of nails on a chalkboard, or felt a pleasant chill when listening to an opera soprano, then you have an intuitive sense of the way our brains sometimes mix information from our senses. For the latest issue of Nautilus magazine I wrote a story about a woman whose brain mixes more than most, allowing her to feel many types of sounds on her skin.
Over the past decade or so, neuroscientists have revamped their view of how the brain processes sensory information. According to the traditional model, the cortex, or outer layers of the brain, processes only one sense at a time. For example, the primary visual cortex at the back of the head was thought to process only input from the eyes, while the auditory cortex above the ears dealt with information from the ears and the somatosensory cortex near the top of the head took in signals from the skin. But a growing number of studies have found that these cortical areas actually integrate information from many senses at once.
One of the most fascinating examples of this line of work, just published in Psychological Science, took advantage of a technology called transcranial direct current stimulation, or tDCS. This tool essentially gives researchers a safe, non-invasive way to activate specific parts of the human brain. Pretty wild, right? Here’s how it works. Researchers place two electrodes at various positions on a volunteer’s scalp. A small electric current passes between the electrodes, stimulating the neurons underneath.
In the new study (cleverly named “Feeling Better”), neuroscientist Jeffrey Yau of Johns Hopkins University used tDCS to stimulate the brain as volunteers performed two different tasks related to touch perception. The first task is similar to reading Braille: blindfolded volunteers placed their fingers over gratings of bars of varied widths and spacing. The closer together the bars, the harder it is to tell whether there is one bar or two. The smallest distance at which a volunteer can reliably make this call is known as the “spatial acuity.”
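The paper doesn’t describe its procedure in code, but threshold tasks like this one are often run as an adaptive staircase, narrowing the bar spacing after correct answers and widening it after errors until performance hovers near threshold. Here’s a minimal sketch in Python, with a simulated observer standing in for a participant; the starting spacing, step size, and observer model are all invented for illustration, not taken from the study.

```python
import random

def simulated_observer(spacing_mm, threshold_mm=1.5):
    """Toy observer (hypothetical stand-in for a participant):
    answers "one bar or two?" correctly more often as spacing grows."""
    p_correct = 0.5 + 0.5 * min(spacing_mm / (2 * threshold_mm), 1.0)
    return random.random() < p_correct

def staircase(start_mm=4.0, step_mm=0.25, trials=60):
    """1-up/2-down staircase: two correct answers in a row shrink the
    spacing (harder), one error widens it (easier). This rule converges
    near the spacing yielding ~71% correct answers."""
    spacing, streak, reversals, last_direction = start_mm, 0, [], None
    for _ in range(trials):
        if simulated_observer(spacing):
            streak += 1
            if streak < 2:
                continue                      # wait for a second correct answer
            streak, direction = 0, "down"
            spacing = max(step_mm, spacing - step_mm)
        else:
            streak, direction = 0, "up"
            spacing += step_mm
        if last_direction and direction != last_direction:
            reversals.append(spacing)         # record direction changes
        last_direction = direction
    # Estimate acuity as the mean spacing over the last few reversals.
    tail = reversals[-6:] if reversals else [spacing]
    return sum(tail) / len(tail)

print(round(staircase(), 2))  # estimated spatial acuity, in mm
```

The appeal of a staircase over testing fixed spacings is efficiency: most trials land near the observer’s threshold rather than being wasted on trivially easy or impossible spacings.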
The second task measures the perception of vibration frequency, similar to the different kinds of rumblings you might feel while waiting on a subway platform. On a given trial, volunteers used their index fingers to feel two vibrations, delivered back to back by a metal probe, and then judged which one was higher in frequency.
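A back-to-back comparison like this is a classic two-interval forced choice, and it can be sketched as a simulation. Everything below is an assumption for illustration only: the Gaussian noise model, the hypothetical just-noticeable difference `jnd_hz`, and the base frequency are invented, not the study’s actual parameters.

```python
import random

def judge_higher_frequency(f1_hz, f2_hz, jnd_hz=3.0):
    """Toy two-interval forced choice: a noisy observer feels two
    vibrations and reports which interval felt higher in frequency.
    jnd_hz (hypothetical) controls how noisy each felt frequency is."""
    percept1 = random.gauss(f1_hz, jnd_hz)
    percept2 = random.gauss(f2_hz, jnd_hz)
    return 1 if percept1 > percept2 else 2

def percent_correct(f_standard=20.0, delta_hz=2.0, trials=1000):
    """Fraction of trials where the observer picks the truly
    higher-frequency vibration, with interval order randomized."""
    correct = 0
    for _ in range(trials):
        order = random.choice([(f_standard, f_standard + delta_hz),
                               (f_standard + delta_hz, f_standard)])
        truth = 1 if order[0] > order[1] else 2
        correct += judge_higher_frequency(*order) == truth
    return correct / trials

# Bigger frequency gaps are easier to tell apart:
for d in (1.0, 4.0, 16.0):
    print(d, round(percent_correct(delta_hz=d), 2))
```

Plotting percent correct against the frequency gap traces out a psychometric curve, which is how a discrimination threshold like the one Yau measured is typically read off.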
Yau ushered participants through each of these tasks before and after stimulating their brains. He found that activating volunteers’ primary visual cortex improved their tactile acuity, whereas stimulating their primary auditory cortex improved their ability to discriminate between different tactile frequencies.
What does this mean?
These findings make sense, Yau says, if you reframe the traditional view of how the cortex is organized. As I mentioned, the primary visual cortex has typically been thought of as the region that processes input from the eyes. But what if it were instead a region that processes information about shape, no matter which organ that information comes from? Most of the time, shape information comes from the eyes, but sometimes, as in this experiment, it can come from touch. Similarly, the primary auditory cortex might not be tailored for interpreting sounds, per se, but rather for frequency information of any kind, including but not limited to sounds.
Yau speculates that we should be thinking differently about the other senses, too. The somatosensory cortex might process skin input, sure, but also other information related to keeping track of our body in physical space.
“Within the last six or seven years, so much evidence has emerged that shows that early sensory cortex is not modality specific,” Yau says. Nevertheless, because subfields have built up around particular senses, Yau says it will probably take a while before the traditional theory of uni-sensory processing is dethroned. “That’s the idea that is always pushed in the textbooks. I think it’s hard to fight that dogma.”