Making Sense of Sounds

The human brain learns both sights and sounds through the same two-step cognitive process, suggests a new study by researchers from Georgetown University School of Medicine’s Department of Neuroscience. “We have long tried to make sense of senses, studying how the brain represents our multisensory world,” says the study’s senior investigator, Maximilian Riesenhuber, PhD.

In 2007, the research team, including Georgetown neuroscientists Xiong Jiang, PhD, and Josef P. Rauschecker, PhD, as well as graduate student Mark A. Chevillet, was the first to describe this two-step model in human learning of visual categories. They found that neurons in one area of the brain learn the representation of the stimuli, while another area categorizes that input to ascribe meaning to it: for example, first seeing a car without a roof, then analyzing that stimulus to place it in the category of “convertible.”
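For readers who think in code, the two-step model has a loose computational analogy: one stage learns a representation of the stimuli without any category labels, and a second stage learns category boundaries on top of that representation. The sketch below uses PCA and logistic regression purely as stand-ins; it is an illustration of the idea, not the researchers’ actual model, and all data in it are synthetic.

```python
# Loose analogy to the two-step model (illustrative only, synthetic data):
# stage 1 learns a representation of the stimuli without category labels;
# stage 2 learns to categorize that representation.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
stimuli = rng.normal(size=(200, 50))      # 200 fake stimuli, 50 features each
labels = (stimuli[:, 0] > 0).astype(int)  # two arbitrary categories

# Step 1: learn a compact representation of the stimuli (no labels used).
representation = PCA(n_components=10).fit_transform(stimuli)

# Step 2: learn category boundaries on top of that representation.
classifier = LogisticRegression().fit(representation, labels)
print("training accuracy:", classifier.score(representation, labels))
```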

To test whether the brain uses a similar two-step process for sound, the researchers trained 16 study participants to categorize monkey communication calls: real sounds that carry meaning for monkeys but are meaningless to humans. The investigators divided the sounds into two categories based on prototypes, the so-called “coos” and “harmonic arches.” Using an auditory morphing system, they created thousands of monkey call combinations from the prototypes, including some very similar calls that required the participants to make fine distinctions. Learning to categorize the novel sounds correctly took about six hours.
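The article does not specify how the morphing system worked, but the idea of a morph continuum can be conveyed with a crude sketch: blend two prototype waveforms at varying mix ratios, so that stimuli near the middle of the continuum are the hardest to tell apart. Real auditory morphing interpolates spectral and temporal structure rather than raw samples, so the linear blend, the stand-in prototypes, and the 10-step continuum below are all hypothetical simplifications.

```python
# Crude sketch of a morph continuum between two prototype calls.
# Real auditory morphing interpolates spectral/temporal structure, not raw
# samples; this linear blend only conveys the idea of intermediate stimuli.
import numpy as np

def morph(proto_a: np.ndarray, proto_b: np.ndarray, mix: float) -> np.ndarray:
    """Blend two equal-length waveforms; mix=0 gives proto_a, mix=1 gives proto_b."""
    return (1.0 - mix) * proto_a + mix * proto_b

sample_rate = 22050
t = np.linspace(0, 0.5, int(0.5 * sample_rate), endpoint=False)
coo = np.sin(2 * np.pi * 600 * t)                 # stand-in "coo" prototype
arch = np.sin(2 * np.pi * (400 + 1200 * t) * t)   # stand-in "harmonic arch"

# Ten steps from pure coo to pure arch; middle steps force fine distinctions.
continuum = [morph(coo, arch, m) for m in np.linspace(0, 1, 10)]
```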

fMRI data were obtained from the participants before and after training to look for changes in neuronal tuning produced by the categorization training. Advanced fMRI techniques, including rapid adaptation (fMRI-RA) and multi-voxel pattern analysis (MVPA), were used along with conventional fMRI and functional connectivity analyses. In this way, the researchers were able to see two distinct sets of changes similar to those previously found in the vision experiments: a representation of the monkey calls in the left auditory cortex, and category selectivity for the different call types in the lateral prefrontal cortex.
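In broad strokes, MVPA asks whether a classifier trained on the pattern of activity across many voxels can decode which category of stimulus a person was hearing; above-chance cross-validated accuracy implies the region carries category information. The sketch below shows that logic on synthetic “voxel” data; real pipelines add preprocessing, cross-validation across scanner runs, and permutation tests, none of which are shown here.

```python
# Minimal MVPA sketch: train a classifier on voxel activity patterns and test
# whether it can decode stimulus category. Data are synthetic; a weak signal
# is planted in the first 20 "voxels" so decoding beats chance.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
n_trials, n_voxels = 120, 300
categories = rng.integers(0, 2, size=n_trials)  # e.g., coo vs. harmonic arch
patterns = rng.normal(size=(n_trials, n_voxels))
patterns[categories == 1, :20] += 0.5           # weak category signal

accuracy = cross_val_score(LinearSVC(), patterns, categories, cv=5).mean()
print(f"decoding accuracy: {accuracy:.2f} (chance = 0.50)")
```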

Rauschecker says these findings could one day help scientists find new ways to compensate for sensory deficits. “Knowing how senses learn the world may help us devise workarounds in our very plastic brains. If a person can’t process one sensory modality, say vision, due to blindness, there could be substitution devices that allow visual input to be transformed into sounds. So one disabled sense would be processed by other sensory brain centers.”