Multisensory processing

The brain is truly clever: not only does it associate and transfer information crossmodally, from one sense to another, and adapt, through crossmodal plasticity, to sensory impairments (see our blog for the crossmodal correspondences between the senses and for Crossmodal brain plasticity and empowering of sensory abilities), it also integrates and processes information from multiple senses.


I have invited Dr Monica Gori, Head of the Unit for Visually Impaired People at the Italian Institute of Technology, to write this post on multisensory processing. Monica Gori has published over 130 scientific articles and book chapters, as well as numerous conference abstracts. She has been the Principal Investigator on several research projects funded by the European Research Council, including the ERC StG MySpace project; she has developed early intervention systems such as ABBI and iReach (some of which have been patented); and she has been awarded numerous prizes for her work (e.g., in the SmartCup and the TR35 Italian Innovation Award).

Monica also creates multisensory art.

Multisensory painting by Monica Gori, one of her highly textured works. (Choline and enamel on canvas.)


An organism’s ability to relate to the external world depends on its ability to correctly process and interpret information from the environment. Our senses are our window onto that world: the means by which we interact with it.


For example, when we are on the street, sight helps us understand where to go or where objects of interest are located, while hearing tells us, for instance, whether a car is approaching or whether there are people around us. Tactile information tells us how firmly to grip the handle of a shop door, or how firmly to shake the hand of someone we haven’t seen in a long time. Smell lets us detect whether there is a restaurant nearby. These apparently simple operations are the result of complex sensory processing: when a signal from the external world reaches our sensory receptors, our brain constructs a perception of the event and produces a response suited to the situation.


However, even though we have many sensory modalities, we have only one brain, which must integrate all of these sensory signals. Multisensory integration is a fundamental process that makes perception more than the sum of its parts, improving reaction time, precision, and response accuracy. Imagine if these senses were uncoordinated. Have you ever experienced a delay on television, where the voice is out of sync with the picture? It’s pretty annoying. In that case, the brain fails to synchronize the two signals because they are too far apart in time. The same happens during a thunderstorm, when the lightning is seen before the thunder is heard: since light travels faster than sound, the two signals are perceived as separate events. Typically, though, this doesn’t happen. Have you ever wondered why?


Each sensory system has its own timing in signal analysis. Visual and auditory signals, for instance, follow different pathways: vision through receptors in the eye and hearing through receptors in the ear. Our brain has learned to synchronize these two signals in space and time. Scientific results show that two signals from different modalities can be perceived as synchronous when they arrive within approximately 100 milliseconds of each other. When the interval is longer, as on TV or with thunder and lightning, they are perceived as separate. Fortunately, visual and auditory events are usually associated, and their delays fall within this window, allowing them to be perceived as a single event. Otherwise, imagine the confusion!
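As a rough illustration (a minimal sketch assuming a hard 100-millisecond cutoff; in reality the binding window is graded and varies across people and tasks), a simultaneity judgement could be simulated like this:

```python
# Toy simultaneity judgement: are a visual and an auditory signal "bound"
# into one event or perceived as separate? This assumes a hard ~100 ms
# temporal binding window, which is a deliberate simplification.

TEMPORAL_WINDOW_MS = 100  # assumed width of the audio-visual binding window

def perceived_as_one_event(visual_onset_ms, auditory_onset_ms):
    """Return True if the two onsets fall within the binding window."""
    return abs(visual_onset_ms - auditory_onset_ms) <= TEMPORAL_WINDOW_MS

# Lightning is seen almost instantly; thunder from ~1 km away arrives ~3 s later.
print(perceived_as_one_event(0, 3000))  # False: seen and heard as separate events
# A nearby hand clap: the sound lags vision by only a few milliseconds.
print(perceived_as_one_event(0, 5))     # True: fused into a single event
```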


Our brain has also learned to integrate spatially discordant information, but only within a limited window. Within that window, vision can capture sound, as in the ventriloquist effect, where the voice appears to come from the puppet’s mouth because vision strongly attracts the perceived location of the sound. If the spatial limit is exceeded, the two signals are instead perceived as coming from separate sources.


All these fascinating phenomena are explained by mathematical models that researchers have developed to better understand how our brains work. Over the past 20 years, I have studied the complexity of analyzing these signals, the great perceptual richness this affords, how the signals are integrated across the senses, and how multisensory integration develops in children.
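One widely used model in this literature is reliability-weighted (maximum-likelihood) cue combination, in which each sense’s estimate is weighted by its reliability, the inverse of its variance. The sketch below, with made-up numbers, illustrates that general idea rather than any specific model from this research.

```python
# Reliability-weighted (maximum-likelihood) cue combination: a standard
# textbook model of multisensory integration, shown here only as an
# illustration with invented numbers.

def integrate(est_visual, var_visual, est_auditory, var_auditory):
    """Combine two noisy location estimates (e.g., in degrees).

    Each cue is weighted by its reliability (1 / variance); the combined
    estimate is more precise than either cue alone.
    """
    w_v = (1 / var_visual) / (1 / var_visual + 1 / var_auditory)
    w_a = 1 - w_v
    combined_estimate = w_v * est_visual + w_a * est_auditory
    combined_variance = 1 / (1 / var_visual + 1 / var_auditory)
    return combined_estimate, combined_variance

# Ventriloquist-style example: vision says the voice is at 0 degrees (the
# puppet's mouth) and is very reliable; hearing says +10 degrees but is noisy.
print(integrate(est_visual=0.0, var_visual=1.0,
                est_auditory=10.0, var_auditory=16.0))
# -> roughly (0.59, 0.94): vision carries ~94% of the weight, so the combined
#    estimate is pulled strongly toward the visual location.
```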


Our research has led to an understanding of how children and adults learn to integrate information from different sensory modalities in space and time, and of what happens when one sensory modality is missing. There are about 300 million visually impaired people worldwide, with approximately 15 million being children under the age of 15. Visual impairment in children hinders the development of many skills, such as movement, environmental perception, and play. Studying these perceptual abilities and intervening early can make a difference, and it is a rapidly expanding field.


Monica has also very kindly suggested nine activities (WeDraw) for us. See our blog for more Activities; especially 34-36.


Some suggestions for further listening and watching:

Multi-Modal Perception – The basics

Multi Sensory Perception

Multisensory development and technology for children and adults

The Blind Kitchen Eating in the Dark

Try this bizarre illusion

Your sensory health matters. Here’s why
