Multisensory processing

The brain is truly clever: not only does it associate and transfer information crossmodally, from one sense to another, and adapt, through crossmodal plasticity, to sensory impairments (see our blog for the crossmodal correspondences between the senses and Crossmodal brain plasticity and empowering of sensory abilities), it also integrates and processes information from multiple senses.


I have invited Dr Monica Gori, Head of the Unit for Visually Impaired People at the Italian Institute of Technology, to write this post on multisensory processing. Monica Gori has published over 130 scientific articles and book chapters as well as numerous conference abstracts. She has been the Principal Investigator on several research projects funded by the European Research Council, including the ERC StG MySpace project; she has developed early intervention systems such as ABBI and iReach, some of which have been patented; and she has been awarded numerous prizes for her work (e.g., in the SmartCup and the TR35 Italian Innovation Award).

Image of one of Monica's highly textured paintings.

Monica also creates multisensory art.

Multisensory painting by Monica Gori. (Choline and enamel on canvas.)


An organism’s ability to relate to the external world depends on its ability to correctly process and interpret information from the environment. Our senses are our window to the world: the means by which we interact with it.


For example, when we are on the street, sight helps us understand where to go or where objects of interest are located, but hearing also helps us understand things, like whether a car is approaching or if there are people around us. Tactile information helps us understand how firmly we need to grip a handle to open a shop door, or how firmly to shake hands with someone we haven’t seen in a long time. Smell allows us to detect if there is a restaurant nearby. These seemingly simple operations result from complex sensory processing: when a signal from the external world reaches our sensory receptors, our brain constructs a perception of the event and produces a response suited to the situation.


However, even though we have many sensory modalities, we only have one brain, and it must integrate these sensory signals. Multisensory integration is a fundamental process that makes perception more than the sum of its parts, improving reaction time, precision, and response accuracy. Imagine if these senses were uncoordinated. Have you ever experienced a delay on television, where the voice is out of sync with the picture? It’s pretty annoying. In that case, the brain fails to synchronize the two signals because they are too far apart in time. The same happens during a thunderstorm when lightning is seen before the thunder is heard: since light travels faster than sound, the two signals are perceived as separate. Typically, this doesn’t happen. Have you ever wondered why?


Each sensory system has its own timing in signal analysis. Visual and auditory signals, for instance, follow different pathways: vision through receptors in the eye and hearing through receptors in the ear. Our brain has learned to synchronize these two signals in space and time. Scientific results show that two signals from different modalities can be perceived as synchronous when they arrive within approximately 100 milliseconds of each other. They are perceived as separate when the interval is longer, as on TV or with thunder and lightning. Fortunately, visual and auditory events are usually associated, and their delays fall within this window, allowing them to be perceived as a single event. Otherwise, imagine the confusion!
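The temporal binding window can be sketched in a few lines of code. This is a toy illustration only: the ~100 millisecond figure is the one number taken from the text above, while the fixed symmetric window and the function itself are simplifying assumptions (real windows vary by task, stimulus, and observer).

```python
# Toy sketch of a temporal binding window (illustrative, not a real model).
# The ~100 ms width is taken from the text; everything else is assumed.

TEMPORAL_WINDOW_MS = 100  # assumed width of the binding window

def perceived_as_one_event(visual_onset_ms: float, auditory_onset_ms: float) -> bool:
    """True if the visual and auditory onsets fall within the window."""
    return abs(visual_onset_ms - auditory_onset_ms) <= TEMPORAL_WINDOW_MS

# A nearby hand clap: light and sound arrive almost together.
print(perceived_as_one_event(0, 30))    # True -> fused into one event

# Lightning about 1 km away: sound travels ~343 m/s, so the thunder
# lags the flash by roughly 2900 ms.
print(perceived_as_one_event(0, 2900))  # False -> two separate events
```

On this simple view, TV lip-sync delays and distant thunder both fail the test for the same reason: the interval between the signals exceeds the window.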


Our brain has also learned to integrate spatially discordant information, but only within a limited window. Within that window, vision strongly attracts sound, as in the ventriloquist effect, where the voice appears to come from the puppet’s mouth. If the spatial limit is exceeded, the signals are instead perceived as separate, anomalous events.


All these fascinating phenomena are explained by mathematical models that researchers develop and study to better understand how our brains work. Over the past 20 years, I have studied how the brain analyses these complex signals, how their multisensory integration produces such perceptual richness, and how this ability develops in children.
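One classic family of such models is maximum-likelihood cue combination, in which each sense is weighted by its reliability (the inverse of its variance). The sketch below is a generic illustration of that idea, not a model from these particular studies; the function name and all numbers are invented for the example.

```python
# Illustrative maximum-likelihood cue combination: two independent
# sensory estimates of the same quantity are averaged, weighted by
# reliability (1/variance). The fused estimate is more precise than
# either cue alone - perception "more than the sum of its parts".

def integrate_cues(est_a: float, var_a: float,
                   est_b: float, var_b: float) -> tuple[float, float]:
    """Optimally fuse two independent estimates; return (estimate, variance)."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_b)  # reliability weight for cue A
    fused = w_a * est_a + (1 - w_a) * est_b
    fused_var = 1 / (1 / var_a + 1 / var_b)      # always below both input variances
    return fused, fused_var

# Vision places a sound source at 10 deg (precise, variance 1);
# hearing places it at 14 deg (noisier, variance 4).
position, variance = integrate_cues(10.0, 1.0, 14.0, 4.0)
print(position, variance)  # the fused estimate is pulled toward the reliable cue
```

The same weighting captures the ventriloquist effect in spirit: when vision is the far more reliable spatial cue, the fused location lands almost entirely where the eyes say it is.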


Our research has led to understanding how children and adults learn to integrate information from different sensory modalities in space and time and what happens when one sensory modality is missing. There are about 300 million visually impaired people worldwide, with approximately 15 million being children under the age of 15. Visual impairment in children hinders the development of many skills, such as movement, environmental perception, and play. Studying perceptual abilities and early intervention can make a difference, and it is a rapidly expanding field.


Monica has also very kindly suggested nine activities (WeDraw) for us. See our blog for more activities, especially Activities 34-36.


Some suggestions for further listening and watching:

Multi-Modal Perception – The basics

Multi Sensory Perception

Multisensory development and technology for children and adults

The Blind Kitchen: Eating in the Dark

Try this bizarre illusion

Your sensory health matters. Here’s why

Decay and maintenance of sensory memories

Previous research has found that memories based mainly on sensory information decay if they are not maintained. For example, people who have become blind are likely over time to lose their visual memories and, thus, the ability to visually imagine objects, shapes, and faces. However, people who have lost hearing seem to retain memories of both sounds and voices.


This time I present some lived experiences shared by people who have become blind.


Often people who have just started learning braille by touch try to imagine the characters visually. For example, seeing “black spots on a white background” and then associating:

– the braille o (⠕) with the print close parenthesis

– the braille v (⠧) with the print capital letter L

– the braille s (⠎) with a snake

– the braille t (⠞) with a chair or set of stairs in profile1-2


They associate braille characters either with print characters, by focusing on angles, curves, and straight lines, or with object shapes. But this all stops when they become tactually more experienced: “I associated braille characters with regular print characters in the beginning (…). Not now”2.


(…), he could perfectly well, visually, imagine a painting hanging over his living-room sofa, but could no longer, visually, imagine his wife’s, his daughter’s, or even his own face: these had now become tactually familiar3.

So intense was my desire to know the face of a stranger, as someone I was meeting for the first time, that vivid pictures of the person’s possible face would flash through my memory so rapidly that I could hardly concentrate on what they were saying. Slowly, slowly, that also began to fade. (…) I began to lose the memory that things looked like anything. I found myself caught with a slightly abrupt sense of surprise when people would say to me, “John, would you like to know what I look like?”4


And then, when the other senses had taken over, John Hull described it as “being reborn”.


I discovered so many beautiful things. For example, trees came back. I used to love trees – the forest, the greenery. Now stars had gone. Clouds had gone. The horizon was no more. But now I gradually discovered trees came back. They came back acoustically. (…) I discovered that in the winter, the trees whistled, and cracked, and hissed. In the spring, they became all fluffy. In the summer, they were like the rolling ocean waves as the wind swept across them. In the autumn, they became all tinkly. (…) And I felt, how incredibly beautiful that is. Why did I never notice it before?4


It seems people who lose vision use information that transfers between the senses to retrieve visual memories and, thus, visually imagine the world around them; that is, until reaching a certain level of experience in the other senses. And, at that point, their visual memories are gone. (See our blog for the scientific approach and the crossmodal correspondences between the senses.) In contrast, it seems people who lose hearing remember both sounds and voices. Could this be because they previously perceived multisensory information, for example, lip-reading by vision and voice by hearing (sound on), and that visual information maintains their auditory memories and, thus, their ability to imagine the auditory world around them?

 

See our blog for Activities, especially 31-33.

_______________

1Graven, T. (2018). How individuals who are blind locate targets. British Journal of Visual Impairment, 36(1), 57-74.

2Graven, T. (2015). How blind individuals discriminate braille characters: An identification and comparison of three discrimination strategies. British Journal of Visual Impairment, 33(2), 80-95.

3Graven, T. (2009). Seeing Through Touch: When Touch Replaces Vision as the Dominant Sense Modality. Saarbrücken: VDM Verlag Dr. Müller AG & Co.

4John Hull: Blindness and memory – being reborn into a different world