Using the senses when vision and hearing are impaired

The brain integrates simultaneous information from several senses. It transfers information from one sense to another. And it reorganises itself when one of the senses is impaired. All of this improves our reaction time, precision, and recognition accuracy (see our blog for Multisensory processing, Crossmodal correspondences between the senses, and Crossmodal brain plasticity and empowering of sensory abilities). But what happens when more than one sense is impaired?


I have invited K.H. to share her experiences of living with deafblindness. K.H. was born with about 10% vision – a severe visual impairment. She has had periods of losing and regaining vision: currently, she has about 0.5% visual acuity, which corresponds to being blind. In addition, K.H. was born with moderate to severe hearing loss. Her hearing loss is most severe in the middle of the pitch range, which is where people’s voices typically lie when talking. She struggles to recognise voices, to tell where a sound comes from, and to judge how far away it is. K.H. approved this text before we posted it on our blog.


So, how does K.H. use her senses on their own and together?


“It’s like working on a jigsaw puzzle where a lot of the pieces are missing: I have to search for as many pieces as I possibly can using vision, hearing, and touch and, then, piece them all together to get the full picture. I have to guess a lot, which is exhausting, and I often get it wrong because I’ve not recognised or found a piece. I’m constantly in a state of high alert and fear of making a fool of myself.”


K.H. describes searching for specific sensory information: what she searches for is determined by what she is doing, and where one sense fails, other senses step in. In traffic, for example, she describes using vision alone and hearing alone to recognise separate landmarks, like a red house by vision and an open area by hearing. She uses hearing and vision to determine whether cars and people are moving, and in what direction – first hearing, then vision to double-check. K.H. describes using hearing alone to judge the distance to walls, other people, and so on, and her feet to check tactile information about kerbs and steps. And she uses both hearing and touch when crossing the street – listening for the sound of a signal-controlled crossing first, then touching the spinning cone to double-check.


I have become experienced in knowing what sensory information I need, for example, from vision to support hearing.


Is K.H. ever not in a state of high alert?


She describes dropping her guard when she is home alone, because nobody is there to talk to her, show her something, and so on. Another example is at a concert together with a guide she trusts and who is also skilled in using social haptic communication. She also takes every opportunity to have a “sense break”: relaxing her vision either by playing a game on her mobile phone with easy-to-see colours and lots of repetition or by lying down in a dark room. She relaxes her hearing by listening to music (the melody, not the lyrics) or a podcast that she does not have to pay attention to. K.H. sometimes relaxes both senses and sometimes one sense while focusing on the other; for example, at a lecture, relaxing her vision by playing a game on her mobile phone or touching the texture of her clothes while focusing her hearing on the lecturer.


I feel I hear them better when vision doesn’t have to work so hard or is completely “closed off”.


Also, physical exercise in non-demanding sensory environments gives her brain some time off.


Does K.H. enjoy sensory experiences – do they give her pleasure?


K.H. describes enjoying birdsong and music. And bright colours, like the blue sky or sea, green trees or hills, and the reds and yellows in a sunset or flower – especially in combination with the smell and sound of the sea or forest. But she has to be either with a guide or on her own, and standing still or sitting down. As soon as she knows that somebody other than her guide might seek her attention, or she starts moving, she goes right back to high alert mode.


It seems K.H. has a plan for what to search for in both the preferred and the supplementing sense, the latter often used to double-check the correctness of the favoured sense. K.H. does not describe focusing on any of the information that research has found to transfer from one sense to another, like shape. Perhaps she cannot see or hear it? It is almost as if K.H. perceives the world linearly – in a string of well-organised sensory information. Only when she is enjoying a sensory experience does K.H. describe appreciating a scene of multisensory information, for example, the birdsong, colour, and smell of the forest.


See our blog for Activities; especially 37-39.

Multisensory processing

The brain is truly clever: not only does it associate and transfer information crossmodally from one sense to another, and adapt, through crossmodal plasticity, to sensory impairments (see our blog for Crossmodal correspondences between the senses and Crossmodal brain plasticity and empowering of sensory abilities); it also integrates and processes information from multiple senses.


I have invited Dr Monica Gori, Head of the Unit for Visually Impaired People at the Italian Institute of Technology, to write this post on multisensory processing. Monica Gori has published over 130 scientific articles and book chapters as well as numerous conference abstracts. She has been the Principal Investigator on several research projects funded by the European Research Council, including the ERC StG MySpace project. She has developed early intervention systems like ABBI and iReach (some of which have been patented), and she has been awarded numerous prizes for her work (e.g., in the SmartCup and the TR35 Italian Innovation Award).

Image of one of Monica's highly textured paintings.

Monica also creates multisensory art.

Multisensory painting by Monica Gori. (Choline and enamel on canvas.)

An organism’s ability to relate to the external world depends on its ability to correctly process and interpret information from the environment. Our senses are our window to the world; they are the means by which we interact with it.


For example, when we are on the street, sight helps us understand where to go or where objects of interest are located, but hearing also helps us understand things, like whether a car is approaching or if there are people around us. Tactile information helps us understand how firmly we need to grip a handle to open a shop door or to shake hands with someone we haven’t seen in a long time. Smell allows us to detect if there is a restaurant nearby. These simple operations result from complex processes of sensory information processing. When a signal from the external world contacts our sensory receptors, our brain constructs a perception of the event and produces a response suitable for the situation.


However, even though we have many sensory modalities, we only have one brain that must integrate these sensory signals. Multisensory integration is a fundamental process that makes perception more than the sum of its parts, improving reaction time, precision, and response accuracy. Imagine if these senses were uncoordinated. Have you ever experienced a delay on television, where the voice is out of sync with the picture? It’s pretty annoying. In that case, the brain fails to synchronize the two signals because they are too far apart in time. The same happens during a thunderstorm when lightning is seen before the thunder is heard. Since light travels faster than sound, the two signals are perceived as separate. Typically, this doesn’t happen. Have you ever wondered why?


Each sensory system has its own timing in signal analysis. Visual and auditory signals, for instance, follow different pathways: vision through receptors in the eye and hearing through receptors in the ear. Our brain has learned to synchronize these two signals in space and time. Scientific results show that two signals from different modalities can be perceived as synchronous when their timing is within approximately 100 milliseconds. They are perceived as separate when the interval is longer, like on TV or with thunder and lightning. Fortunately, visual and auditory events are usually associated, and their delays fall within this window, allowing them to be perceived as a single event. Otherwise, imagine the confusion!
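The thunder-and-lightning example can be checked with simple arithmetic. Here is a minimal sketch (our illustration, not from Monica's post), assuming a speed of sound of about 343 m/s in air and treating light as arriving instantly; it estimates at what distance the lag between flash and bang exceeds the approximate 100-millisecond binding window:

```python
# Back-of-the-envelope check of the ~100 ms audio-visual binding window.
# Assumptions: sound travels at ~343 m/s in air; light arrives effectively
# instantly at everyday distances.

SPEED_OF_SOUND = 343.0   # metres per second
BINDING_WINDOW = 0.100   # seconds; approximate synchrony window

def audio_lag(distance_m: float) -> float:
    """Seconds by which the sound lags the light from the same event."""
    return distance_m / SPEED_OF_SOUND

def perceived_as_one_event(distance_m: float) -> bool:
    """True if the lag falls inside the binding window."""
    return audio_lag(distance_m) <= BINDING_WINDOW

for d in (10, 34, 100, 1000):
    print(f"{d} m: lag {audio_lag(d):.3f} s, fused: {perceived_as_one_event(d)}")
```

By this rough estimate, sights and sounds from events within about 34 metres still fall inside the window, which is why everyday events feel simultaneous, while a lightning strike a kilometre away lags by roughly three seconds and is heard as a separate event.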


Our brain has also learned to integrate spatially discordant information within a limited window. As long as the discrepancy stays within this window, vision strongly attracts the sound, as in the ventriloquist effect, where the sound appears to come from the puppet’s mouth. If this spatial limit is exceeded, the two signals are perceived as separate events.


All these fascinating phenomena are explained by mathematical models that researchers have developed and studied to better understand how our brains work. Over the past 20 years, I have studied how these signals are analyzed and integrated across the senses, the process that allows for such great perceptual richness, and how this ability develops in children.
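To give a flavour of such models, here is a minimal sketch of one textbook account of multisensory integration, maximum-likelihood (reliability-weighted) cue combination. This is a generic illustration with made-up numbers, not a reconstruction of any specific study mentioned here:

```python
# A minimal sketch of maximum-likelihood cue combination: each sense's
# estimate is weighted by its reliability (the inverse of its variance).

def integrate(est_a: float, var_a: float, est_b: float, var_b: float):
    """Combine two noisy estimates into one; returns (estimate, variance)."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
    w_b = 1.0 - w_a
    combined = w_a * est_a + w_b * est_b
    combined_var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    return combined, combined_var

# Hypothetical numbers: vision places a sound source at 0 degrees with low
# noise (variance 1); hearing places it at 10 degrees with high noise
# (variance 9). The combined estimate is pulled towards the visual one.
location, variance = integrate(0.0, 1.0, 10.0, 9.0)
print(round(location, 3), round(variance, 3))  # → 1.0 0.9
```

Two things fall out of this weighting: the combined variance is lower than that of either sense alone, which is one way perception becomes "more than the sum of its parts", and the more reliable sense dominates, which captures why vision attracts sound in the ventriloquist effect.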


Our research has led to understanding how children and adults learn to integrate information from different sensory modalities in space and time and what happens when one sensory modality is missing. There are about 300 million visually impaired people worldwide, with approximately 15 million being children under the age of 15. Visual impairment in children hinders the development of many skills, such as movement, environmental perception, and play. Studying perceptual abilities and early intervention can make a difference, and it is a rapidly expanding field.


Monica has also very kindly suggested nine activities (WeDraw) for us. See our blog for more Activities; especially 34-36.


Some suggestions for further listening and watching:

Multi-Modal Perception – The basics

Multi Sensory Perception

Multisensory development and technology for children and adults

The Blind Kitchen: Eating in the Dark

Try this bizarre illusion

Your sensory health matters. Here’s why