How infants who are blind integrate tactile and auditory information

The brain appears to integrate simultaneous information from all the senses from birth. (See our blog for Multisensory processing.) However, when the infant is fully sighted, vision most often takes the lead. So what happens when vision is impaired?

This time, I have invited Stefania Petri, of the Unit for Visually Impaired People (the UVIP Lab) at the Italian Institute of Technology, to write about the integration of tactile and auditory cues in infants. Stefania is part of the MySpace project, which investigates how infants and children who are blind process audio-tactile information. The project is led by Dr Monica Gori, Head of the UVIP Lab, and Stefania contributes to the development of the early intervention system iReach.

For newborns, vision is not only about recognising faces and objects. Sight guides movement, play, and exploration. It allows infants to coordinate their actions, interact with caregivers, and gradually make sense of the world. When vision is missing or severely impaired, these basic experiences are disrupted from the very beginning of life. Indeed, infants with visual impairments often face delays in motor development, difficulties in social interaction, and challenges in learning how to explore space.


Why Touch and Sound Matter

Vision usually guides the other senses, helping infants build a coherent sense of space. For a sighted child, seeing a toy, hearing its sound, and touching it all come together to form a single, integrated experience. To construct this kind of spatial map without vision, infants who are blind must rely on other senses, such as touch and hearing.

Both senses are present from birth, and both provide spatial cues: touch gives direct, body-centred information, while hearing allows orientation toward events and objects at a distance. Understanding how these two senses work together in the absence of vision is crucial for developing strategies that support the growth of children who are blind.


Studying Multisensory Spatial Perception

To explore this, we used a well-established paradigm: presenting auditory and tactile stimuli to the infants' hands. We used a non-invasive device and collected behavioural data. The stimuli could be presented congruently, with touch and sound on the same side of the body, or incongruently, for example with touch on the right and sound on the left-hand side. By comparing the responses of infants who were blind with those of sighted infants, we could explore how the two groups oriented and how quickly they reacted under different conditions.

This method may seem simple, but it addresses a fundamental question: when vision is absent, how do infants resolve conflicts between touch and sound? And do they still benefit when both cues point in the same direction?
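
For readers who like to see the logic spelled out, here is a minimal sketch in Python of how responses in a paradigm like this can be summarised per condition. The trial structure and numbers are invented for illustration; this is not the study's actual code or data.

```python
# Illustrative sketch of summarising an audio-tactile paradigm.
# Condition labels follow the four conditions described below;
# the trials and numbers are hypothetical.
from statistics import mean

# Each trial records the condition, whether the infant oriented
# toward the auditory stimulus, and the reaction time in ms.
trials = [
    {"condition": "auditory_only",              "oriented_to_sound": True,  "rt_ms": 980},
    {"condition": "congruent_audio_tactile",    "oriented_to_sound": True,  "rt_ms": 720},
    {"condition": "incongruent_audio_tactile",  "oriented_to_sound": False, "rt_ms": 1040},
    {"condition": "tactile_only",               "oriented_to_sound": False, "rt_ms": 890},
    # ... many more trials per infant ...
]

def summarise(trials, condition):
    """Percentage of orienting responses toward the sound, and mean RT."""
    subset = [t for t in trials if t["condition"] == condition]
    pct_to_sound = 100 * sum(t["oriented_to_sound"] for t in subset) / len(subset)
    return pct_to_sound, mean(t["rt_ms"] for t in subset)

for condition in ("auditory_only", "congruent_audio_tactile",
                  "incongruent_audio_tactile", "tactile_only"):
    pct, rt = summarise(trials, condition)
    print(f"{condition}: {pct:.0f}% oriented to sound, mean RT {rt:.0f} ms")
```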


What We Found

The results revealed clear differences between the two groups:

  • When touch and sound are in conflict — for example, when a vibration is felt on one hand, but the sound comes from the opposite side — infants who are blind are less likely than their sighted peers to orient toward the sound. This suggests that they rely more strongly on tactile cues when making spatial decisions.
  • When touch and sound are congruent, infants who are blind show evidence of multisensory integration. Specifically, their reaction times are faster when both cues are presented together compared to when they are presented separately. While sighted infants tend to integrate such cues more efficiently, infants who are blind nevertheless reveal that they can combine information across senses in a beneficial way.

(Top) Four experimental conditions: auditory stimulation alone, congruent audio-tactile stimulation, incongruent audio-tactile stimulation, and tactile stimulation alone. (Bottom) Results: (a) percentage of orienting responses directed toward the auditory stimulus and (b) reaction times to the stimulus. Blue bars represent sighted infants (S), and yellow bars infants who were severely visually impaired. Bold black lines indicate statistically significant differences between conditions.

Published with permission. Gori, M., Campus, C., Signorini, S., Rivara, E., & Bremner, A. J. (2021). Multisensory spatial perception in visually impaired infants. Current Biology, 31(22), 5093-5101.e5. https://doi.org/10.1016/j.cub.2021.09.011

These findings highlight an important message: even without vision, multisensory stimulation, particularly the combination of sound and touch, can enhance performance and support the gradual development of spatial and motor skills.
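
A common way to test whether such a speed-up reflects genuine integration, rather than two senses independently "racing" to trigger a response, is the race-model inequality (Miller, 1982). The sketch below illustrates the general logic only, with invented reaction times; it is not the analysis code from the study.

```python
# Minimal sketch of a race-model inequality check (Miller, 1982).
# The reaction times below are made up for illustration.
import numpy as np

rt_audio   = np.array([950, 1010, 880, 1100, 970])  # auditory-only trials (ms)
rt_tactile = np.array([900, 860, 1020, 940, 980])   # tactile-only trials (ms)
rt_both    = np.array([700, 750, 680, 820, 760])    # congruent audio-tactile (ms)

def cdf(rts, t):
    """Empirical probability of responding by time t."""
    return np.mean(rts <= t)

# If P(RT<=t | both) exceeds P(RT<=t | audio) + P(RT<=t | tactile)
# at some t, the speed-up cannot come from two independent senses
# merely racing each other: the cues must have been integrated.
for t in range(650, 1101, 50):
    violated = cdf(rt_both, t) > cdf(rt_audio, t) + cdf(rt_tactile, t)
    print(f"t = {t} ms: race model {'violated' if violated else 'satisfied'}")
```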


Practical Implications

These insights are not just theoretical. They guide the development of habilitation and rehabilitation strategies as well as supportive technologies. For instance, play-based training sessions that combine vibration with sound in a congruent way could strengthen early sensorimotor skills and might help infants who are blind practise reaching and moving toward objects.
One practical example inspired by this research is iReach. iReach is a small, wearable system made of two units, or tags, that communicate wirelessly. By attaching an anchor tag to a bracelet on the child's wrist and another tag to a toy, the device lets infants sense changes in vibration and sound as they approach the object: as the child moves closer to the toy, the bracelet changes its vibration and sound, giving intuitive feedback about distance.
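
To make the idea concrete, here is a hypothetical sketch of the kind of distance-to-feedback mapping the description above suggests. The function, working range, and pulse rates are assumptions for illustration, not iReach's actual firmware.

```python
# Hypothetical distance-to-feedback mapping of the kind iReach's
# description implies; all parameters are illustrative assumptions.
def feedback(distance_m, max_range_m=2.0):
    """Map wrist-to-toy distance to feedback intensity and pulse rate."""
    # Clamp to the working range, then invert: closer means stronger.
    proximity = max(0.0, min(1.0, 1.0 - distance_m / max_range_m))
    intensity = proximity              # 0.0 (far) .. 1.0 (touching)
    pulse_hz = 1.0 + 9.0 * proximity   # 1 pulse/s far, 10 pulses/s near
    return intensity, pulse_hz

for d in (2.0, 1.0, 0.5, 0.1):
    intensity, pulse_hz = feedback(d)
    print(f"{d:>4.1f} m -> intensity {intensity:.2f}, {pulse_hz:.1f} pulses/s")
```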

An early prototype has been tested in a safe, playful setting with sighted children who were blindfolded. In one of the activities, the children had to place objects into a box positioned some distance away, which contained the spatial reference tag.

a) iReach units: Tag (left); Anchor (right). b) Example of Tag positions: Anchor on the infant's body midline (left); external object (right). c) Example of use of iReach: the sound emitter and waveform icons represent the auditory and tactile stimuli, respectively. An increase in icon size indicates a corresponding increase in feedback intensity and frequency.

Published with permission. Gori, M., Petri, S., Riberto, M., & Setti, W. (2025). iReach: New multisensory technology for early intervention in infants with visual impairments. Frontiers in Psychology, 16. https://doi.org/10.3389/fpsyg.2025.1607528

When wearing the iReach bracelet, the children completed the task faster and with more accurate movements. These early observations suggest that iReach can make exploration more intuitive and engaging for children who are blind.

Importantly, iReach is not a sensory substitution device; such devices often overload users with complex signals. Instead, it uses a child-friendly “language” of touch and sound to encourage active movement and exploration.


Conclusion

Infants who are blind grow up with touch and hearing as the main senses supporting their exploration of the world. Our studies show that they rely more on touch than on sound when the senses are in conflict, but that they also benefit from integrating the two when the information is aligned. By recognising how touch and sound work together, we can take important steps toward creating early interventions that respect children’s natural abilities and give them the best possible start in life.

See our blog for Activities; especially 79-81.


Some suggestions for further listening and watching:

Baby’s Fine and Gross Motor Skills

Baby Hearing Development

Beyond the Basic Senses

Get “Inside the Mind of a Baby”

Multisensory spatial perception in visually impaired infants

The Tactile System & Body Awareness In The First 3 Months

Vision Development: Newborn to 12 Months

What Your Baby Sees

Your baby’s sense of touch

Choosing food

When it comes to eating, “the first bite is with the eyes”. Feeling the texture stimulates the appetite. 75-95% of the flavour comes from smell, and flavour is enhanced by “sonic seasoning”. Eating is a truly multisensory experience (see our blog for Multisensory processing and Food for thought: taste, smell and flavour). And it seems the senses also help us decide what food is familiar and, thus, whether to eat it or not.

In this post, I have invited Associate Professor Suzanna Forwood, of Anglia Ruskin University, to reflect on how the senses affect what food we decide to eat or not to eat. Suzanna conducts research on the factors that determine our food choices, including tools that support healthy choices.

When offered a menu, most people do not seek out the least familiar dish for their dinner. This is because most of us need food to be familiar for it to be appealing, and there are good reasons for this. From an evolutionary perspective, familiar food eaten in the past without any ill effects is more likely to be safe this time, and safe food is essential for survival. From a psychological perspective, familiar food is food we already know about: we know how filling or tasty it is likely to be, and we need this information when choosing something to eat so we can match it to our current appetite.

When you reflect on it, eating is a profoundly unusual sensory experience. On the one hand, exploring food is necessary to gain the sensory information that makes it familiar: we don’t know whether we like it or want to eat it until it is familiar. On the other hand, eating food is bound up in a social contract: there is an expectation that we know what food we like and that we eat food we are served. This tension is particularly problematic for children who are still learning about the world and find it hard to express what they like. Their reluctance to like and eat less familiar foods can look like picky eating.

Sensory Education activities are designed for children and break this tension: food is not a meal but a game or classroom activity. The philosophy, which originated in France and Scandinavia with the Sapere movement, is to offer children the chance to explore food in a structured, non-judgemental activity away from mealtimes. Children are provided with samples of foods, typically fruits or vegetables, to explore using all their senses. A golden rule of these activities is that no one has to try, or to like, any of the foods. Activities include variations on a single food or focus on specific senses. Food is discussed in terms of its sensory properties as experienced by the child, with no expectation that the child has a preference. Sensory Education therefore supports children by growing their familiarity with novel foods, as well as their vocabulary for talking about their sensory experiences and communicating their preferences [1, 2].

The need for familiarity presents challenges when an adult loses part of their sensory world. Eating is a fully multisensory activity: we eat with our eyes, our hands, our mouths, our noses and our ears, and our experience of food merges the senses. For example, what we experience as flavour combines information from taste buds in the mouth and smell receptors in the nose, and what we experience as texture combines information from touch receptors in the mouth and sound receptors in the ear. Simply removing one sensory domain can alter how food is experienced. You can explore this for yourself by tasting a food while holding your nose to block smell or wearing ear defenders to block sound. Doing either of these will change the holistic sensory experience of eating the food: the food will no longer be quite so familiar, and there may be a change in how much you like or dislike it.

Adjusting to a radical change in sensory or motor function is complicated for many reasons, but retaining dietary variety and pleasure in eating remains important for health and wellbeing. At a very practical level, then, Sensory Education might offer a structured method for supporting the process of re-learning foods in a new sensory world: re-experiencing foods from an altered multisensory perspective and re-evaluating what is familiar and liked. Research has not yet explored whether Sensory Education can support adults experiencing sensory difficulty with their diets. We have tried co-developing Sensory Education activities with young adults, and the activities were enjoyed. The next step is to explore whether similar activities can be used with adults adjusting to sensory difficulty, such as visual impairment, or motor difficulty, such as recovery following stroke.

My father is that rare person who chooses unfamiliar foods – I think he enjoys the excitement when on holiday or somewhere new. And I remember thinking this was brave – like most children, I preferred familiar food and was amazed at someone who chose to eat something unknown. As it happened, when the food arrived, he would be presented with a regional dish or a local speciality, and when I tasted it, I learned that unfamiliar foods can be delicious and, in time, familiar favourites. It requires a kind of bravery to explore the unknown.


See our blog for Activities; especially 76-78.


Some suggestions for further listening, reading, and watching:

Dining in the dark

Eating for children with Sensory Difficulties

5 Sensory Tips for Picky Eaters

How to get your taste and smell back after Covid

I Can’t Taste Anything

Sapere

_______________

[1] Mustonen, S., Rantanen, R., & Tuorila, H. (2009). Effect of sensory education on school children’s food perception: A 2-year follow-up study. Food Quality and Preference, 20(3), 230-240. https://doi.org/10.1016/j.foodqual.2008.10.003

[2] Reverdy, C. (2011). Sensory Education: French Perspectives. In V. R. Preedy, R. R. Watson, & C. R. Martin (Eds.), Handbook of Behavior, Food and Nutrition (pp. 143-157). New York: Springer. https://doi.org/10.1007/978-0-387-92271-3_11