Touch and Nature Learning

Some information is sensory specific – for example, colour to vision and temperature to touch. But most information is perceived by several senses. Shape is one example: it is perceived by vision, touch, and hearing, and it transfers between the senses. And when the senses are used together, they perceive supplementary and overlapping information. (See our blog for Crossmodal correspondences between the senses, Vision, haptic touch, and hearing, and Multisensory processing.)


In this post, I have invited the research team behind TOUCH to shed some light on how sighted people use their sense of touch when exploring real objects. The blog post is written by Dr Lisa Bowers, Open University, Professor Andrew Manches, University of Edinburgh, and Professor Laura Colucci-Gray, University of Edinburgh. Here, they present their previous research, leading up to TOUCH, on the role of touch when sighted children learn about nature.


Given the worsening state of our planet, an important challenge for education is how to support and promote a renewed relationship between humans and the natural world. Whilst schools offer an ideal context for this challenge, current teaching approaches tend to emphasise children’s visual experiences, exaggerated through increased engagement with screen devices: the world is presented as if ‘at a distance’, disconnected from everyday experiences and the matters we deeply care about. Although an education through the senses is said to support various hands-on initiatives, including outdoor and play-based experiences, its role is typically conceived as a means to actively explore visual information, rather than as holding any value in its own right.


Our research team has examined the role of touch in how children interact with and describe a selection of natural objects (e.g., a leaf, a shell, and a feather). We have identified three key dimensions:

  • Propensity to touch. Children differ in how much prompting they need to physically pick up objects – ranging from those who immediately touch objects to those who are reluctant even with prompting.
  • Touch interaction. Children differ in the richness of their touch interaction – from simple tapping to rich exploration using both hands.

  • Touch communication. Children differ in the extent to which they use tactile language in their description of objects, as well as how they simulate touch through gesture when describing what they see.

 

As well as revealing intriguing similarities and differences between children, our research has highlighted that touch was not simply a means for children to visually explore objects (although this was important, e.g., manipulating objects to inspect them from all angles) but offered much value as a unique mode of interaction. Indeed, many children would look away from objects whilst exploring them through touch. For example, a child might rub the inside and outside of a shell whilst reflecting on why one was smooth and the other rough, and this process would recall prior experiences in other places or in the company of significant others – often positive emotional recollections.

Our research also revealed the challenges children often faced in describing the tactile properties of objects: when they lacked the vocabulary, they drew on analogies with other touch experiences. The identification of gestures simulating touch was particularly powerful in revealing how children had internalised touch experiences in their concepts of objects as they described them. This internalisation of sensory experience in conceptual thinking is a key claim of embodied theories of cognition. However, the benefits of tactile learning extend beyond basic sensory exploration. Somatosensory perception is central to early human development, and the explicit incorporation of touch as a way to feel, interact, and communicate through the body can enhance learning, with increased engagement and memorability.

Illustration: Somatosensory cortex stimulation by touch (somatosensory cortex – nerve impulse – touch). Created by Lisa Bowers, 2025.

The foundational work of research studies such as the one described above has highlighted various possible avenues for design – including ways to encourage children’s propensity to touch, ways to encourage children’s exploration of tactile properties in greater depth, and ways to develop children’s tactile language (and possibly gestures) as a means to communicate tactile properties. We have explored some of these opportunities through classroom and outdoor activities, as well as through the development of haptic prototype designs in which children receive haptic feedback from a special stylus moved over a screen device. For these haptic designs, tactile feedback not only provides a means for children to tactilely explore ‘less accessible’ objects, like a bee or an unfamiliar plant, but can also prime children to be more curious about the feel of real-world natural objects. We are also exploring the potential of this work to quantify the role of touch, in order to examine how different factors influence children’s touch interaction and communication (e.g., their age, confidence, or nature experiences), and to evaluate how, and if, our interventions influence the role of touch in children’s science learning.

In our future research, touch is conceived of as a medium that builds on human capacities to feel and perceive what might be distant and inaccessible, but also to pay attention to aspects of the world that may go unnoticed or be taken for granted. We are now asking ourselves, and all those we are engaging with, a key question – what is the role of touch in what it means to be curious about, explore, think, feel, communicate, and be connected to our natural world?


See our blog for Activities; especially 62-64.

Reading braille in colour

Previous research has found that people who have just started reading braille by haptic touch try to imagine the characters visually: for example, seeing black dots on a white background and associating their formation with regular print characters, with objects they have seen, or with both. As they become more tactually experienced, this stops. In contrast, people who are born blind recognise braille characters either by the quantity and location of their dots or by their tactile global shape¹. (See our blog for the scientific approach, Vision, haptic touch, and hearing, and Decay and maintenance of sensory memories.)


This time, I wanted to explore what happens when people stop trying to imagine the braille characters visually. To shed some light on this, I have invited I.A. to share her experiences. Born partially sighted, with about 5% vision in one eye, I.A. started out reading and writing regular print. She learned braille on her own at around 13 years old: first memorising the braille alphabet using a combination of vision and haptic touch, then reading materials published in both regular print and braille. About three years later, I.A. was no longer able to read regular print. By then she knew all the braille characters by haptic touch, but needed help to perfect her reading technique. Today, I.A. has been reading and writing braille by haptic touch alone for more than 25 years. For more than 10 years, she has been teaching braille to people who have become blind. And she is an appointed Board Member of the braille authority in her country. I.A. approved this text before we posted it on our blog. The journey she is taking us on is rather unexpected.


For the first two or three years of reading braille by haptic touch alone, I.A. saw the dots in each character as mini light bulbs in her mind’s eye. She concentrated on how many there were and where they were located.


Gradually, I.A. started perceiving the tactile global shape of short two- and three-letter words, such as ⠉⠁⠞ (cat) and ⠙⠕⠛ (dog). She still reads longer words letter by letter, but she has stopped seeing their dots as mini light bulbs in her mind’s eye.


Reading by haptic touch alone, I.A. recognises the braille characters by the quantity and location of their dots. She recognises short two- and three-letter words by their tactile global shape.


As I.A. became more experienced in reading braille by haptic touch alone, the letters started appearing in colour – just as they had done in regular print. For example: a in red, b in dark blue, c in light yellow, d in dark yellow, e in pale blue, f in blue-grey, g in green, h in beige, i in translucent white, j in white, and so on.


And the numbers too: 1 in white, 2 in yellow, 3 in blue, 4 in light yellow, 5 in green, 6 in blue, 7 in white, 8 in red, 9 in brown, and 0 in silver grey. Some punctuation marks and signs also have the same colour as in regular print, like the division sign (÷), which is yellow, while others gained a colour in braille, like the equals sign (=), which is now mossy green.


I.A. experiences synaesthesia. She associates certain letters, numbers, punctuation marks, and signs with certain colours (see our blog for the crossmodal correspondences between the senses and the intriguing association between sounds and colours).


I.A.’s synaesthesia appears only when reading. She does not associate letters, numbers, punctuation marks, and signs with colour when writing. But as soon as she checks her spelling, they appear in colour again.


When reading shorter words, like “cat” and “dog”, I.A. first detects the tactile global shape of the word and then associates it with colour – that is, the colour that results from combining all the letters. The word “dog”, for example, is a combination of (d) dark yellow, (o) silver grey, and (g) green. I.A. sees “dog” as a mossy green word.


According to I.A., the colour helps her distinguish between words with a similar tactile global shape, for example ⠙⠕⠛ (dog) and ⠋⠕⠛ (fog). While “dog” has a mossy green colour, “fog” is light green; that is, a combination of (f) blue-grey, (o) silver grey, and (g) green.


Longer words, which I.A. reads letter by letter, are coloured by their first letter. For example, the word “braille” is dark blue and “crossmodal” is light yellow.
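For readers who like to see the rule spelled out, here is a minimal illustrative sketch in Python. It is not part of I.A.’s account or of any research method: the LETTER_COLOURS mapping and the word_colour function are hypothetical names introduced only for this example, the mapping contains just the letter colours mentioned above, and the “blend” for short words is simply listed rather than computed, since how I.A. combines colours in her mind is not something code can reproduce.

```python
# Illustrative sketch only: a hypothetical model of the colour rule I.A. describes.
# The mapping below contains just the letter colours mentioned in this post.
LETTER_COLOURS = {
    "a": "red", "b": "dark blue", "c": "light yellow", "d": "dark yellow",
    "e": "pale blue", "f": "blue-grey", "g": "green", "h": "beige",
    "i": "translucent white", "j": "white", "o": "silver grey",
}

def word_colour(word: str) -> str:
    """Return the colour(s) for a word, following I.A.'s reported rule."""
    letters = word.lower()
    if len(letters) <= 3:
        # Short words: the colours of all letters combine into one blended colour.
        return " + ".join(LETTER_COLOURS.get(ch, "unknown") for ch in letters)
    # Longer words, read letter by letter, take the colour of their first letter.
    return LETTER_COLOURS.get(letters[0], "unknown")

print(word_colour("dog"))      # dark yellow + silver grey + green  (I.A. sees mossy green)
print(word_colour("fog"))      # blue-grey + silver grey + green    (I.A. sees light green)
print(word_colour("braille"))  # dark blue
```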


It seems I.A.’s subjective crossmodal correspondences are not sensory specific: they exist regardless of whether she is reading regular print or braille. Could they be linked to information that transfers between vision and touch, for example shape? Or are they instead linked to the sound of the letters, numbers, punctuation marks, and signs (see our blog for the scientific approach, the crossmodal correspondences between the senses, and the intriguing association between sounds and colours)?


See our blog for Activities; especially 58-61.


_______________

¹ Graven, T. (2015). How blind individuals discriminate braille characters: An identification and comparison of three discrimination strategies. British Journal of Visual Impairment, 33(2), 80–95.