Reading braille in colour

Previous research has found that people who have just started reading braille by haptic touch try to imagine the characters visually. For example, they may see black spots on a white background and associate their formation with regular print characters, objects they have seen, or both. When they become tactually more experienced, this stops. In contrast, people who are born blind recognise the braille characters through either the quantity and location of their dots or their tactile global shape1. (See our blog for the scientific approach, Vision, haptic touch, and hearing, and Decay and maintenance of sensory memories.)


This time, I wanted to explore what happens when people stop trying to imagine the braille characters visually. To shed some light on this, I have invited I.A. to share her experiences. Born partially sighted, with about 5% vision in one eye, I.A. started out reading and writing regular print. She learned braille on her own at around 13 years old: first memorising the braille alphabet using a combination of vision and haptic touch, then reading materials published in both regular print and braille. About three years later, I.A. was no longer able to read regular print. She now knew all the braille characters by haptic touch, but needed help to perfect her reading technique. Today, I.A. has been reading and writing braille by haptic touch alone for more than 25 years. She has been teaching braille to people who have become blind for more than 10 years. And she is an appointed Board Member of the braille authority in her country. I.A. approved this text before we posted it on our blog. The journey she is taking us on is rather unexpected.


For the first two or three years of reading braille by haptic touch alone, I.A. saw the dots in each character as mini light bulbs in her mind’s eye. She concentrated on how many there were and where they were located.


Gradually, I.A. started perceiving the tactile global shape of short two- and three-letter words. For example, ⠉⠁⠞ (cat) and ⠙⠕⠛ (dog). She still reads longer words letter by letter, but she has stopped seeing their dots as mini light bulbs in her mind’s eye.


Reading by haptic touch alone, I.A. recognises the braille characters through the quantity and location of their dots. She recognises short two- and three-letter words by their tactile global shape.


As I.A. became more experienced in reading braille by haptic touch alone, the letters started appearing in colour – just as they had done in regular print. For example, a in red; b in dark blue; c in light yellow; d in dark yellow; e in pale blue; f in blue-grey; g in green; h in beige; i in translucent white; j in white; and so on.


And the numbers too: 1 in white; 2 in yellow; 3 in blue; 4 in light yellow; 5 in green; 6 in blue; 7 in white; 8 in red; 9 in brown; and 0 in silver grey. Some punctuation marks and signs also have the same colour as in regular print, like “division” (÷), which is yellow, while others gained a colour in braille, like “equals” (=), which is now mossy green.


I.A. experiences synaesthesia. She associates certain letters, numbers, punctuation marks, and signs with certain colours (see our blog for the crossmodal correspondences between the senses and the intriguing association between sounds and colours).


I.A.’s synaesthesia appears only when reading. She does not associate letters, numbers, punctuation marks, and signs with colours when writing. But as soon as she checks her spelling, they appear in colour again.


When reading shorter words, like “cat” and “dog”, I.A. first detects the tactile global shape of the word and then associates it with a colour; that is, the colour that emerges when all the letters are combined. The word “dog”, for example, is a combination of (d) dark yellow, (o) silver grey, and (g) green. I.A. sees “dog” as a mossy green word.


According to I.A., the colour helps her distinguish between words with a similar tactile global shape. For example, ⠙⠕⠛ (dog) and ⠋⠕⠛ (fog). While “dog” is mossy green, “fog” is light green; that is, a combination of (f) blue-grey, (o) silver grey, and (g) green.


Longer words, which I.A. reads letter by letter, are coloured by their first letter. For example, the word “braille” is dark blue and “crossmodal” is light yellow.
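The colour rules I.A. describes amount to a simple procedure: short words take a single colour that emerges from combining their letters, while longer words take the colour of their first letter alone. A minimal sketch of that procedure, using only the letter–colour associations mentioned above; the blend itself (e.g. “mossy green” for “dog”) is subjective, so the sketch lists the components rather than naming the perceived colour:

```python
# Hypothetical sketch of the colour rules described above.
# LETTER_COLOURS holds only the associations mentioned in the text;
# the "blend" for short words is shown as its components, since the
# perceived blended colour (e.g. "mossy green") is subjective.

LETTER_COLOURS = {
    "a": "red", "b": "dark blue", "c": "light yellow", "d": "dark yellow",
    "e": "pale blue", "f": "blue-grey", "g": "green", "h": "beige",
    "i": "translucent white", "j": "white", "o": "silver grey",
}

def word_colour(word: str) -> str:
    """Return the colour components I.A. would associate with a word."""
    if len(word) <= 3:
        # Short words: one colour emerging from all letters combined.
        return " + ".join(LETTER_COLOURS.get(ch, "?") for ch in word)
    # Longer words are coloured by their first letter alone.
    return LETTER_COLOURS.get(word[0], "?")
```

Here `word_colour("dog")` yields the components “dark yellow + silver grey + green” (perceived by I.A. as mossy green), while `word_colour("braille")` yields “dark blue”.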


It seems I.A.’s subjective crossmodal correspondences are not sensory-specific: they exist regardless of whether she is reading regular print or braille. Could they be linked to information that transfers between vision and touch, such as shape? Or are they rather linked to the sound of the letters, numbers, punctuation marks, and signs (see our blog for the scientific approach, the crossmodal correspondences between the senses and the intriguing association between sounds and colours)?


See our blog for Activities; especially 58-61.


_______________

1Graven, T. (2015). How blind individuals discriminate braille characters: An identification and comparison of three discrimination strategies. British Journal of Visual Impairment, 33(2), 80-95.

Touching the Future: Exploring Haptics and Multisensory Experiences in Virtual Reality

In real-life environments, the brain associates and transfers information crossmodally, from one sense to another. It integrates and processes information from multiple senses, and emotional perceptions too. (See our blog for the crossmodal correspondences between the senses, crossmodal brain plasticity, multisensory processing, and emotional perceptions.) But what happens in Virtual Realities? Virtual Realities are created to trick us into believing something is real when it is not. They can be all visual, auditory, or tactile – and even multisensory.


I have invited Associate Professor Mounia Ziat, Bentley University, to write about the sense of touch in multisensory virtual realities. That is, on haptic technologies that simulate the tactile and kinaesthetic sensations we feel when interacting with the real world. Mounia Ziat has published extensively on perception and human interaction with natural and artificial environments. She has also been awarded numerous prizes and grants for her work (e.g., from the EuroHaptics Society, the National Science Foundation, America’s Seed Fund, and Google Research). In this blog post, Mounia explores the transformative potential of haptics in virtual reality, with applications that enrich accessibility, emotional well-being, rehabilitation, and sensory understanding.


The sense of touch, including its interplay with other sensory modalities, is essential to how we experience and navigate the world. In virtual reality (VR), haptic technologies are unlocking new dimensions of sensory engagement, from emotional resonance to crossmodal integration with temperature, sound, and vision.


Multisensory Integration: The Role of Touch and Temperature

Touch and temperature are deeply intertwined in our perception of the world. Studies on the hue-heat hypothesis, for instance, show how color can influence temperature perception: blue hues can make hot objects feel cooler, while red hues can intensify the sensation of cold. These crossmodal interactions highlight the importance of synchronizing sensory inputs for a coherent and meaningful experience. In VR, combining haptics with temperature modulation can create more immersive and realistic interactions. For example, a VR system could use haptic feedback and visual cues to replicate the warmth of a sunny beach or the chill of a snowstorm, enhancing the user’s sense of presence.


Haptics in Emotional and Interpersonal Experiences

Touch isn’t just functional—it’s deeply emotional. Haptic sensations in VR can evoke feelings of comfort, fear, or excitement, depending on how they are designed. Research on the cutaneous rabbit illusion, where participants feel “hops” on their arm, shows how tactile feedback can influence emotions like arousal and valence.


Wearable haptic systems, such as gloves, smart clothing, and vests, are being developed to provide tactile feedback that carries emotional meaning. These devices can simulate caresses, tickling sensations, or even the comforting pressure of a hug, paving the way for emotionally expressive communication in virtual and augmented realities.


However, existing haptic stimuli often lack the ability to fully capture the emotional nuances of real-world touch. To unlock the full potential of haptics, researchers should design stimuli that evoke emotions, identify socially acceptable touchpoints, and improve the integration of tactile feedback into eXtended Reality (XR) systems. These advancements could transform how people connect and communicate, especially in mediated or virtual environments.


Applications Across Fields

Haptics is already making waves across diverse fields:

  • Healthcare and Rehabilitation: Haptic feedback in VR has been instrumental in neurorehabilitation for individuals with upper limb paralysis. Devices like robotic exoskeletons and haptic gloves provide tactile stimulation during therapy, promoting motor and sensory recovery while engaging patients in interactive exercises. These technologies not only improve physical outcomes but also enhance patient motivation by integrating gamified elements into therapy. Mid-air haptics has similarly been used to reduce anxiety during medical procedures, demonstrating the versatility of haptic technology in healthcare.
  • Art and Immersion: In artistic VR installations, passive haptics—like vibrations underfoot when “walking” on virtual paintings—can be paired with temperature shifts to simulate the feel of stepping on different materials.
  • Accessibility: For individuals with sensory challenges, haptics can provide more nuanced and informative feedback, bridging gaps in sensory perception.

These applications demonstrate how haptics can enrich both functional and creative experiences.


Future Challenges and Opportunities

As promising as haptic technology is, challenges remain. Designing devices that seamlessly integrate touch feedback is technically complex. Moreover, creating socially acceptable and emotionally expressive tactile stimuli requires careful consideration of cultural and personal differences. Future research will likely explore these intersections, advancing haptic systems that are not only precise and realistic but also adaptable and inclusive.


Conclusion

Haptics is at the frontier of sensory innovation, transforming virtual reality into a multisensory experience that engages touch, vision, audition, and emotion. By harnessing these technologies, we can create inclusive, immersive environments that redefine how we interact with both the virtual and physical worlds.


As we move forward, the integration of haptics in neurorehabilitation, art, and accessibility offers exciting possibilities—not just for technology, but for human connection and understanding.


See our blog for Activities; especially 55-57.


Some suggestions for further listening and watching

Emergence Gallery: Virtual Walking

Haptic gloves help blind people ‘see’ art

Is That my Real Hand?

Smart Clothing

The Predictive Perception of Dynamic Vibrotactile Stimuli Applied to the Fingertip

The VR Dilemma: How AR and VR redefine our reality

Understanding Affective Touch for Better VR Experience

Virtual reality: how technology can help amputees

Virtual Reality Used To Treat Mental Health Problems


And reading

Haptics for Human-Computer Interaction: From the Skin to the Brain

Interpersonal Haptic Communication: Review and Directions for the Future

The Effect of Multimodal Virtual Reality Experience on the Emotional Responses Related to Injections

Walking on Paintings: Assessment of passive haptic feedback to enhance the immersive experience

What the Mind Can Comprehend from a Single Touch