On the intriguing association between sounds and colours

Three main types of crossmodal correspondences between the senses seem to exist: transfer of information, shared associations, and subjective associations (see our blog for the crossmodal correspondences between the senses).


In this blog post, I have invited researcher Nicola Di Stefano of the Institute of Cognitive Sciences and Technologies, National Research Council of Italy, to explain the subjective associations between music and colour. Nicola Di Stefano has contributed numerous publications on both the philosophy and psychology of perception and the aesthetics and psychology of music.


Sounds and colours are two distinct sensory experiences that convey different information about the environment we inhabit. While we typically attribute a colour to every object we perceive, we wouldn't assert that each object possesses, or is inherently associated with, a particular sound. Of course, musical instruments produce sounds, and various objects can emit sounds, like hammers, rocks, and sticks, but sound seems to be an ontologically different, that is, less foundational, feature of objects than colour.


Interestingly, however, intellectuals, researchers, artists, and composers have long been fascinated by the association between these two seemingly radically different sensory experiences. This idiosyncratic association is evident in sound-colour synaesthesia, one of the most prevalent forms of synaesthesia, a rare neurological phenomenon in which stimulation of one sensory (or cognitive) pathway leads to automatic, involuntary experiences in another1. This cross-wiring allows individuals with synaesthesia to experience a unique blending of sensations, such as seeing colours in response to musical notes or chords. Sound-colour synaesthesia has inspired several artworks, including Kandinsky's musical paintings and Scriabin's Prometheus, a composition based on the combination of coloured lights and music.

Coloured lights on a board in Scriabin's house

Photo retrieved from: Scriabin and the Possible


Psychologists have also explored the mechanisms underlying the consistent association between certain features of sounds and colours in non-synaesthetes. The concept of “crossmodal correspondence” suggests that certain sensory attributes share underlying perceptual or cognitive processes, leading to associations between them2. For instance, studies have revealed that people tend to associate high-pitched sounds with light or bright colours, while low-pitched sounds are often linked to dark colours3. These associations may arise from shared perceptual features, such as the frequency or intensity of auditory and visual stimuli.
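
As a concrete illustration, here is how such a pitch-to-brightness mapping might be written down in code, for instance when designing stimuli or a simple sonification. It is a toy sketch of my own: the frequency range and the logarithmic scaling are illustrative choices, not parameters taken from the studies cited above.

```python
import math

def pitch_to_lightness(freq_hz, f_low=100.0, f_high=5000.0):
    """Map a tone's frequency to a greyscale lightness in [0, 1].

    Pitch is treated on a logarithmic scale (roughly as the ear does),
    so each octave step adds the same amount of lightness: tones near
    f_low come out dark, tones near f_high come out light, mirroring
    the tendency to pair high pitch with bright colours.
    """
    position = math.log2(freq_hz / f_low) / math.log2(f_high / f_low)
    return min(1.0, max(0.0, position))

for f in (110, 440, 1760):   # A2, A4, A6
    print(f, "Hz ->", round(pitch_to_lightness(f), 2))
```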


One of the most intuitive ways to explain sound-colour correspondences is the psychophysical one, which notes that both sounds and colours are vibratory phenomena. However, the sensory systems that process the two signals are quite different, making it challenging to establish a link between sounds and colours based solely on this alleged psychophysical similarity. Additionally, an important distinction lies in octave similarity in music: sounds whose frequencies are related by a factor of two (i.e., doublings of the same fundamental frequency) are heard as the same pitch class (e.g., "D"), whereas in the domain of colour there is no equivalent octave-like repetition.
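
To see octave similarity in numbers, here is a minimal Python sketch that maps a frequency to its equal-tempered pitch class; the 440 Hz reference for A4 is a standard tuning convention rather than anything specific to this post. Frequencies an octave apart land on the same class, while nothing analogous happens for the wavelengths of colours.

```python
import math

PITCH_CLASSES = ["C", "C#", "D", "D#", "E", "F",
                 "F#", "G", "G#", "A", "A#", "B"]

def pitch_class(freq_hz: float, a4_hz: float = 440.0) -> str:
    """Return the equal-tempered pitch class of a frequency.

    Frequencies related by a factor of two (octaves) land on the same
    pitch class, because only the position within the octave matters.
    """
    # Number of semitones above A4 (may be negative or fractional).
    semitones = 12 * math.log2(freq_hz / a4_hz)
    # A4 is pitch class "A" (index 9); wrap into a single octave.
    index = round(semitones + 9) % 12
    return PITCH_CLASSES[index]

# D3, D4 and D5 differ in frequency but all come out as "D".
for f in (146.83, 293.66, 587.33):
    print(f, "->", pitch_class(f))
```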


Furthermore, philosophers grapple with the metaphysical implications of the interplay between music and colour. Music, often described as the "language of the emotions", elicits powerful affective responses in listeners, shaping their emotional landscapes4-5. Similarly, colour possesses symbolic and emotional resonance, evoking mood and atmosphere in visual art and design. The intentional combination of music and colour in multimedia art forms, such as film and digital media, underscores the transformative potential of blending sensory modalities to create immersive experiences.


Whether through the lens of synaesthesia, crossmodal correspondence, or aesthetic inquiry, the convergence of music and colour illuminates the intricate interplay between sensory perception, cognition, and emotion. By unravelling the mysteries of this symbiotic relationship, researchers, artists, and practitioners aim to gain deeper insights into the nature of human experience and the profound ways in which art shapes our understanding of the world.


See our blog for Activities; especially 25-27.


Some suggestions for further listening and watching:

Artists use synesthesia to expand their creative limits

Elements of Music

Introduction to Color

Is Your Red The Same as My Red?

Light Organ (Clavière a lumiére) – Scriabin op 65 no 2

Seeing Sound: How Synesthesia Can Change Our Thinking

Seeing song through the ears of a synesthete

Synesthesia & creating your own score

 

_______________

1Ramachandran, V. S., & Hubbard, E. M. (2001). Synaesthesia–a window into perception, thought and language. Journal of Consciousness Studies, 8(12), 3-34.

2Spence, C. (2011). Crossmodal correspondences: A tutorial review. Attention, Perception, & Psychophysics, 73(4), 971-995.

3Spence, C., & Di Stefano, N. (2022). Coloured hearing, colour music, colour organs, and the search for perceptually meaningful correspondences between colour and sound. i-Perception, 13(3), https://doi.org/10.1177/20416695221092802

4Cooke, D. (1959). The language of music. London: Oxford University Press.

5Juslin, P. N., & Sloboda, J. (2011). Handbook of music and emotion: Theory, research, applications. Oxford: Oxford University Press.

Crossmodal brain plasticity and empowering of sensory abilities

Research on crossmodal brain plasticity has not only found that the brain compensates for sensory impairments, for example that people who are born blind process auditory information in both the auditory and the visual areas, not just the auditory areas as the fully sighted do (see our blog for the scientific approach). It has also shown that the brain adapts to artificial input, whether from implants that restore the impaired sense or from computer algorithms that translate information from one sense to another. However, while people automatically recognise information processed through their natural crossmodal correspondences (see our blog for the crossmodal correspondences between the senses), they have to learn to interpret the sensations produced both by brain implants for hearing and by sensory substitution devices that translate vision into hearing (see our blog for A Feel for Art).


I have invited Carina Sabourin, Yaser Merrikhi, and Stephen G. Lomber of the Cerebral Systems Laboratory, McGill University, to write this blog post about crossmodal brain plasticity and the empowering of sensory abilities. They investigate cortical plasticity in the auditory and visual cortices following hearing loss and the initiation of hearing with cochlear prosthetics. Recently, in a comprehensive review, they addressed the question "Do the blind hear better?".


The idea that blind people can compensate for their lack of vision with enhanced hearing or other abilities has been around for millennia. Many of the most acclaimed artists, from the 8th-century BC Greek poet Homer to the great jazz musician Stevie Wonder, lost their vision. Recently, researchers have investigated these anecdotes and confirmed that there is now overwhelming evidence that the blind have specific super hearing abilities compared to the sighted1. More excitingly, the brains of blind individuals recruit neural areas that typically handle vision, alongside hearing brain areas, to process auditory information. The ability of typically visual areas to adapt to auditory input is called crossmodal plasticity. The extra brain power that crossmodal plasticity provides gives blind individuals their superhuman hearing abilities.


Crossmodal plasticity can occur for other senses beyond hearing too. Deaf individuals recruit hearing brain areas to improve their vision2. Beyond other senses taking advantage of the freed-up brain power, brain plasticity can also help the brain adapt to artificial input from implants that restore the lost sense. One example is cochlear implants, which bypass the inner ear and directly stimulate the auditory nerve, giving people with certain types of hearing loss access to sound. Crossmodal plasticity is thought to help visual and hearing brain areas work together to better process speech: the more teamwork between visual and hearing brain areas, the better cochlear implant users can understand speech3. Similarly, researchers and engineers developing tools for blind people can leverage brain plasticity as well as the specific super hearing abilities of the blind.
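
To give a rough sense of the processing a cochlear implant performs before stimulating the nerve, here is a toy Python sketch of a simplified, implant-style front end. The number of electrodes, band edges, and filter settings are arbitrary illustrative choices; a real processor would additionally compress the envelopes and convert them into timed current pulses on the electrode array.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def toy_implant_channels(audio, sample_rate, n_electrodes=8,
                         f_min=200.0, f_max=7000.0):
    """Very rough sketch of a cochlear-implant-style front end.

    The signal is split into a few frequency bands (one per electrode)
    and the slowly varying loudness envelope of each band is extracted.
    This sketch stops at the envelopes.
    """
    edges = np.geomspace(f_min, f_max, n_electrodes + 1)
    env_lowpass = butter(2, 300.0, btype="lowpass",
                         fs=sample_rate, output="sos")
    envelopes = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        band_sos = butter(4, [lo, hi], btype="bandpass",
                          fs=sample_rate, output="sos")
        band = sosfiltfilt(band_sos, audio)
        envelopes.append(sosfiltfilt(env_lowpass, np.abs(band)))
    return np.stack(envelopes)  # shape: (n_electrodes, n_samples)

# Example: a 440 Hz tone mainly drives the low-frequency electrodes.
sr = 16000
t = np.arange(sr) / sr
channels = toy_implant_channels(np.sin(2 * np.pi * 440 * t), sr)
```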


One such attempt is sensory substitution devices (SSDs), which translate information from one sensory modality into another. Visual-to-auditory SSDs convert visual scenes captured by a camera into soundscapes. These devices exploit the improved pitch discrimination4-5 and sound localization abilities6-8 of blind people to convey information about visual environments through the frequency and movement of sounds. SSDs can even use the available brain space in the visual cortex. The part of the visual cortex that recognizes human bodies and tracks their movement was recruited to localize body movement conveyed by an SSD9. The visual reading brain area was even activated by SSDs to enable blind individuals to read with sounds10. And even years after getting their vision back, the visual cortex of individuals whose sight was restored through gene therapy was still helping hearing brain areas process sounds11. Some concern exists that crossmodal plasticity may hinder sight restoration from visual brain implants. However, brain plasticity may help the visual and hearing brain areas work together to improve vision outcomes for visual brain implant users, just as it improved the ability of cochlear implant users to understand speech3.
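
To give a flavour of how such a device might turn an image into a soundscape, here is a toy Python sketch of one common style of mapping (similar in spirit to devices such as The vOICe): the image is scanned left to right, vertical position sets the pitch of a tone, and pixel brightness sets that tone's loudness. All specific numbers are illustrative choices, not parameters of any actual device.

```python
import numpy as np

def image_to_soundscape(image, duration_s=1.0, sample_rate=22050,
                        f_min=200.0, f_max=8000.0):
    """Toy visual-to-auditory sensory substitution.

    `image` is a 2D array of brightness values in [0, 1], row 0 at the
    top. Columns are scanned left to right over `duration_s`; each row
    is assigned a sine tone whose frequency rises with vertical
    position, and pixel brightness sets that tone's loudness.
    """
    n_rows, n_cols = image.shape
    samples_per_col = int(duration_s * sample_rate / n_cols)
    t = np.arange(samples_per_col) / sample_rate
    freqs = np.geomspace(f_max, f_min, n_rows)      # top row -> highest pitch
    tones = np.sin(2 * np.pi * freqs[:, None] * t)  # (n_rows, samples_per_col)
    # Each column becomes a short mixture of tones weighted by brightness.
    sound = np.concatenate([image[:, c] @ tones for c in range(n_cols)])
    return sound / (np.abs(sound).max() + 1e-9)     # normalise to [-1, 1]

# A bright diagonal line becomes a pitch sweep over time.
waveform = image_to_soundscape(np.eye(32))
```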


The additional brain power provided by crossmodal plasticity gives blind individuals their extraordinary hearing abilities. Researchers and engineers creating tools for the blind can leverage both brain plasticity and these remarkable auditory skills to improve how blind individuals navigate and interact with the world around them.


See our blog for Activities; especially 22-24.


Some suggestions for further listening and watching:

Healing the brain via multisensory technologies and using these to better understand the brain

Losing and recovering sight

Neuroplasticity Animation

The Brain

What Does Blindness or Deafness Tell Us About Brain Development?

What is the function of auditory cortex when it develops in the absence of acoustic input?

 

_______________

1Sabourin, C. J., Merrikhi, Y., & Lomber, S. G. (2022). Do blind people hear better? Trends in Cognitive Sciences, 26(11), 999-1012. https://doi.org/10.1016/j.tics.2022.08.016

2Bavelier, D., Dye, M. W. G., & Hauser, P. C. (2006). Do deaf individuals see better? Trends in Cognitive Sciences, 10(11), 512-518. https://doi.org/10.1016/j.tics.2006.09.006

3Anderson, C. A., Wiggins, I. M., Kitterick, P. T., & Hartley, D. E. H. (2017). Adaptive benefit of cross-modal plasticity following cochlear implantation in deaf adults. Proceedings of the National Academy of Sciences of the United States of America, 114(38), 10256-10261. https://doi.org/10.1073/pnas.1704785114

4Collignon, O., Dormal, G., Albouy, G., Vandewalle, G., Voss, P., Phillips, C., & Lepore, F. (2013). Impact of blindness onset on the functional organization and the connectivity of the occipital cortex. Brain, 136(9), 2769-2783. https://doi.org/10.1093/brain/awt176

5Rokem, A., & Ahissar, M. (2009). Interactions of cognitive and auditory abilities in congenitally blind individuals. Neuropsychologia, 47(3), 843-848. https://doi.org/10.1016/j.neuropsychologia.2008.12.017

6Chen, Q., Zhang, M., & Zhou, X. (2006). Spatial and nonspatial peripheral auditory processing in congenitally blind people. NeuroReport, 17(13), 1449-1452. https://doi.org/10.1097/01.wnr.0000233103.51149.52

7Lewald, J. (2013). Exceptional ability of blind humans to hear sound motion: Implications for the emergence of auditory space. Neuropsychologia, 51(1), 181-186. https://doi.org/10.1016/j.neuropsychologia.2012.11.017

8Röder, B., Teder-Sälejärvi, W., Sterr, A., Rösler, F., Hillyard, S. A., & Neville, H. J. (1999). Improved auditory spatial tuning in blind humans. Nature, 400(6740), 163-166.

9Striem-Amit, E., & Amedi, A. (2014). Visual Cortex Extrastriate Body-Selective Area Activation in Congenitally Blind People “Seeing” by Using Sounds. Current Biology, 24(6), 687-692. https://doi.org/10.1016/j.cub.2014.02.010

10Striem-Amit, E., Cohen, L., Dehaene, S., & Amedi, A. (2012). Reading with Sounds: Sensory Substitution Selectively Activates the Visual Word Form Area in the Blind. Neuron, 76(3), 640-652. https://doi.org/10.1016/j.neuron.2012.08.026

11Mowad, T. G., Willett, A. E., Mahmoudian, M., Lipin, M., Heinecke, A., Maguire, A. M., Bennett, J., & Ashtari, M. (2020). Compensatory Cross-Modal Plasticity Persists After Sight Restoration. Frontiers in Neuroscience, 14, 291. https://doi.org/10.3389/fnins.2020.00291