During these two weeks, with Leo’s guidance and Lidia’s help, I completed all of the “Anger” and “Bargaining” work and the ambient work for “False Acceptance” in Wwise, and created Blend Container events to control these ambiences gradually. I also created triggers in Unity to test these sounds. After testing, they all run correctly, which is good preparation for the next stage of assembly work!
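As a rough sketch of the kind of test trigger I mean, the script below posts a Wwise event when the player walks into a zone; the event name Play_Anger_Ambient, the Player tag, and the specific AkSoundEngine calls are placeholder assumptions for illustration, not the project’s exact setup.

```csharp
using UnityEngine;

// Hypothetical test trigger: posts a Wwise event when the player enters the zone
// and stops it again on exit. Event name and tag are placeholders.
public class AmbientTestTrigger : MonoBehaviour
{
    public string wwiseEventName = "Play_Anger_Ambient";

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            // Post the event on this GameObject so Wwise positions the sound here.
            AkSoundEngine.PostEvent(wwiseEventName, gameObject);
        }
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            // Stop everything playing on this GameObject when the player leaves.
            AkSoundEngine.StopAll(gameObject);
        }
    }
}
```

Attached to a collider marked as a trigger, a script like this lets each zone start and stop its own ambience during testing.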
Leo gave me a very useful suggestion for the Anger ambience: nesting a second Blend Container inside the main one to control my two Anger ambiences. This suggestion made the ambient effect even stronger.
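Since a Blend Container crossfades its children according to a game parameter (RTPC), the nested container can be driven from Unity with a single value. A minimal sketch, assuming a hypothetical RTPC named Anger_Intensity mapped to the blend track (not the project’s actual parameter name):

```csharp
using UnityEngine;

// Hypothetical controller: drives the crossfade between the two Anger ambiences
// by updating an RTPC mapped to the nested Blend Container's blend track.
public class AngerBlendController : MonoBehaviour
{
    [Range(0f, 100f)]
    public float angerIntensity = 0f; // placeholder range; mapping depends on the blend track setup

    private void Update()
    {
        // The Blend Container interpolates between the two ambiences as this value changes.
        AkSoundEngine.SetRTPCValue("Anger_Intensity", angerIntensity, gameObject);
    }
}
```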
This week I created a test model in Unity and am preparing to assemble Lidia’s Wwise project files and test them. Next week we will conduct the first test of the entire project in the Atrium.
In the fifth and sixth weeks, my main responsibilities were recording sound, carrying out the second round of sound design, and building the Wwise project.
Of the five stages in the Wwise project, my work at this stage was to complete all the ambient and SFX work for “Anger” and “Bargaining”. The ambient construction for “False Acceptance” is expected to be completed in weeks 7-8; the SFX for that part is produced by Lidia.
1.Anger Stage
Following Chunyu’s video, I first used the original audio files I had recorded to produce the ambient sound in AU, then made the sound effects, and added both to Wwise.
For the Anger ambience, I layered a warning sound with the metal-tearing sound recorded with an electromagnetic microphone, and applied pitch shifting, a low-pass filter, and a small amount of reverb to create a feeling of unease and anger.
In the sound effect part, I chose some high-frequency sounds to enhance the feeling of anger.
2.Bargaining Stage
At this stage, I mixed the rain recorded for Submission 1 with the sound of knocking on the piano lid to make the ambient sound, then normalised it and added some effects.
For the sound design, I combined most of the sound effects in Reaper to create a feeling of “bargaining”.
Sound not only shapes emotions but also plays a crucial role in guiding immersive experiences. This project, centered on the concept of “Denial,” explores how sound represents the five stages of negative emotions and enhances emotional resonance through interactive installations. To establish a solid theoretical foundation for sound design, I conducted extensive research on relevant literature and case studies, analyzing how sound influences emotions, the key elements of immersive sound design, and the integration of interactive technology. Through the fusion of sound and visuals, this project aims to create a profound emotional journey for the audience.
2.Theoretical Background
2-1 Denial
Our project is based on the concept of “Denial,” commonly associated with Kübler-Ross’s (1969) Five Stages of Grief, introduced in On Death and Dying. This model describes the emotional responses individuals experience when facing death or major life changes, progressing through denial, anger, bargaining, depression, and acceptance. Denial is more than just temporary avoidance; it serves as a psychological defense mechanism that helps individuals cope with overwhelming emotional shock.
Stage | Overview | Emotional Manifestation
Denial | Rejecting reality, avoiding painful truths. | Numbness, indifference, refusal to acknowledge or face reality.
Anger | Feeling frustration and resentment when unable to escape reality. | Hostility, irritability, possible aggressive behavior.
Bargaining | Attempting to reduce pain through negotiation or compromise. | Self-consolation, hoping to change the outcome through “deals.”
Depression | Recognizing the unchangeable reality, leading to sadness and helplessness. | Loneliness, despair, loss of motivation and interest.
Acceptance | Ultimately accepting reality and facing the future with a calm mindset. | Inner peace, relief, gradually adapting to change.
These five stages describe the emotional changes individuals experience when facing significant loss or trauma. Psychologist Anna Freud (1936) identified denial as a primitive yet common defense mechanism that allows individuals to temporarily escape reality when emotions become overwhelming, helping to reduce psychological stress. When a person struggles to accept inevitable loss or change, denial may persist, manifesting as avoidance of facts or self-soothing to maintain a false sense of reality. Immersive sound installations can enhance this emotional experience, allowing individuals to more intuitively perceive and explore denial and its psychological impact.
2-2 Sound
Huang and Wu (2007) found a strong correlation between music selection and emotional responses. The impact of sound on emotions is a multidimensional process influenced by key factors such as pitch, intensity, rhythm, and sharpness. Research indicates that high-frequency noise and sharp sounds can trigger stress responses, whereas low-frequency vibrations may induce a sense of calmness or suppression (HEAD acoustics, n.d.). Additionally, fast and irregular rhythms are often linked to anxiety, while dissonant intervals—such as minor seconds, augmented fourths/diminished fifths, and major sevenths—stimulate the amygdala, a brain region responsible for processing emotions, particularly fear and distress (Pankovski, 2023). Musical scales also play a crucial role in emotional expression; major scales are generally associated with positive emotions, whereas minor scales tend to evoke negative feelings. Moreover, slow-tempo music is often linked to sadness (Sun, Liu, & Nan, 2009).
In terms of sound design, different frequencies elicit distinct emotional experiences. Low-frequency sounds (20 Hz to 250 Hz) can create physical resonance, which in turn provokes anxiety and fear. For instance, the low-frequency vibrations of thunder and earthquakes are commonly associated with danger and threat, intensifying feelings of unease. In contrast, high-frequency sounds (2,000 Hz to 20,000 Hz) are highly stimulating, heightening alertness and inducing emotions such as anger and anxiety (Wemore8888, n.d.). Additionally, high volume, fast-paced, and irregular rhythms can further amplify tension or aggression, making sound a powerful tool in shaping psychological experiences.
These findings highlight the critical role of sound characteristics in emotional modulation. Variations in pitch, musical scale, rhythm, and frequency can direct and enhance different psychological experiences, effectively influencing the emotional state of an audience in immersive environments.
In The Last of Us, the protagonist, Joel, experiences deep emotional trauma after losing his daughter. The game uses low, slow-paced music and environmental sound effects to create an atmosphere of oppression and solitude.
Composed in 2001, The Disintegration Loops by William Basinski was created when he attempted to digitize old tape loops and discovered that they gradually deteriorated due to aging. He recorded this process, capturing a sense of slow decay and irreversible tragedy, while simultaneously evoking a meditative tranquility and transcendence.
Created by Mexican artist Rafael Lozano-Hemmer, Voice Tunnel is a large-scale interactive installation that allows participants to engage through sound. As visitors vocalize into the central sound system of the tunnel, the installation responds in real-time by adjusting the brightness and flickering patterns of lights based on the intensity and characteristics of the sound. This interplay between sound and light creates a unique and immersive sensory experience.