This video documents part of our development process, team meetings, how we set up the exhibition for Presentation Day, and interviews and feedback from players.
(This video was co-edited by Ruxin, Linteng, Roger, Zhaoyi)
Yunjia Chen and I were in charge of the sound design. After consulting with the room designers, we drew up the following table.
For the core sounds, we wanted a sense of unreality to convey the experience of Alzheimer’s disease: for the Sonic Detector and Radar, for example, we used synth patches and reversed playback. For each room, we discussed the sound design with that room’s designer and split the work into ambient sound and sound effects.
For the UI sounds, we continued the non-realistic approach and created different sound designs within the same style.
For the locator sound used to find a room’s position, we wanted something that can loop without becoming annoying and that is clearly separated in frequency from the music and ambient sound.
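A minimal Unity C# sketch of such a looped, band-limited locator cue is shown below; the clip, volume, and high-pass cutoff are placeholder assumptions rather than our actual settings.

```csharp
using UnityEngine;

// Minimal sketch: plays the locator cue on a loop and band-limits it
// so it stays clear of the music and ambient beds.
[RequireComponent(typeof(AudioSource))]
public class LocatorLoop : MonoBehaviour
{
    [SerializeField] private AudioClip locatorClip;          // placeholder clip
    [SerializeField] private float volume = 0.4f;            // kept low so it never gets annoying
    [SerializeField] private float highPassCutoff = 1500f;   // assumed cutoff above the ambient bed

    private void Start()
    {
        var source = GetComponent<AudioSource>();
        source.clip = locatorClip;
        source.loop = true;          // seamless loop
        source.volume = volume;
        source.Play();

        // Push the cue into a higher band than the music and ambience.
        var highPass = gameObject.AddComponent<AudioHighPassFilter>();
        highPass.cutoffFrequency = highPassCutoff;
    }
}
```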
You can audition some of the sound designs:
Sonic Detector
Picture Appears/Disappears
Highlight Rubik’s Cube
Each room’s sound is built from ambient sound and sound effects, and we recorded footsteps on different surface materials (a rough sketch of how these recordings can be selected in-game follows the sound examples below). Some rooms received special treatment, such as the Transfer Door.
Transfer Door
Some items also have more detailed sounds:
Microscope
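As a hedged sketch of how the footstep recordings for different materials could be chosen at runtime in Unity, the snippet below raycasts at the surface under the player and picks a clip by tag; the tag names and the animation hook are assumptions, not our actual setup.

```csharp
using UnityEngine;

// Minimal sketch: picks a footstep clip based on the surface tag under the player.
// Tag names ("Wood", "Carpet") and the trigger call are assumptions for illustration.
[RequireComponent(typeof(AudioSource))]
public class FootstepMaterial : MonoBehaviour
{
    [SerializeField] private AudioClip woodSteps;
    [SerializeField] private AudioClip carpetSteps;
    [SerializeField] private AudioClip defaultSteps;

    private AudioSource source;

    private void Awake() => source = GetComponent<AudioSource>();

    // Assumed hook: called from the walk animation or a movement script.
    public void PlayFootstep()
    {
        AudioClip clip = defaultSteps;
        if (Physics.Raycast(transform.position, Vector3.down, out RaycastHit hit, 2f))
        {
            if (hit.collider.CompareTag("Wood"))   clip = woodSteps;
            if (hit.collider.CompareTag("Carpet")) clip = carpetSteps;
        }
        source.PlayOneShot(clip);
    }
}
```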
Originally, we planned to switch the classroom’s ambience depending on which item the player had just collected, and produced three variants: exam, after class (noisy), and after school (with added outdoor sound). In the final gameplay, however, we settled on the after-school ambience alone, which gave players a smoother experience.
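Had we kept the item-based switching, a crossfade between the ambience variants could have been driven roughly like the Unity C# sketch below; the fade length and the hook that triggers the switch are assumptions for illustration.

```csharp
using System.Collections;
using UnityEngine;

// Minimal sketch of crossfading between two looping ambience tracks
// (e.g. "after class" -> "after school") when an item is collected.
public class AmbienceSwitcher : MonoBehaviour
{
    [SerializeField] private AudioSource current;   // currently playing ambience
    [SerializeField] private AudioSource next;      // ambience to fade in
    [SerializeField] private float fadeSeconds = 3f;

    // Assumed hook: call this when the player picks up an item.
    public void SwitchAmbience()
    {
        StartCoroutine(Crossfade());
    }

    private IEnumerator Crossfade()
    {
        next.volume = 0f;
        next.loop = true;
        next.Play();

        for (float t = 0f; t < fadeSeconds; t += Time.deltaTime)
        {
            float k = t / fadeSeconds;
            current.volume = 1f - k;
            next.volume = k;
            yield return null;
        }

        current.Stop();
        (current, next) = (next, current); // ready for the next switch
    }
}
```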
We also produced a special sound for the dark rooms:
The dark room collapses after being triggered
For more details, refer to the Sound Design page.
Narrative
“I” seemed to have arrived in a non-Euclidean labyrinth and couldn’t remember why I was there. I switched on the radio in the room and followed its instructions to complete the first room. At that point multiple doors opened, and different choices took me back to different stages of my memory. I needed to explore each room in the labyrinth to retrieve the pieces belonging to each stage of my life and restore my complete memory.
1. Once the player has completed a room’s task, could we add subtitles or narration to tie Erikson’s stages of psychosocial development more closely to that room’s content?
With what we learned in the workshop, we can achieve some interesting effects with Max and apply them to our projections.
1. Overlay the player’s real-time gameplay footage with a gradually changing puzzle video, giving viewers an experience distinct from the player’s.
Implementation: capture real-time game footage via OBS.
Inspired by https://www.adceurope.org/awards/annual/remember-me_729
2. The projection plays our recorded gameplay video and applies effects driven by the player’s or viewers’ movements and sounds, so the projected content feels like a parallel world to the player’s gameplay world. One example is RGB video effects: https://youtu.be/sO5NaTjBvL8
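We plan to build these effects in Max, but as a hedged illustration of the second idea, the Unity C# sketch below samples the room microphone’s loudness and uses it to drive a shader parameter on the projected video; the property name "_Intensity" and the scaling factor are assumptions.

```csharp
using UnityEngine;

// Minimal sketch: drives a video effect from microphone loudness,
// so the projection reacts to sounds made by the player or viewers.
public class SoundReactiveEffect : MonoBehaviour
{
    [SerializeField] private Material projectionMaterial;             // material on the projection surface
    [SerializeField] private string intensityProperty = "_Intensity"; // assumed shader property name

    private AudioClip micClip;
    private readonly float[] samples = new float[256];

    private void Start()
    {
        // Record from the default microphone into a one-second looping clip.
        micClip = Microphone.Start(null, true, 1, 44100);
    }

    private void Update()
    {
        // Read the most recent samples behind the microphone's write position.
        int pos = Microphone.GetPosition(null) - samples.Length;
        if (pos < 0) return;
        micClip.GetData(samples, pos);

        // Loudness as RMS, scaled into the shader parameter (scale is a placeholder).
        float sum = 0f;
        foreach (float s in samples) sum += s * s;
        float rms = Mathf.Sqrt(sum / samples.Length);
        projectionMaterial.SetFloat(intensityProperty, rms * 10f);
    }
}
```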
With Steam Audio, we will be able to integrate environment simulation and listener simulation in a single system.
Integrating Spatial Audio in Unity: https://www.audiokinetic.com/en/library/edge/?source=Unity&id=pg_spatialaudio.html
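As a rough sketch of the Unity side of spatial audio, the snippet below marks an AudioSource as spatialized so that whichever spatializer plugin is selected under Project Settings > Audio (for example Steam Audio) can apply HRTF, occlusion, and environmental effects; the plugin-specific per-source components are not shown because their names depend on the plugin version.

```csharp
using UnityEngine;

// Minimal sketch: marks a source for 3D spatialization so the spatializer plugin
// chosen in Project Settings > Audio (e.g. Steam Audio) can process it.
[RequireComponent(typeof(AudioSource))]
public class SpatializedSource : MonoBehaviour
{
    [SerializeField] private AudioClip clip; // the sound to spatialize (placeholder)

    private void Start()
    {
        var source = GetComponent<AudioSource>();
        source.clip = clip;
        source.spatialBlend = 1f;   // fully 3D: position drives panning and attenuation
        source.spatialize = true;   // hand the source to the active spatializer plugin
        source.Play();
    }
}
```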