https://1drv.ms/b/c/4d1197ded0e122fd/ETkNzwZ1DutArfkXHp7XRgYBhmRvRdpcUKo5wQ-xY5iEKw?e=zYbGSq
Video Script: Project Documentary Concept
https://1drv.ms/b/c/4d1197ded0e122fd/ETHz8oBW_61HkvlHX_L8xycBoOlgVl_Nt2eWzlaeZNgwMQ?e=3fpSN0
Final Project
Final video:
Documentary video report:
Resources:
-
-
Visuals:
Touchdesigner + Video development:
Physical model development:
Max Jitter:
-
-
-
Arduino and Data Calculation:
-
Music:
-
-
-
Sound design:
Universe:
Air:
Fire:
Metal:
Video Sound Design (Part):
Final Interaction:
-
Touchdesigner video instructions – Yanis
Based on our concept, and with the help of Caitlin and Pradyumna, I created a Touchdesigner visual based on a YouTube tutorial (https://www.youtube.com/watch?v=dzYyrvyY-zc&t=1687s) and adjusted the effects and colours to suit our project. We tried to substitute in the formulas the team had developed for use in Touchdesigner, but it didn't work out in the end. However, the attractor used in this visual matched the concept of our project so well that we decided to keep the visual and make an ambient video to place in the room.


Final presentation video
2025.4.20-4.23
We started the video editing work towards the end of the project. We divided the video into three parts to present our project:
The first part introduces the project theme.
The second part introduces the installation, from the basic design logic to the technical principles used in each part.
The third part is a detailed look at the three aspects of sound design, visual design, and music design, and describes how they influence one another.

On-site shooting: Fulong Yang & Shu Zhang
Post-production editing: Lulu Huang & Fulong Yang
Music: Fresar Macd
Preparation for the final video
2025.4.3-2025.4.5
After the meeting, we finally decided to submit the project in the form of a video. I was responsible for the production process, the on-site shooting, and the post-production editing.
Preliminary preparation: recording from the final stage of preparation and rehearsal, documenting the progress and process of each rehearsal.
On-site shooting: the photography equipment was an SLR camera and a mobile phone, together with a recorder and microphone. The SLR camera was used to capture close-ups of visitors using the installation on site and interviews with them. The mobile phone was set in a fixed position for time-lapse photography to record the overall scene, from the layout of the site to the start of the event.
Photos from the rehearsal:




Summary Notes 4: Reflections and References
Summary Notes 3: Visual Projections, Soundscapes, and Music Generation
Summary Notes 2: Design Process and Data Flow
Design Process
We decided on using the Atrium, as it had a projector and a metal rig that black drapes could be hung from. Another advantage of using the Atrium was the four speakers we could connect to through the house audio system. This decision allowed us to make basic layout diagrams: the first shows a close-up of the central planet and moons; the second zooms out, showing the three drapes and two projectors; and the third features the side projectors.
Based on our research into planet types, we chose to design a gas giant, a metal orb, and a molten lava planet. Each planet would sit on top of a light fastened to a set of 3D-printed wheels. Caitlin and Yanis also built tracks to keep each planet still when it was not being moved by an audience member.
Technology: Data Flow
We want the representation of these celestial movements to go beyond just a soundscape. It should also reflect the natural gravitational pull between the Sun and the Moon. Through our interactive music system, we aim to let the audience intuitively feel the tension and fluctuations of gravitational forces, translating these cosmic interactions into an immersive auditory experience.
Our final solution was to connect each Grove ultrasonic sensor to an M5StickC Plus: the sensor's VCC, GND, and SIG pins connect to the 5V, GND, and GPIO 26 pins of the M5StickC Plus, respectively. The computer connects to each stick via the UDP protocol, and a Python script written by Lulu converts the incoming data into OSC-compliant messages, which it then transmits to the Max/MSP patch.
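A rough sketch of that bridging step is shown below. This is not Lulu's actual script: the port numbers, OSC address pattern, and the assumption that each stick sends its distance reading as a plain-text number in a single UDP datagram are all placeholders for illustration.

# Minimal UDP-to-OSC relay sketch (assumptions: ports, address pattern, payload format)
import socket
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

LISTEN_PORT = 8000                               # assumed port the sticks send to
client = SimpleUDPClient("127.0.0.1", 9000)      # assumed Max/MSP [udpreceive 9000]

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", LISTEN_PORT))

while True:
    data, addr = sock.recvfrom(1024)             # one datagram per sensor reading
    distance_cm = float(data.decode().strip())   # assumed plain-number payload
    orb_id = addr[0].split(".")[-1]              # identify each stick by its IP address
    client.send_message(f"/orb/{orb_id}/distance", distance_cm)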
Within MAX we calculate the tidal forces generated by the audience’s adjustments to the positions of the Sun and Moon. The tidal force is determined using the following formula:
Fₜ = 2GMR / d³
where Fₜ is the tidal force, G is the gravitational constant, M is the mass of the orb, R is the radius of the Earth, and d is the distance of the orb to the Earth. We map these tidal forces to various dimensions: the volume of the atmosphere for each orb increases as distance decreases, wave sounds are triggered based on the rate of change of the distance of any orb, and the resultant gravitational force and vector are calculated for the musical generation side of the MAX patch.
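As an illustration of that calculation and mapping (not the actual Max patch; the orb mass and the distance range used for the volume mapping are placeholder values), the same logic could look like this in Python:

# Hedged sketch of the tidal-force formula and a distance-to-volume mapping
G = 6.674e-11            # gravitational constant
M_ORB = 7.3e22           # placeholder mass for an orb (kg)
R_EARTH = 6.371e6        # radius of the central planet (m)

def tidal_force(d):
    """Fₜ = 2GMR / d³ for an orb at distance d (metres)."""
    return 2 * G * M_ORB * R_EARTH / d ** 3

def atmosphere_level(d, d_min=0.2, d_max=1.5):
    """Map distance to a 0-1 gain: the closer the orb, the louder its atmosphere (assumed range)."""
    d = min(max(d, d_min), d_max)
    return 1.0 - (d - d_min) / (d_max - d_min)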
Summary notes by Fraser David Macdonald.
Summary Notes: Abstract, Research & Development
Abstract
Over the past century, technologies and visualisations have been combined by artists and scientists to explain their discoveries and theories, and to educate and enhance our knowledge of the universe.
Through this journey the public have seen diagrams and drawings of the known universe, then photographs of our solar system, eventually of far planets and solar systems, and most recently, images of black holes, renders of gravitational waves, and pictures of the furthest corners of the galaxy. The image on the opening page is the galaxy LEDA 48062 in the constellation Perseus, 30 million light-years from Earth, and its neighbour galaxy UGC 8603, and the image on the contents page is part of the Eagle Nebula, 7000 light-years away. Both images are visualisations of data as much as they are photographs: infra-red and ultraviolet light is brought into the visual spectrum, and interesting features are enhanced, to further our understanding and experience of the image, transforming the way in which we perceive and evaluate our universe and role within it.
Across the creative industry, artists are looking to science, data, and mathematics to inspire their creations, working to make esoteric and complex discoveries more accessible and understandable. This project seeks to explore this boundary between the arts and academia, the personal and the universal, and to produce a piece of work illuminating the magnitude of space and exploring what lies beyond human perception. Our immersive installation, ‘The Breath of the Machine’, allows the audience to control the form of their universe, immersing them in heat, gas, and metal, with visual projections and reactive soundscapes changing with the gravitational tidal force. By engaging with the ebb and flow of tidal forces, they can experience the tension, release, and rhythmic variations driven by these celestial interactions.
Research
Our starting point for this project was perception, and as we considered the relationship between sound, time, and scale, we found ourselves naturally drawn to cosmic ideas of moon and tides, planets and gravity. Then we used these abstract ideas to imagine an immersive experience, and how we might engage with light, colour, sound, and texture to represent these ideas. As individuals we researched around these themes, before combining ideas as a group to form the scope for our final project. To explore the perception of light and darkness, we considered an installation with moving projectors that would lengthen and shorten the shadows of objects and of the audience. We also considered an interactive wall, with chaotic ripples and reactions of colour and sound. Our cosmic ideas led us to consider the moon, our tides, and how we can perceive waves and water.
Nature and nurture, combined with active factors including distraction and attention, emotion and mood, and our immediate surroundings, affect how we perceive the world around us, changing how we behave, and even the rate at which we perceive time to be passing. Author and philosopher Bruce Kriger compares the desire to understand ourselves and our biology with our exploration of the universe and cosmological theories, writing that the process of understanding how any one of these mechanisms functions deepens our understanding of the entire system, and changes our interaction with the world around us.
‘This reminds us that our understanding of the universe is not a direct reflection of reality but rather the result of a complex process of interpreting data through the prism of our sensory systems and cognitive structures.’
– Bruce Kriger: ‘The Limits of Perception’
By exploring the universe, we have changed our sense of self, placing ourselves within a solar system in a galaxy in the cosmos. The scale of the universe is so massive that we cannot perceive it in the same way as we do regular distances. We know, and can mentally visualise, the size of one centimetre or inch. The same can be said for a foot or metre, kilometre or mile; however, if one tries to picture 10,000 km or miles, it becomes much harder to visualise. When we try to imagine a universe that is 93 billion light-years wide, it feels ‘out-with’ the realms of scale: on the border between finite and infinite. It can be a humbling experience to learn more about the universe and our place within it, but it can also be a source of great awe and inspiration.
‘The reality of the universe exists prior to and independent of its being perceived, but when it is perceived, the limits of that finite mind confer an appearance on that reality.’
– Rupert Spira: ‘The Reality of the Universe Is Prior to Perception’
Contemporary Projects
We were inspired by the work of contemporary artists across a wide range of practices. ‘Can’t Help Myself’ is a very emotional work involving a robot arm attempting to clean up red paint that keeps spilling. Created by Sun Yuan and Peng Yu, this installation breaks down over time, and as the room gets messier the sense of struggle coming from the robotic system increases and increases. ‘The Clock’ by Christian Marclay is a 24-hour-long montage of thousands of film and television images of clocks, telling the time through a collage of cinematic history. The viewer experiences a vast range of narratives, settings, and moods within the space of a few minutes. Similarly, Hannah Lees explored ideas of cycles, constancy, and mortality in her work ‘The Passage of Time’.
Development
Having explored this research and discussed these and other installations and works as a group, we formulated a plan: to create an interactive installation where the audience can engage with universal forces, altering the visuals, music, and soundscapes around them as the tidal forces change. We would need movement sensors, objects or sculptures for the audience to interact with, and a MAX patch to generate the atmosphere with music and visuals in response to the data. We set about making a fictional planetary system: a central water-based planet with three moons in orbit, the distance of each from the centre measured using infra-red distance sensors.
We split into three teams: Design, who made the sculptures and installation visuals; Data, who focused on connecting the infra-red distance sensors through an Arduino into MAX; and Audio, who focused on the sound design and music. Each team had a specific brief and focus. While Shu designed a sound effect for each planet (metal, gas, and magma) and Fulong designed a responsive wave system, Fraser built a MIDI generator in MAX that would trigger melodies and accompaniment, alongside controlling the overall atmosphere and soundscape in response to the distance data coming from each planet. Meihui wrote the logical MAX sub-patcher that would calculate the total tidal force exerted on the middle planet by the three moon planets, and the resultant direction in which that force was acting. With the data flowing into MAX, thanks to the Arduino setup protocol and wiring by Lulu, the final pieces of the installation development were to make the interactive objects and the visual projections, and to decide how they would be organised in the room.
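A small sketch of the idea behind that sub-patcher follows. It is illustrative only, not a reproduction of Meihui's patch: it assumes each moon sits at a fixed bearing from the central planet on its track, and reuses the placeholder constants from the earlier tidal-force sketch.

# Sketch: sum each moon's tidal pull as a 2-D vector to get the resultant force and direction
import math

G, M_ORB, R_EARTH = 6.674e-11, 7.3e22, 6.371e6   # placeholder constants, as above

def tidal_force(d):
    return 2 * G * M_ORB * R_EARTH / d ** 3

def resultant_tidal_force(moons):
    """moons: list of (bearing_radians, distance_m), one entry per moon (bearings assumed fixed)."""
    fx = fy = 0.0
    for bearing, d in moons:
        f = tidal_force(d)
        fx += f * math.cos(bearing)               # component pulling toward that moon
        fy += f * math.sin(bearing)
    return math.hypot(fx, fy), math.atan2(fy, fx)  # magnitude and direction (radians)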
Summary notes written by Fraser David Macdonald.
