
Experimentation: Water Wave Projection

Wave Projection Experiment – 25/02/25 – By Fraser Macdonald

Video:

Setup: 3M Overhead Projector with clear acetate basin, filled with water, projecting onto white wall.

3M Projector for water projection experiment
Acetate Basin for Water Projection Experiment

Photos:

Water projection with wave
Water projection 2
Water projection with waves and droplets
Water Projection 1

Conclusion: Interacting with the water and seeing the peaks of the waves projected onto the wall in real time was engaging and satisfying. I also enjoyed observing the interaction between the peaks, and the way the square basin shapes the waves. When the water is initially poured into the basin, bubbles of air are trapped on the surface, and when they pop, they generate circular ripples in an unexpected and very satisfying fashion. A downside to the 3M projector is that it is very bright, and straining on the eyes to use for prolonged periods of time (I should have worn sunglasses).

Follow-up questions: Is there a way to project this downwards onto the floor, as in the design? Should we use actual water rather than acetate imitating water? How would different basin sizes affect the interaction between the waves?

Next experiment: Wave generation via contact speaker on basin.

Engine Building: Music Generation MAX PATCH

This is a demo MAX patch that takes the data from the three planets, generating MIDI notes in set rhythmic and bar patterns, moving through seven modes, with seven chord possibilities for each mode.
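The patch itself lives in MAX, but the mode and chord logic it steps through can be sketched in Python. This is an illustrative reconstruction under assumptions, not an export of the actual patch: the function names, the choice of diatonic triads, and the example root note are all hypothetical.

```python
# Hypothetical sketch of "seven modes, seven chords per mode":
# each mode is a rotation of the major scale, and each scale degree
# carries one diatonic triad built by stacking thirds within the mode.

MAJOR_STEPS = [2, 2, 1, 2, 2, 2, 1]  # whole/half steps of the major scale

def mode_scale(root: int, mode: int) -> list[int]:
    """One octave of the given mode (0 = Ionian .. 6 = Locrian) as MIDI notes."""
    steps = MAJOR_STEPS[mode:] + MAJOR_STEPS[:mode]  # rotate to get the mode
    notes = [root]
    for step in steps[:-1]:
        notes.append(notes[-1] + step)
    return notes

def diatonic_chord(scale: list[int], degree: int) -> list[int]:
    """Triad on a scale degree (0-6), wrapping up an octave past the 7th."""
    return [scale[(degree + i) % 7] + 12 * ((degree + i) // 7) for i in (0, 2, 4)]

scale = mode_scale(60, 5)            # Aeolian (natural minor) on middle C
print(scale)                         # [60, 62, 63, 65, 67, 68, 70]
print(diatonic_chord(scale, 0))      # [60, 63, 67] -> C minor triad
```

In the real patch the equivalent logic would feed these note numbers to the VST instruments as MIDI; stepping the `mode` index 0-6 gives the seven modes, and the seven `degree` values give the seven chord possibilities within each.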

 

The following video shows the integrated atmosphere engine, made with samples from Shu and Fulong, working within the wider MIDI engine:

Atmosphere Screen Grab DMSP

 

by Fraser David Macdonald for DMSP Perception

Music and Atmosphere Tracks: Earth, Fire, Gas, Metal

Music & Soundscape

Each track represents a different orb, combining a generated musical element with atmospheric planetary sounds. Each planet has three VST instruments that are receiving MIDI from MAX, generated algorithmically in response to user interactions with the system.

Full Music Track: generated entirely by audience input through the three sensors into the MAX MIDI machine:

Earth: waves, whale song, the universe atmos:

Metal: rhythmic percussion, stuttering and fizzing metallic screeching:

Lava: distorted cello, fire, bubbles

Gas: wind, low flute and whistle melody

 

Music, Processing, and Mixing: Fraser

Soundscapes & Atmospheres: Shu & Fulong

Summary Notes 2: Design Process and Data Flow

Design Process

We decided on using the Atrium, as it had a projector and a metal rig that black drapes could be hung from. Another advantage of using the Atrium was the four speakers we could connect to through the house audio system. This decision allowed us to make basic layout diagrams: the first shows a close-up of the central planet and moons, the second zooms out to show the three drapes and two projectors, and the third features the side projectors.

Based on our research into planet types, we chose to design a gas giant, a metal orb, and a molten lava planet. Each planet would sit on top of a light fastened to a set of 3D-printed wheels. Caitlin and Yannis also built tracks to keep each planet still when it was not being moved by an audience member.

A close-up of a white sphere

A collage of different objects

A collage of images of a person holding a red object

 

Technology: Data Flow

We want the representation of these celestial movements to go beyond just a soundscape. It should also reflect the natural gravitational pull between the Sun and the Moon. Through our interactive music system, we aim to let the audience intuitively feel the tension and fluctuations of gravitational forces, translating these cosmic interactions into an immersive auditory experience.

Our final solution was to connect each Grove ultrasonic sensor to an M5StickC Plus: the VCC, GND, and SIG pins of the sensor connect to the 5V, GND, and GPIO 26 pins of the M5StickC Plus, respectively. The computer connects to each stick via the UDP protocol, and a Python script written by Lulu converts the data into OSC-compliant messages, then transmits them to the Max/MSP patch.
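The shape of that bridge script can be sketched as follows. This is a minimal stand-in, not Lulu's actual code: the port numbers, the plain-text payload format from the sticks, and the `/planet/distance` OSC address are all assumptions.

```python
# Sketch of a UDP-to-OSC bridge: receive raw distance readings from the
# M5StickC Plus units over UDP, re-encode each as an OSC message, and
# forward it to the Max/MSP patch. All ports and addresses are assumed.
import socket
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a single-float OSC message: padded address string, the ",f"
    type-tag string, then the value as a big-endian 32-bit float."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

def run_bridge(listen_port: int = 9000, osc_host: str = "127.0.0.1",
               osc_port: int = 8000) -> None:
    """Forward each incoming reading to Max's udpreceive port as OSC."""
    recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    recv.bind(("0.0.0.0", listen_port))
    send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        data, _ = recv.recvfrom(1024)
        distance_cm = float(data.decode().strip())  # payload format assumed
        send.sendto(osc_message("/planet/distance", distance_cm),
                    (osc_host, osc_port))
```

On the Max side, a `udpreceive 8000` object paired with routing on the OSC address would then unpack each reading.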

Within MAX we calculate the tidal forces generated by the audience’s adjustments to the positions of the Sun and Moon. The tidal force is determined using the following formula:

Ft = 2GMR / d³

Where Ft is the tidal force, G is the gravitational constant, M is the mass of the orb, R is the radius of the Earth, and d is the distance of the orb to the Earth. We map these tidal forces to various dimensions: the volume of the atmosphere for each orb increases as distance decreases, wave sounds are triggered based on the rate of change of the distance of any orb, and the resultant gravitational force and vector are calculated for the musical generation side of the MAX patch.
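As a worked sketch, the force calculation and the volume mapping could look like this in Python. The constants and the linear volume scaling are illustrative assumptions, not the values or curves used in the actual MAX patch.

```python
# Tidal force Ft = 2GMR / d^3, mapped onto a 0..1 atmosphere volume.
# Orb mass, Earth radius, and the mapping range are placeholder values.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
R_EARTH = 6.371e6    # radius of the central "Earth" orb, m

def tidal_force(orb_mass_kg: float, distance_m: float) -> float:
    """Ft = 2GMR / d^3, as in the formula above."""
    return 2 * G * orb_mass_kg * R_EARTH / distance_m ** 3

def atmosphere_volume(force: float, f_min: float, f_max: float) -> float:
    """Map a tidal force linearly onto a 0..1 volume, clamped at the ends,
    so each orb's atmosphere gets louder as it moves closer."""
    return max(0.0, min(1.0, (force - f_min) / (f_max - f_min)))
```

The inverse-cube term is what makes the interaction feel responsive: halving an orb's distance multiplies its tidal force, and hence its influence on the mix, by eight.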

 

Summary notes by Fraser David Macdonald.

Summary Notes: Abstract, Research & Development

Abstract

Over the past century, technologies and visualisations have been combined by artists and scientists to explain their discoveries and theories, and to educate and enhance our knowledge of the universe.

Through this journey the public have seen diagrams and drawings of the known universe, then photographs of our solar system, eventually of far planets and solar systems, and most recently, images of black holes, renders of gravitational waves, and pictures of the furthest corners of the galaxy. The image on the opening page is the galaxy LEDA 48062 in the constellation Perseus, 30 million light-years from Earth, and its neighbour galaxy UGC 8603, and the image on the contents page is part of the Eagle Nebula, 7000 light-years away. Both images are visualisations of data as much as they are photographs: infra-red and ultraviolet light is brought into the visual spectrum, and interesting features are enhanced, to further our understanding and experience of the image, transforming the way in which we perceive and evaluate our universe and role within it.

Across the creative industry, artists are looking to science, data, and mathematics to inspire their creations, working to make esoteric and complex discoveries more accessible and understandable. This project seeks to explore this boundary between the arts and academia, the personal and the universal, and to produce a piece of work illuminating the magnitude of space and exploring what lies beyond human perception. Our immersive installation, ‘The Breath of the Machine’, allows the audience to control the form of their universe, immersing them in heat, gas, and metal, with visual projections and reactive soundscapes that change with the gravitational tidal force. By engaging with the ebb and flow of tidal forces, they can experience the tension, release, and rhythmic variations driven by these celestial interactions.

Research

Our starting point for this project was perception, and as we considered the relationship between sound, time, and scale, we found ourselves naturally drawn to cosmic ideas of moons and tides, planets and gravity. We then used these abstract ideas to imagine an immersive experience, and how we might engage with light, colour, sound, and texture to represent them. As individuals we researched around these themes, before combining ideas as a group to form the scope for our final project. To explore the perception of light and darkness, we considered an installation with moving projectors that would lengthen and shorten the shadows of objects and of the audience. We also considered an interactive wall, with chaotic ripples and reactions of colour and sound. Our cosmic ideas led us to consider the moon, our tides, and how we perceive waves and water.

Nature and nurture, combined with active factors including distraction and attention, emotion and mood, and our immediate surroundings, affect how we perceive the world around us, changing how we behave, and even the rate at which we perceive time to be passing. Author and philosopher Bruce Kriger compares the desire to understand our self and our biology with our exploration of the universe and cosmological theories, writing that understanding how any one of these mechanisms functions deepens our understanding of the entire system, and changes our interaction with the world around us.

‘This reminds us that our understanding of the universe is not a direct reflection of reality but rather the result of a complex process of interpreting data through the prism of our sensory systems and cognitive structures.’

– Bruce Kriger: ‘The Limits of Perception’

By exploring the universe, we have changed our sense of self, placing ourselves within a solar system in a galaxy in the cosmos. The scale of the universe is so massive that we cannot perceive it in the same way as we do regular distances. We know, and can mentally visualise, the size of one centimetre or inch. The same can be said for a foot or metre, kilometre or mile; however, if one tries to picture 10,000 km or miles, it becomes much harder to visualise. When we try to imagine a universe that is 93 billion light-years wide, it feels outwith the realms of scale: on the border between finite and infinite. It can be a humbling experience to learn more about the universe and our place within it, but it can also be a source of great awe and inspiration.

‘The reality of the universe exists prior to and independent of its being perceived, but when it is perceived, the limits of that finite mind confer an appearance on that reality.’

– Rupert Spira: ‘The Reality of the Universe Is Prior to Perception’

Contemporary Projects

We were inspired by the work of contemporary artists across a wide range of practices. ‘Can’t Help Myself’, by Sun Yuan and Peng Yu, is a very emotional work involving a robotic arm attempting to clean up red paint that keeps spilling. The installation breaks down over time, and as the room gets messier, the sense of struggle coming from the robotic system steadily increases. ‘The Clock’ by Christian Marclay is a 24-hour-long montage of thousands of film and television images of clocks, telling the time through a collage of cinematic history. The viewer experiences a vast range of narratives, settings and moods within the space of a few minutes. Similarly, Hannah Lees explored ideas of cycles, constancy, and mortality in her work ‘The Passage of Time’.

Development

Having explored this research and discussed these and other installations and works as a group, we formulated a plan: to create an interactive installation where the audience can engage with universal forces, altering the visuals, music, and soundscapes around them as the tidal forces change. We would need movement sensors, objects or sculptures for the audience to interact with, and a MAX patch to generate the atmosphere with music and visuals in response to the data. We set about making a fictional planetary system: a central water-based planet with three moons in orbit, the distance of each to the centre measured using infra-red distance sensors.

We split into three teams: Design, who made the sculptures and installation visuals; Data, who focused on connecting the infra-red distance sensors through an Arduino into MAX; and Audio, who focused on the sound design and music. Each team had a specific brief and focus. While Shu designed a sound effect for each planet (metal, gas, magma) and Fulong designed a responsive wave system, Fraser built a MIDI generator in MAX that would trigger melodies and accompaniment, alongside controlling the overall atmosphere and soundscape in response to the distance data coming from each planet. Meihui wrote the logical MAX sub-patcher that would calculate the total tidal force exerted on the middle planet by the three moon planets, and the resultant direction in which that force was acting. With the data flowing into MAX, thanks to the Arduino setup protocol and wiring by Lulu, the final pieces of the installation development were to make the interactive objects and the visual projections, and to decide how they would be organised in the room.
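The resultant-force calculation in that sub-patcher amounts to a 2-D vector sum. A hypothetical Python equivalent, assuming each moon's pull is already reduced to a magnitude and an angle in the plane of the installation:

```python
# Sketch of summing the three moons' tidal pulls as 2-D vectors to get
# the total force magnitude and direction on the central planet.
# The (magnitude, angle) input representation is an assumption.
import math

def resultant_tidal_force(moons: list[tuple[float, float]]) -> tuple[float, float]:
    """moons: list of (force_magnitude, angle_radians) pairs, one per moon.
    Returns (total_magnitude, direction_radians)."""
    fx = sum(f * math.cos(a) for f, a in moons)   # sum of x components
    fy = sum(f * math.sin(a) for f, a in moons)   # sum of y components
    return math.hypot(fx, fy), math.atan2(fy, fx)
```

One useful property this captures: two equal pulls from opposite sides cancel, so the music can go quiet when the audience balances the moons around the centre.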

Summary notes written by Fraser David Macdonald.

 
