My role in the development process covered designing and building the interaction data calculations and the sound and visual interaction logic. Through an iterative, collaborative process with my teammates, we created a multi-sensory exhibition experience that was visual, auditory, and interactive.
3/3/2025 – 7/3/2025 | Tidal Force Data Calculation and Logic Design 1
I started by creating the data model, because this project relied on tidal forces as its premise. The formula I used was the standard tidal-force approximation:

$$F = \frac{2GMmR}{d^{3}}$$

where $G$ is the gravitational constant, $M$ the mass of the attracting planet, $m$ the mass of the affected body, $R$ its radius, and $d$ the distance between the two bodies.
The two variables I manipulated were the planets' masses and their distances from Earth. To realise this in Max/MSP, I built a custom patch that computes tidal forces in real time.
I first used the approximate sizes and masses of Mars, the Moon, and Earth to simulate three bodies and calculate the total tidal force ($F_R$) and its direction. I then assigned all planets identical masses so their sound interactions would be balanced.
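To illustrate the data model outside Max/MSP, here is a minimal Python sketch of the same calculation; the function name and the rounded mass and distance figures are mine, chosen for illustration rather than taken from the patch:

```python
# Minimal sketch of the tidal-force model; mass and distance values
# are rough public figures, not the exact numbers used in the patch.
G = 6.674e-11            # gravitational constant (m^3 kg^-1 s^-2)
EARTH_MASS = 5.972e24    # kg
EARTH_RADIUS = 6.371e6   # m

def tidal_force(planet_mass, distance):
    """Tidal force F = 2GMmR / d^3 exerted on Earth by one body."""
    return 2 * G * planet_mass * EARTH_MASS * EARTH_RADIUS / distance ** 3

# Rough masses (kg) and example distances from Earth (m).
bodies = {"Moon": (7.35e22, 3.84e8), "Mars": (6.42e23, 2.25e11)}
for name, (mass, dist) in bodies.items():
    print(f"{name}: {tidal_force(mass, dist):.3e} N")
```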
22/3/2025 | Tidal Force Data Calculation and Logic Design 2
To generate ocean sounds from planetary motion, I added an extra variable: movement speed. Every 200 ms, the patch samples the previous and current distance from Earth to each planet, takes the difference, and divides by the interval to approximate speed.
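A small Python sketch of this finite-difference estimate, assuming distance samples arrive once per 200 ms tick (the class and names are mine, for illustration):

```python
INTERVAL_S = 0.2  # the patch samples distances every 200 ms

class SpeedEstimator:
    """Approximate speed from successive distance samples."""
    def __init__(self):
        self.previous = None

    def update(self, distance):
        if self.previous is None:      # first sample: no speed yet
            self.previous = distance
            return 0.0
        speed = abs(distance - self.previous) / INTERVAL_S
        self.previous = distance
        return speed

est = SpeedEstimator()
est.update(10.0)         # first sample
print(est.update(9.5))   # |9.5 - 10.0| / 0.2 = 2.5 units/s
```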
15/3/2025 | Planet Sound Interaction 1
This speed data also proved useful for triggering sound effects, such as planet-movement sounds. To heighten the sense of immersion, I added a "whoosh" sound that plays whenever a planet is moved, reusing the same 200 ms speed-detection method. When speed exceeds zero, the sound fades in; as soon as speed drops back to zero, it begins fading out. This created a natural, intuitive, motion-based sound effect.
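The fade logic, sketched in Python with an illustrative fade step (the real envelope timing was set in the Max/MSP patch):

```python
FADE_STEP = 0.1  # gain change per 200 ms tick; illustrative value

def update_whoosh_gain(gain, speed):
    """Fade the whoosh in while the planet moves, out once it stops."""
    if speed > 0:
        return min(1.0, gain + FADE_STEP)  # fade in toward full volume
    return max(0.0, gain - FADE_STEP)      # fade out immediately at rest

g = 0.0
for s in (2.5, 2.5, 2.5, 0.0, 0.0):  # three ticks moving, then stopped
    g = update_whoosh_gain(g, s)
    print(round(g, 2))               # 0.1, 0.2, 0.3, 0.2, 0.1
```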
20/3/2025 | Planet Sound Interaction 2
Fraser then proposed a second interaction idea: mapping each planet's volume directly to the tidal force value. The group discussed both options, the speed-based approach and Fraser's, and agreed to adopt his "greater force = greater sound" mapping; it felt more intuitive and worked more effectively in the exhibition.
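The adopted mapping reduces to a simple clamped scaling; here it is sketched in Python with an illustrative normalisation ceiling:

```python
MAX_FORCE = 1.0e-6  # normalisation ceiling; illustrative, not the patch's value

def volume_from_force(force):
    """'Greater force = greater sound': map tidal force to a 0-1 gain."""
    return min(force / MAX_FORCE, 1.0)

print(volume_from_force(2.5e-7))  # 0.25
print(volume_from_force(5.0e-6))  # clamped to 1.0
```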
25/3/2025 | Ocean Wave Sound Interaction
Building on the speed calculation, I created a three-phase ocean-wave sound interaction. At zero speed, a low-volume bed of background waves plays. As speed crosses the first threshold, the waves grow stronger and come through more clearly. Past the second threshold, the waves play at full volume, producing a powerful audio effect. This system faithfully reflected how users interacted with the planets and reinforced the immersive ocean setting.
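The three phases amount to a threshold lookup on the speed value; a Python sketch with placeholder thresholds and gains (the exhibited values were tuned by ear in the patch):

```python
THRESHOLD_1 = 1.0  # placeholder speed thresholds; the real ones
THRESHOLD_2 = 3.0  # were tuned by ear during testing

def wave_volume(speed):
    """Three-phase ocean-wave bed driven by movement speed."""
    if speed < THRESHOLD_1:
        return 0.2   # low-volume background waves at (near-)zero speed
    if speed < THRESHOLD_2:
        return 0.6   # waves come through more clearly
    return 1.0       # full-volume waves past the second threshold

for s in (0.0, 1.5, 4.0):
    print(wave_volume(s))  # 0.2, 0.6, 1.0
```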
25/3/2025 | Visual Particle Interaction 1
After the visual team completed their initial patch handling particle motion and RGB-based colour, I built on their work by correlating particle behaviour with tidal force: under high tidal force, particles move quickly and clump together; under low force, they disperse and move slowly.
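One way to express that mapping, assuming the force has been normalised to 0-1 (parameter ranges are my illustration, not the visual team's exact values):

```python
def particle_params(force_norm):
    """Map normalised tidal force (0-1) to particle behaviour:
    high force -> fast, clumped; low force -> slow, dispersed."""
    speed = 0.1 + 0.9 * force_norm  # movement speed scales with force
    cohesion = force_norm           # pull toward the cluster centre
    spread = 1.0 - force_norm       # dispersion radius shrinks with force
    return speed, cohesion, spread

print(particle_params(0.9))  # strong force: fast and clumped
print(particle_params(0.1))  # weak force: slow and dispersed
```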
I also tried to symbolise the different planets with particle colours: red for fire, white for air, and black for metal. However, RGB blending made it difficult to achieve a smooth gradient between white and black, so I abandoned this idea. I then tried red, blue, and yellow to differentiate the three planets. Although these transitions were technically sounder, usability testing showed that participants did not understand what the colours symbolised.
1/4/2025 | Visual Particle Interaction 2
Because the red-blue-yellow scheme was not intuitive to the audience, I redesigned the colour interaction. In the final release, particle colour represents tidal force strength: weak force is white, and stronger force shifts toward dark blue. This simpler design proved far more intuitive for the audience during the exhibition.
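The final colour rule is a single linear interpolation; the RGB endpoints below are my approximation of the exhibited palette:

```python
def particle_colour(force_norm):
    """Interpolate from white (weak force) to dark blue (strong force)."""
    white, dark_blue = (255, 255, 255), (10, 30, 100)
    return tuple(round(w + (b - w) * force_norm)
                 for w, b in zip(white, dark_blue))

print(particle_colour(0.0))  # (255, 255, 255): weak force
print(particle_colour(1.0))  # (10, 30, 100): strong force
```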
2/4/2025 | System Integration and Arduino Testing
As a finishing step, I integrated all of the interaction elements. Physical Arduino sensor inputs were routed into Max/MSP, where tidal force and speed were computed and translated into ocean sounds, planetary sound effects, and the particle-system visuals.
This completed the full interaction cycle: physical action → data input → real-time calculation → audio/visual feedback. The whole system ran fluidly and delivered a fully immersive experience for the final performance.
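As a self-contained sketch of that cycle (with a random number standing in for the Arduino sensor input, and all constants normalised for illustration):

```python
import random

def read_sensor_distance():
    """Placeholder for a distance value arriving from the Arduino sensors."""
    return random.uniform(0.5, 1.5)

def tick(distance, previous, interval_s=0.2):
    """One 200 ms tick: data input -> calculation -> feedback values."""
    speed = abs(distance - previous) / interval_s
    force = 1.0 / distance ** 3      # tidal force ~ 1/d^3, masses normalised
    wave_gain = min(speed, 1.0)      # louder waves for faster movement
    planet_gain = min(force, 1.0)    # 'greater force = greater sound'
    whiteness = 1.0 - planet_gain    # 1.0 = white, 0.0 = dark blue
    return speed, wave_gain, planet_gain, whiteness

prev = read_sensor_distance()
for _ in range(5):                   # five simulated 200 ms ticks
    dist = read_sensor_distance()
    print(tick(dist, prev))
    prev = dist
```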