Scene Setup & Filming
I invited my friends Ally, Julie, and others to participate in a test and give feedback on the experience. During the session, I also filmed the process and conducted brief interviews to collect their opinions and impressions.
One major takeaway from the feedback was the lack of an olfactory (smell) component, which many felt could enhance the overall immersion and emotional connection of the scene.
I’ve been playing with TouchDesigner recently, exploring ways to connect music and visuals: using the mid and high frequencies of a background track to drive dynamic changes in visuals.
I started by using the Audio Spectrum CHOP to split the audio into different frequency bands. Then I extracted the mid and high values as control parameters to manipulate certain layers in the scene, like scaling, position shifting, and even shader-based distortion. The result is a visual that “dances” with the music in real time.
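As a minimal sketch of how that mapping might look in script form: this assumes the band values have already been isolated into a Null CHOP named 'bands' (channels 'mid' and 'high') and that one layer lives in a Transform TOP named 'layer1'; all operator and parameter names here are placeholders, not my exact network.

```python
# CHOP Execute DAT attached to the 'bands' Null CHOP.
def onValueChange(channel, sampleIndex, val, prev):
    mid = op('bands')['mid'].eval()    # current mid-band level
    high = op('bands')['high'].eval()  # current high-band level
    layer = op('layer1')               # Transform TOP holding one visual layer
    layer.par.sx = 1.0 + mid * 0.5     # mid band breathes the scale
    layer.par.sy = 1.0 + mid * 0.5
    layer.par.tx = high * 0.1          # high band nudges the position
    return
```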
One of my favorite additions is a little boom glow effect—when the high frequencies suddenly spike, it triggers a burst of glowing light in the visuals, like a tiny explosion. It adds a sense of rhythm and punch to the whole piece, emphasizing certain beats and creating a more immersive, concert-like feel.
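One way to wire such a trigger, sketched under the assumption that the smoothed high band feeds a CHOP Execute DAT and that a Constant CHOP named 'boom' drives the glow strength (with a Lag CHOP downstream handling the decay back to zero):

```python
SPIKE = 0.6  # assumed spike threshold on the normalized high band

def onValueChange(channel, sampleIndex, val, prev):
    if channel.name != 'high':
        return
    if prev <= SPIKE < val:
        op('boom').par.value0 = 1.0  # fire the glow burst on the rising edge
    elif val < SPIKE:
        op('boom').par.value0 = 0.0  # let the downstream lag fade it out
    return
```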
Next step, I’m thinking about incorporating the low frequencies to control more grounded movements—like global structure wobble or a soft, breathing motion in the background.
In this test, I explored using microphone input to drive real-time visual changes. While the initial interaction worked, I believe the main visuals could be more expressive and exaggerated. For example, the mic input could control a wider range of parameters — such as diffusion, distortion, brightness, and more dramatic rotations — to create a more immersive, reactive effect. I’m open to experimenting with any visual responses that feel dynamic and playful.
Additionally, when I’m not blowing into the mic, the visual decay happens a bit too fast. Slowing down the rate of change when idle could help maintain a smoother and more organic visual flow.
In this iteration, I moved away from the original circular visual form and transitioned to a more organic, irregular shape. This change allowed the visual to feel more fluid and less constrained, resembling something between flowing matter and an energetic field.
Additional enhancements include:
Increased Mic Interactivity: The microphone input now drives multiple parameters simultaneously, including diffusion intensity, mesh distortion, brightness shifts, and rotation amplitude. These changes respond more dramatically to volume spikes, making the interaction more expressive (see the sketch after this list).
Particle Responsiveness: The internal texture now breaks apart and swirls more visibly, suggesting a kind of sonic turbulence.
Visual Decay Tweaks: The decay time when there’s no input has been lengthened, allowing the form to fade out more gracefully rather than collapsing too fast.
Color Feedback: There is subtle hue shifting based on sound amplitude, helping the piece feel alive and emotionally responsive.
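As a rough illustration of that multi-parameter mapping, here is a per-frame sketch assuming the smoothed mic amplitude (0 to 1) sits in a Null CHOP named 'mic_level'; the Noise, Level, and Transform operator names are placeholders:

```python
# Execute DAT callback, evaluated once per frame.
def onFrameStart(frame):
    level = op('mic_level')[0].eval()           # smoothed mic amplitude, 0 to 1
    op('noise1').par.amp = 0.2 + level * 1.5    # diffusion / mesh turbulence
    op('level1').par.brightness1 = 1.0 + level  # brightness swell
    op('xform1').par.rotate = level * 90        # exaggerated rotation on spikes
    return
```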
TouchDesigner Audio-Driven Graphics Experiment: From Shape Transformations to Microphone Input Control
Recently, I have been experimenting with TouchDesigner to explore dynamic shape transformations and audio-driven visual effects. Initially, I focused on modifying basic geometric forms using Noise, Output-Derivative (Slope), Threshold, Level (Gamma Adjustment), and Bloom Effects. Later, I integrated microphone input to control shape size and color, using Transform SOP for scaling and translation, with two Null CHOPs managing position and color separately.
1. Initial Shape Transformations and Visual Experiments
① Noise + Output-Derivative (Slope)
I started by applying a Noise CHOP to introduce organic movement into the shape. To enhance the natural transitions, I used Output-Derivative (Slope CHOP) to smooth out the rate of change, preventing sudden spikes in movement.
② Threshold + Level (Gamma Adjustment)
A Threshold CHOP was used to create high-contrast effects, transforming smooth gradients into distinct binary patterns. A Level CHOP (Gamma Adjustment) helped fine-tune the brightness curve, making the visuals either softer or more dramatic.
③ Bloom Effect
Finally, I applied a Bloom effect, enhancing the highlights and adding a glowing aura to the shape, making it more visually engaging.
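For reference, these stage settings can also be dialed in from a small script; the parameter names below are my assumptions from the Noise, Threshold, and Level operator pages, and the values are just starting points:

```python
op('noise1').par.amp = 0.4         # amplitude of the organic movement
op('thresh1').par.threshold = 0.5  # cutoff that turns gradients into binary patterns
op('level1').par.gamma1 = 0.8      # gamma tweak: softer or more dramatic brightness curve
```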
2. Using Transform SOP for Background, Scaling, and Positioning
To better organize the visuals, I utilized the Transform SOP to:
- Add a background (either static or gradient-based).
- Control scaling dynamically.
- Apply translation effects to move the shape across the screen.
- Link transformations to audio input, so that sound influences both size and position.

Additionally, I created two Null CHOPs:
- One for Position (to control movement based on audio input).
- One for Color (to change the color dynamically according to the audio intensity).
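These links are plain parameter expressions. For example, the color Null might be wired like this, assuming the Null is named 'null_color' with channels 'r', 'g', 'b', and (hypothetically) a Constant TOP supplies the shape's color:

```python
# Typed into the Constant TOP's Red / Green / Blue parameter fields:
op('null_color')['r']
op('null_color')['g']
op('null_color')['b']
```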
3. Integrating Microphone Input with Audio Device In CHOP
After experimenting with basic shape transformations, I moved on to controlling these parameters with real-time audio input.
① Capturing Microphone Input
Using the Audio Device In CHOP, I connected my microphone to feed real-time audio data into TouchDesigner. The raw audio data fluctuates too quickly, so direct mapping would result in erratic visual behavior. To ensure a smooth transformation, I applied additional processing.
② Smoothing Audio Input: Filter CHOP + Lag CHOP
- Filter CHOP: set Filter Width = 2.0 to smooth out the fluctuations, reducing rapid jumps.
- Lag CHOP: applied a gradual transition effect:
  - Lag Up = 1.2 (slower increase when the volume rises)
  - Lag Down = 3.0 (even slower decrease when the volume drops)
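Equivalently, those values can be set from a script; the parameter names here are assumptions based on the Filter and Lag CHOP parameter pages:

```python
op('filter1').par.width = 2.0  # Filter Width: smooth out rapid fluctuations
op('lag1').par.lag1 = 1.2      # Lag Up: slower rise when volume increases
op('lag1').par.lag2 = 3.0      # Lag Down: even slower fall when volume drops
```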
③ Mapping Audio Data to Shape Scale with Math CHOP
I mapped the volume range (0.01 ~ 0.3) to a shape scale of (0.5 ~ 2.0). This ensures that louder sounds gradually enlarge the shape, while softer sounds slowly shrink it, avoiding sudden jumps.
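This is a plain linear remap; sketched on the Math CHOP's Range page (parameter names assumed):

```python
# Implements: scale = 0.5 + (vol - 0.01) / (0.3 - 0.01) * (2.0 - 0.5)
m = op('math1')
m.par.fromrange1 = 0.01  # quietest expected volume
m.par.fromrange2 = 0.3   # loudest expected volume
m.par.torange1 = 0.5     # minimum shape scale
m.par.torange2 = 2.0     # maximum shape scale
```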
4. Connecting Audio Data to Transform SOP
- Mapped the output of the Math CHOP to the Transform SOP's Uniform Scale, enabling shape size changes based on audio intensity.
- Connected the Null CHOP (Position) to the Translate parameters so that the shape moves dynamically with the sound.
- Linked the Null CHOP (Color) to the color channels, allowing the shape's color to shift depending on volume levels.
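The wiring itself is just parameter expressions on the Transform SOP; a sketch, reusing the placeholder names from above:

```python
# Typed into the Uniform Scale parameter field:
op('math1')[0]
# Typed into the Translate X / Y fields, fed by the position Null:
op('null_pos')['tx']
op('null_pos')['ty']
```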
“Breathing Blossoms”: An Immersive Meditation Installation
1. Concept & Inspiration
“Breathing Blossoms” is an interactive installation that explores the energy exchange between human breath and nature. Just as plants release oxygen and humans inhale it, this project visualizes breath through digital flowers that bloom and fade based on breathing patterns. It serves as a meditative tool, helping participants become more aware of their breath and enter a state of relaxation.
2. Interaction & Design
The installation consists of a digital visualization (TouchDesigner) and a physical setup. On the screen, flowers respond to breath—deep breaths create richer colors, longer breaths generate more flowers, forming an organic visual rhythm. The physical installation features static plastic flowers and a lung-shaped structure, which symbolize human respiration. While the flowers are currently static, future versions may integrate small motors or air pumps to enhance physical interactivity.
3. User Experience
Participants engage with the installation by focusing on their breathing. As they inhale and exhale, they witness their breath manifest as blooming flowers on the screen, reinforcing mindfulness. The ambient lighting and soft sounds further enhance the meditative experience, creating a calming and immersive atmosphere.
The installation has potential for upgrades, including:
1. Small motors to make flowers physically open and close.
2. Airflow mechanisms to simulate lung expansion and contraction.
3. Light projections that change intensity based on breath.
“Breathing Blossoms” bridges technology, nature, and human awareness. By making breath visible, it encourages mindfulness and self-awareness. Even in its current static form, the combination of digital interaction, physical presence, and atmospheric effects creates a deeply immersive meditation experience. Future iterations will further expand its interactive potential, allowing users to connect with their breath in an even more tangible way.
I want to make a TouchDesigner piece connected to breathing data, which for now can be replaced with virtual data. The intended effect: every time I breathe, flowers bloom on the screen. The heavier the breath, the darker the flowers' color; and the longer the breath lasts, the more flowers appear at random. I can't achieve a true blooming animation for the moment, so I demonstrate the opening process by gradually enlarging imported flower models within a certain range.
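Since the breath input can be virtual for now, here is a minimal sketch of a simulated breath signal, assuming a Constant CHOP named 'breath' stands in for the real sensor:

```python
import math

# Execute DAT callback: writes a slow sine wave (~one breath cycle every 5 s)
# into the stand-in breath channel each frame.
def onFrameStart(frame):
    t = absTime.seconds
    op('breath').par.value0 = 0.5 + 0.5 * math.sin(2 * math.pi * 0.2 * t)
    return
```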
I’m looking for flower models, and for tutorials on forms that can change size.
I used Noise to generate irregular shapes, allowing the flower to have an organic, natural boundary as it dissipates, rather than shrinking or disappearing abruptly. Compose is connected to Feedback, creating a layered effect that gives continuity to the flower’s fading process, as if leaving behind dust particles suspended in space.
Threshold serves as the main controller. I used two different Thresholds to process the image: one controls brightness, while the other determines the visibility range of the particles, making the dissipation process more refined. In the code section, I used op('constant1').par.value1 - op('constant1').par.value0 to compute a positive and negative variation, allowing the particles to dissipate progressively over time rather than following a linear transformation. During parameter adjustments, I discovered that when Threshold is combined with Feedback, the edges of the flower produce a ripple-like effect, as if it is gradually disintegrating, resembling dust scattering in the wind.
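Put together, the per-frame update might look like the sketch below; 'thresh_bright' and 'thresh_vis' are stand-ins for my two Threshold TOPs, and the scaling factors are arbitrary:

```python
def onFrameStart(frame):
    c = op('constant1')
    delta = c.par.value1.eval() - c.par.value0.eval()       # signed variation
    op('thresh_bright').par.threshold = 0.5 + delta * 0.3   # brightness cutoff
    op('thresh_vis').par.threshold = 0.5 - delta * 0.3      # particle visibility range
    return
```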
By continuously adjusting the Noise amplitude and speed, I aimed to create a more layered dissipation effect rather than an instant disappearance, making it feel more like a slow dissolution rather than fragmentation. The current dissipation effect already exhibits a certain degree of natural behavior, but some areas still feel too linear. In the future, I may introduce Flow Noise or Shaders to make the effect more organic.
At the initial stage of conception, we aimed to explore the fundamental life activity of ‘breathing.’ I envisioned using particle diffusion as a metaphor for breath, where each exhale disperses tiny particles into the air, just as the wind carries seeds or pollen. This dynamic interaction visually represents the invisible flow of energy between humans and nature, making the act of breathing more tangible and immersive. Breathing is not only an essential physiological process for human survival but also plays a crucial role in psychological regulation and meditation practices. As my thoughts expanded, I began to consider the connection between breathing and nature.
Plants release oxygen through photosynthesis, while humans absorb oxygen through breathing, forming a delicate symbiotic relationship within nature. This led me to wonder whether this invisible exchange could be visualized, allowing the audience to intuitively perceive the connection between humans and nature. From this idea, I developed the concept of creating an interactive art piece where ‘flowers’ bloom or wither in response to the viewer’s breath. Flowers, as symbols of vitality in nature, are both fragile and full of life, sharing a rhythmic quality with human existence. Through digital technology, the viewer’s breathing patterns can be captured and transformed into visual changes, directly influencing the shape of virtual flowers.
In psychology and meditation, breath regulation is a common relaxation technique that effectively influences the nervous system, helping individuals achieve a state of calmness. Thus, I hope this artwork is not only an artistic expression but also a meditation aid. By adjusting their breathing, viewers can observe the transformation of flowers, becoming more aware of their own breath and achieving deeper relaxation.
The core of this concept is to enhance the viewer’s awareness of their breathing through interactive experiences while visually providing a soothing and healing atmosphere. It is not just a fusion of art and technology but also an exploration of the relationship between humans and nature, as well as between body and mind.