I followed the flowchart and connected nodes like Audio Device In CHOP, Analyze CHOP, Math CHOP — volume and spectrum data came through fine.
But at first I just mapped the volume value directly onto particle size and speed, and the result was really stiff: particles either sat still or suddenly jumped, nothing like the "sound mountain" I had in mind.
Then it hit me: driving all particles with a single value flattens everything.
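To make the problem concrete, here's a minimal standalone sketch of that first version (not my actual TouchDesigner network, just the idea): one global volume value driving every particle identically.

```python
def flat_heights(volume, n=8):
    # One global value drives every particle the same way,
    # so the whole field rises and falls in lockstep.
    return [volume] * n

frame_a = flat_heights(0.05)  # quiet frame: everything sits low
frame_b = flat_heights(0.9)   # loud frame: everything jumps at once
```

Every particle in `frame_b` has the exact same height, so there's no profile at all, just a flat plane snapping up and down with the audio.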
So I switched back to a Python script inside the Geometry COMP, looping over every particle each frame, sampling a Noise TOP to give each point its own base height, then scaling the whole field by volume.
Now the movement feels natural — when the sound gets loud, the whole “mountain” rises, but each particle keeps its own little wobble.
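The per-particle idea can be sketched outside TouchDesigner too. This is a hedged, self-contained approximation: `base_noise` is a cheap stand-in for sampling a Noise TOP (each particle gets its own phase and frequency), and `volume` scales the whole shape, exactly the "mountain rises but each point keeps its wobble" behavior.

```python
import math
import random

def base_noise(i, t):
    # Per-particle pseudo-noise: seeding by index gives each particle
    # a stable personal phase/frequency, so no two wobble alike.
    # (Stands in for a Noise TOP lookup inside TouchDesigner.)
    random.seed(i)
    phase = random.uniform(0, 2 * math.pi)
    freq = random.uniform(0.5, 1.5)
    return 0.5 + 0.5 * math.sin(freq * t + phase)  # in [0, 1]

def particle_heights(volume, t, n=8):
    # Each particle keeps its own base height (the wobble);
    # the audio volume scales the whole mountain up and down.
    return [volume * base_noise(i, t) for i in range(n)]

quiet = particle_heights(volume=0.1, t=3.0)
loud = particle_heights(volume=0.9, t=3.0)
# Loud frames lift every particle, but the relative shape is preserved.
```

The key design point: volume is a multiplier on an already-varied field, not the field itself, so loudness changes the scale of the mountain without flattening its profile.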
One thing that really stuck with me:
Sound is a trigger, but it shouldn’t be the only boss. Visualization isn’t about translating sound into an image — it’s about giving sound something that’s already alive, and letting it move along.

