
The flow of the interaction was equally important. Users first land on a blurred “Which Party Tonight?” screen with a call-to-action button inviting them to choose. On selection, the blur fades away to reveal the building in sharp detail, with each flat labelled and equipped with an enter button. Clicking a flat’s enter button takes the user inside that specific room, where the background changes colour, the music begins, and the visuals start to pulse dynamically with the beats. A simple back button lets users exit and explore other flats, keeping the navigation fluid and intuitive.
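In code, this flow reduces to a small set of view states. The sketch below outlines that structure in p5.js; the state names, flat hitboxes, and back key are illustrative placeholders rather than the project's actual implementation.

```javascript
// Minimal navigation sketch: landing screen -> building view -> room view.
let state = 'landing';      // 'landing' | 'building' | 'room'
let currentFlat = null;     // index of the flat the user has entered
const flats = [             // hypothetical hitboxes for the labelled flats
  { x: 60, y: 60,  w: 220, h: 90, label: 'Jazz Dinner Party' },
  { x: 60, y: 170, w: 220, h: 90, label: 'Birthday Party' },
  // ...remaining flats
];

function setup() {
  createCanvas(600, 600);
}

function draw() {
  background(20);
  if (state === 'room') {
    // inside a flat: colour shift, music, and pulsing visuals would live here
    fill(255);
    text('Inside: ' + flats[currentFlat].label + ' (press B to go back)', 40, 40);
    return;
  }
  // building cross-section with each flat labelled
  for (const f of flats) {
    fill(60);
    rect(f.x, f.y, f.w, f.h);
    fill(255);
    text(f.label, f.x + 10, f.y + 24);
  }
  if (state === 'landing') {
    filter(BLUR, 6);  // blurred until the user makes a choice
    text('Which Party Tonight? (click to choose)', 40, height - 40);
  }
}

function mousePressed() {
  if (state === 'landing') {
    state = 'building';          // blur fades away, building revealed
  } else if (state === 'building') {
    const hit = flats.findIndex(f =>
      mouseX > f.x && mouseX < f.x + f.w && mouseY > f.y && mouseY < f.y + f.h);
    if (hit !== -1) { currentFlat = hit; state = 'room'; }
  }
}

function keyPressed() {
  if (state === 'room' && (key === 'b' || key === 'B')) state = 'building';
}
```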

 

Beyond the music-driven visuals, I introduced a live microphone input. This addition enables user participation by visualising voice input in a dotted waveform box at the bottom of the screen. The mic input acts independently of the room music, giving the experience a playful and performative edge — users can clap, talk, or sing, and see their voice ripple across the interface in real time. Together, this dual system of music-driven and mic-driven visuals makes the project a multi-layered interactive canvas.
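As a sketch of how that independence can work in p5.sound, the snippet below gives the microphone its own analyser, entirely separate from whatever track a room is playing; the box dimensions and dot spacing are illustrative assumptions rather than the project's actual values.

```javascript
// Mic-driven dotted waveform box, independent of the room's music.
let mic, micFFT;

function setup() {
  createCanvas(600, 400);
  mic = new p5.AudioIn();         // live microphone input
  mic.start();
  micFFT = new p5.FFT(0.8, 256);  // separate analyser fed only by the mic
  micFFT.setInput(mic);
  noStroke();
}

function draw() {
  background(20);
  const boxY = height - 80, boxH = 60;    // waveform box at the bottom of the screen
  fill(255, 40);
  rect(20, boxY, width - 40, boxH);
  const wave = micFFT.waveform();         // time-domain samples in the range [-1, 1]
  fill(255);
  for (let i = 0; i < wave.length; i += 4) {
    const x = map(i, 0, wave.length, 20, width - 20);
    const y = boxY + boxH / 2 + wave[i] * (boxH / 2);
    circle(x, y, 3);                      // one dot per sample makes the voice ripple
  }
}

function mousePressed() {
  userStartAudio();   // browsers require a user gesture before audio can start
}
```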

From a technical perspective, the project was built entirely with p5.js and the p5.sound library. I employed FFT (Fast Fourier Transform) analysis to break audio down into frequency bands, which then drove the generative graphics. PeakDetect allowed me to identify beats, triggering bursts, pulses, and transitions on cue. I also handled real-time microphone input, ensuring smooth performance and clear visual feedback. On the design side, I created and optimised all building graphics, icons, and overlays in Photoshop and Illustrator, and carefully mapped visual states to align with each room's narrative.
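As a concrete illustration of that pipeline, here is a stripped-down p5.sound sketch combining FFT bands with PeakDetect; the track name, frequency range, and visual responses are placeholders, not the project's real values.

```javascript
// Audio-analysis core: FFT bands drive the visuals, PeakDetect fires on beats.
let song, fft, peakDetect;
let burst = 0;

function preload() {
  song = loadSound('party.mp3');   // each room would load its own track
}

function setup() {
  createCanvas(600, 600);
  fft = new p5.FFT(0.8, 1024);                        // smoothing, number of bins
  fft.setInput(song);
  peakDetect = new p5.PeakDetect(20, 120, 0.35, 20);  // watch the low end for beats
}

function draw() {
  background(20);
  fft.analyze();                         // refresh the spectrum for this frame
  peakDetect.update(fft);

  const bass   = fft.getEnergy('bass');  // band energies in the range 0-255
  const treble = fft.getEnergy('treble');

  if (peakDetect.isDetected) burst = 1;  // beat detected: trigger a pulse
  burst *= 0.92;                         // let the pulse decay between beats

  noFill();
  stroke(255);
  circle(width / 2, height / 2, 100 + bass + burst * 120);  // bass-driven ripple
  stroke(255, treble);
  circle(width / 2, height / 2, 280);                       // treble fades the outer ring
}

function mousePressed() {
  userStartAudio();
  if (!song.isPlaying()) song.loop();    // start the music on first click
}
```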

 

What makes this project special to me is that I quite literally vibe-coded my vision to life. It combined my strengths in UX design, creative coding, and storytelling into a cohesive prototype that is both technically robust and visually playful. More than a sound visualiser, Visual Sound Analyser is an interactive story — one that turns an ordinary building into a vibrant multisensory world where music and imagination live side by side.


This project, titled Visual Sound Analyser, is an exploration of how music, sound, and visuals can merge into a fully immersive interactive experience. My concept was to transform a simple building cross-section into a living, breathing party hub where each flat represents a different “vibe.” The goal was not just to create visuals that responded to sound, but to design an environment where music, interactivity, and storytelling converge — a space that invites curiosity and play.

The building is divided into five flats, each carrying its own narrative and musical identity. A jazz dinner party ripples with concentric circles; a children’s birthday party bursts with confetti; a weekend party pulses with shards of light; a dinosaur chilling alone fills the screen with smoky particle trails; and a couple reading together is surrounded by mandala-like spokes and orbiting dots. Each of these rooms not only plays distinct audio tracks but also generates real-time visualisations that embody their mood.
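One way to organise rooms like these is to pair each flat with its own track and visualiser function, so entering a flat simply swaps the active entry. The sketch below shows that structure with two of the rooms; the file names and drawing routines are illustrative assumptions, not the project's actual assets.

```javascript
// Per-room dispatch: each flat pairs an audio track with its own visual language.
let fft;
let activeRoom = 'jazz';
const rooms = {
  jazz:     { file: 'jazz.mp3',     visual: drawRipples  },  // concentric circles
  birthday: { file: 'birthday.mp3', visual: drawConfetti },  // confetti bursts
  // ...weekend party, dinosaur, reading couple
};

function preload() {
  for (const room of Object.values(rooms)) room.track = loadSound(room.file);
}

function setup() {
  createCanvas(600, 600);
  fft = new p5.FFT();
  fft.setInput(rooms[activeRoom].track);
}

function draw() {
  background(20);
  fft.analyze();
  rooms[activeRoom].visual(fft);      // dispatch to the active room's visualiser
}

function drawRipples(fft) {
  const bass = fft.getEnergy('bass');
  noFill();
  stroke(255);
  for (let r = 0; r < 5; r++) {
    circle(width / 2, height / 2, 60 + r * 60 + bass * 0.5);  // rings swell with the bass
  }
}

function drawConfetti(fft) {
  noStroke();
  const treble = fft.getEnergy('treble');
  for (let i = 0; i < treble / 4; i++) {          // louder highs scatter more confetti
    fill(random(255), random(255), random(255), 180);
    circle(random(width), random(height), 6);
  }
}

function mousePressed() {
  userStartAudio();
  const track = rooms[activeRoom].track;
  if (!track.isPlaying()) track.loop();
}
```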

Vibe Coding is Addicting

I created an immersive interactive experience that turns a cross-sectional building into five unique party “flats,” each animated through its own audio-driven visual language. From jazz ripples to birthday confetti and mandala patterns, every room captures a distinct vibe. Using p5.js, p5.sound, FFT analysis, and mic input, I vibe-coded my vision to life, blending UX design, creative coding, and generative visuals into a playful, multisensory prototype.
