Disco
2024, New York
Zoe and I are collaborating to create an immersive DJ set that integrates interactive sound and visuals. The project blends music production, real-time interaction, and projection mapping, offering a dynamic, multi-sensory experience. The goal is to explore the interaction between audio and visual elements across various software platforms.

Ableton + Rekordbox + Leap Motion + Geco → TouchDesigner (via OSC) → Unreal Engine (Offworld) → TouchDesigner → Projector/Screen
Audio & Interaction Details:
Zoe designed hand gestures using Leap Motion and Geco to map and control Ableton Live MIDI and audio parameters (such as filter cutoff, frequency, and LFO amount), combined with Rekordbox to create the DJ performance. The audio data is then sent to TouchDesigner, which communicates with Unreal Engine via OSC.
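In the actual setup this link is handled inside TouchDesigner's OSC operators, but as a rough standalone sketch of the same idea, here is how audio parameters could be forwarded to Unreal Engine using the python-osc library (the host, port, and OSC addresses are placeholders for illustration, not the project's settings):

```python
# Minimal sketch of forwarding audio-reactive parameters over OSC,
# mirroring the TouchDesigner -> Unreal Engine link described above.
# Host, port, and addresses are assumed values, not the project's.
from pythonosc.udp_client import SimpleUDPClient

UNREAL_HOST = "127.0.0.1"   # machine running Unreal Engine
UNREAL_PORT = 8000          # port an OSC receiver in Unreal listens on

client = SimpleUDPClient(UNREAL_HOST, UNREAL_PORT)

def send_audio_params(cutoff: float, lfo_amount: float, level: float) -> None:
    """Send normalized (0..1) audio parameters as separate OSC messages."""
    client.send_message("/audio/cutoff", cutoff)
    client.send_message("/audio/lfo_amount", lfo_amount)
    client.send_message("/audio/level", level)

# Example values standing in for data coming from Ableton / audio analysis.
send_audio_params(cutoff=0.72, lfo_amount=0.35, level=0.9)
```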
Visual & Scene Design:
The visual component features 8 scenes in total, 3 of which are built in Unreal Engine. Moira used 5 sound parameters to drive the fire intensity in Unreal Engine, and these 3 camera scenes are sent back to TouchDesigner using the Offworld Live plugin. The remaining scenes are designed entirely within TouchDesigner, including one where a webcam tracks Zoe's hand gestures (since the Leap Motion is already in use for Zoe's audio control), with a butterfly visual that follows her hand movements. The other scenes are all driven by the audio parameters.
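To give a sense of the kind of mapping involved, here is a small Python sketch of two of these behaviors: blending several sound parameters into one fire-intensity value, and easing the butterfly toward the tracked hand. All parameter names, weights, and constants are assumptions for the example, not the project's actual values.

```python
# Illustrative mappings; names, weights, and constants are assumed,
# not taken from the project's TouchDesigner/Unreal setup.

def fire_intensity(params: dict[str, float]) -> float:
    """Combine several normalized (0..1) sound parameters into a single
    fire-intensity value for the Unreal Engine scene."""
    weights = {
        "level": 0.35,
        "cutoff": 0.20,
        "lfo_amount": 0.15,
        "low_band": 0.20,
        "high_band": 0.10,
    }
    value = sum(weights[name] * params.get(name, 0.0) for name in weights)
    return max(0.0, min(1.0, value))  # clamp to 0..1

def follow_hand(butterfly_xy: tuple[float, float],
                hand_xy: tuple[float, float],
                ease: float = 0.15) -> tuple[float, float]:
    """Move the butterfly a fraction of the way toward the tracked hand
    each frame, so it trails the gesture instead of snapping to it."""
    bx, by = butterfly_xy
    hx, hy = hand_xy
    return (bx + (hx - bx) * ease, by + (hy - by) * ease)
```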