Audio-Visual Performance with Visuals Made in Unreal
The "Resonance" project merges audio and visual elements into a seamless, interactive live performance. At its core, the DJ uses an XONE MIDI controller to manage both layers, the audio and the visual performance, simultaneously. Musical control signals are transmitted in real time to Unreal Engine using low-latency tools such as loopMIDI and rtpMIDI. Within Unreal, these MIDI signals are translated into dynamic commands via the node-based Blueprint visual scripting system.
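To illustrate the Unreal-side MIDI hookup, the sketch below shows the same idea in C++ rather than Blueprints, using Unreal's MIDI Device Support plugin (UMIDIDeviceManager, UMIDIDeviceInputController, OnMIDIControlChange). The actor and handler names are illustrative, and the exact delegate signature and header paths may differ between engine versions; this is a minimal sketch under those assumptions, not the project's actual implementation.

// MidiVisualBridge.h -- illustrative actor that listens for MIDI CC messages
// arriving through a loopMIDI / rtpMIDI virtual port and forwards them to the
// visual layer. Assumes the "MIDI Device Support" plugin is enabled and the
// "MIDIDevice" module is added to the project's Build.cs dependencies.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MIDIDeviceManager.h"          // assumed plugin header names; may vary by engine version
#include "MIDIDeviceInputController.h"
#include "MidiVisualBridge.generated.h"

UCLASS()
class AMidiVisualBridge : public AActor
{
    GENERATED_BODY()

public:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // Enumerate MIDI inputs; the loopMIDI / rtpMIDI virtual port appears
        // alongside any hardware device.
        TArray<FFoundMIDIDevice> Devices;
        UMIDIDeviceManager::FindMIDIDevices(Devices);

        for (const FFoundMIDIDevice& Device : Devices)
        {
            if (!Device.bCanReceiveFrom) { continue; }

            MidiInput = UMIDIDeviceManager::CreateMIDIDeviceInputController(Device.DeviceID, 1024);
            if (MidiInput)
            {
                // Assumed delegate layout: (Controller, Timestamp, Channel, ControlID, Value).
                MidiInput->OnMIDIControlChange.AddDynamic(this, &AMidiVisualBridge::HandleControlChange);
                break;
            }
        }
    }

    UFUNCTION()
    void HandleControlChange(UMIDIDeviceInputController* Controller, int32 Timestamp,
                             int32 Channel, int32 ControlID, int32 Value)
    {
        // Normalize the 0-127 CC value to 0-1 and route it to whichever visual
        // component is mapped to this knob or fader (see the second sketch below).
        const float Normalized = Value / 127.0f;
    }

private:
    UPROPERTY()
    UMIDIDeviceInputController* MidiInput = nullptr;
};

In the project itself this routing is built as a Blueprint graph rather than C++, but the flow is the same: enumerate devices, bind to the control-change event, normalize the incoming value.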
This setup enables precise control of a range of visual effects: Niagara particle systems for complex particle animations, Alembic animations for detailed 3D object movement, specialized material effects, and post-process camera effects, all synchronized to the musical performance. The modular architecture lets the DJ adapt visual components flexibly on the fly, creating an immersive, interactive show in which audio and visuals react in harmony.
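As a sketch of how such a mapping layer could look in code (the project defines its mappings in Blueprints), the hypothetical function below routes a normalized controller value to a user-exposed Niagara parameter and a dynamic material scalar. The control IDs and parameter names ("SpawnRate", "GlowIntensity") are placeholders, not taken from the project.

// Illustrative mapping from a normalized MIDI value (0-1) to visual parameters.
// Parameter names are placeholders; the actual project wires these up in Blueprints.
#include "NiagaraComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

void ApplyVisualParameter(UNiagaraComponent* Particles,
                          UMaterialInstanceDynamic* GlowMaterial,
                          int32 ControlID,
                          float Normalized)
{
    switch (ControlID)
    {
    case 1: // e.g. a XONE fader mapped to particle density
        if (Particles)
        {
            // Drives a user-exposed float parameter on the Niagara system.
            Particles->SetVariableFloat(FName("SpawnRate"), Normalized * 5000.0f);
        }
        break;

    case 2: // e.g. a rotary knob mapped to the emissive strength of a material
        if (GlowMaterial)
        {
            GlowMaterial->SetScalarParameterValue(FName("GlowIntensity"), Normalized * 20.0f);
        }
        break;

    default:
        break;
    }
}

Keeping each mapping this small is what makes the architecture modular: a new effect only needs another case (or Blueprint branch) bound to a free controller channel.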