ITP - Performative Avatars


I first discovered live music events in high school, around the same time I discovered my love for electronic music. Fast forward ten years, and I find myself at the intersection of creative expression and technology. In recent years I kept hearing about how real-time graphics can help power live musical performances, and I had been searching for a reason to explore that use case. My final project for Performative Avatars, an ITP class taught by Matt Romein, turned out to be the perfect opportunity: with lots of newfound Unreal skills and the desire to create an audio-reactive experience, everything lined up.

Screenshot of LiveGrabber


This project was made possible by everyone's favorite (or at least my favorite) data protocol: OSC! I found a handy collection of Max for Live devices called LiveGrabber that analyzes what is playing inside Ableton and sends out values based on certain triggers. Once I had OSC data successfully streaming out of Ableton, I created a particle system inside Unreal and tested a few different configurations, routing the OSC data to different properties of the particle system.

The last major addition was several groups of avatars posed around the scene, with each group playing a different animation. I also figured out a way to send OSC messages from my MIDI controller, so that it could not only control the playback of the music but also change the camera angles inside Unreal. With all of this tested and working flawlessly, it was time to get jamming!
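For anyone curious what an OSC message actually looks like on the wire, here is a minimal sketch in Python using only the standard library. The address `/kick/velocity` and port 8000 are hypothetical stand-ins, not the actual addresses LiveGrabber or my Unreal project use; the encoding itself follows the OSC 1.0 spec (null-terminated, 4-byte-padded address and type tag, then a big-endian float32).

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as OSC 1.0 requires."""
    data += b"\x00"
    while len(data) % 4:
        data += b"\x00"
    return data

def encode_osc_float(address: str, value: float) -> bytes:
    """Build a single-float OSC message, the kind a trigger device might emit."""
    return (
        osc_pad(address.encode("ascii"))  # address pattern, e.g. "/kick/velocity"
        + osc_pad(b",f")                  # type tag string: one float32 argument
        + struct.pack(">f", value)        # big-endian float32 payload
    )

def send_osc(address: str, value: float,
             host: str = "127.0.0.1", port: int = 8000) -> None:
    """Fire the message over UDP to whatever is listening (e.g. an OSC server in Unreal)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(encode_osc_float(address, value), (host, port))

# Example: simulate a kick-drum trigger at full velocity.
send_osc("/kick/velocity", 1.0)
```

In the real setup LiveGrabber handles the Ableton side and Unreal's OSC plugin handles receiving, so none of this has to be written by hand; the sketch is just to show that the "data" flowing between them is a very simple UDP payload.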