👽 ALIEN AVATARS 👽


ITP - Performative Avatars/Learning on the Edge







Part 1 - Building a Foundation


INSPIRATION

I first discovered live music events in high school, around the same time I discovered my love for electronic music. Fast forward 10 years, and I find myself at the intersection of creative expression and technology. In recent years I have been hearing about how real-time graphics can power live musical performances, and I have been searching for a reason to explore that use case myself. My final project for Performative Avatars, an ITP class taught by Matt Romein, turned out to be the perfect opportunity: with a stack of newfound Unreal skills and the desire to create an audio-reactive experience, this was the ideal place to put the idea into practice.




Screenshot of LiveGrabber






PROCESS

This project was made possible by everyone’s favorite (or at least my favorite) data protocol: OSC! I found a handy collection of Max for Live devices called LiveGrabber that analyzes what is playing inside Ableton and sends out values based on certain triggers. Once I had OSC data successfully coming out of Ableton, I created a particle system inside Unreal and tested a few different configurations, routing the OSC data to different properties of the particle system. The last major addition was several groups of avatars posed in different locations around the scene, each group with its own animation. I also figured out a way to send OSC messages from my MIDI controller, so that it could not only control the playback of the music but also change the camera angles inside Unreal. With all of this tested and working flawlessly, it was time to get jamming!
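For anyone curious what the plumbing looks like, here is a rough sketch (in TypeScript, using the node-osc package) of the two halves of that OSC flow: listening to what LiveGrabber sends, and sending a message Unreal could listen for to switch cameras. The port numbers and address patterns are placeholders I made up for illustration, not the exact ones from my Live set or Unreal project.

```typescript
// Sketch of the OSC plumbing, assuming `npm install node-osc`.
// Ports and address patterns are placeholders; the real ones depend on
// how LiveGrabber and the Unreal OSC plugin are configured.
import { Server, Client } from 'node-osc';

// 1) Monitor what LiveGrabber is sending before wiring it into Unreal.
const monitor = new Server(9000, '0.0.0.0', () => {
  console.log('Listening for LiveGrabber OSC on port 9000...');
});

monitor.on('message', ([address, ...args]) => {
  // e.g. a hypothetical "/track/kick/hit 1" on every kick trigger
  console.log(address, args);
});

// 2) The same idea in reverse: send a message Unreal listens for,
//    such as a hypothetical "/camera" address that switches camera angles.
const unreal = new Client('127.0.0.1', 8000);

function switchCamera(index: number): void {
  unreal.send('/camera', index, (err) => {
    if (err) console.error(err);
  });
}

switchCamera(2); // cut to the third camera angle
```

In the actual project, LiveGrabber and the MIDI controller handled this routing for me; the sketch is just the smallest standalone version of the same send/receive pattern.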





Part 2 - Building Bigger


END GOAL

As the first 7 weeks of the semester came to a close, I began a new course titled Learning on the Edge, taught by Scott Fitzgerald, which explores running machine learning models on microcontrollers. I wanted to expand on this system and give the performer an additional, easy way to change the visuals, and I thought this class could help me achieve that. Over the course of the class I ultimately built a wearable dance pose recognizer that detects a user’s dance pose and triggers events based on the detected pose.





PIPELINE

In order to get the microcontroller to recognize different dance poses, I started from the Magic Wand sample code. I trained the model on 3 dance moves and then packaged it for my Arduino Nano 33 BLE Sense board. The hard part was figuring out the best communication protocol to get data from the wearable device into Unreal. I ultimately used a somewhat hacky solution provided by Scott that uses the BLE capabilities of the board to communicate with a p5 sketch. The sketch then uses a Node server to send data to Max, which in turn sends OSC messages to Unreal.
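To give a sense of how the hand-off works, here is a minimal sketch (TypeScript again, using the ws and node-osc packages) of the relay step in the middle: the p5 sketch, which holds the BLE connection to the board, pushes each detected pose label over a WebSocket, and a small Node script forwards it onward as OSC. The ports, the "/pose" address, and the example label names are all placeholders for illustration rather than the exact setup from class.

```typescript
// Relay sketch, assuming `npm install ws node-osc`.
// Receives pose labels from the browser-side p5 sketch over WebSocket
// and forwards them as OSC messages to Max (or straight to Unreal).
import { WebSocketServer } from 'ws';
import { Client } from 'node-osc';

const osc = new Client('127.0.0.1', 7400);       // wherever Max/Unreal is listening
const wss = new WebSocketServer({ port: 8080 }); // the p5 sketch connects here

wss.on('connection', (socket) => {
  console.log('p5 sketch connected');

  socket.on('message', (data) => {
    const pose = data.toString().trim(); // e.g. "wave", "spin", "point"
    osc.send('/pose', pose, (err) => {
      if (err) console.error(err);
    });
  });
});
```

Once the labels arrive as OSC, triggering things in Unreal works just like the audio-reactive setup from Part 1.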