An installation that brings emotions to life as colorful ‘Auras.’ Aura uses machine learning and projection mapping to visualize relationships derived from physical interactions and body language.
A SIMPLE CONCEPT
Aura uses Wekinator, a machine learning platform, to translate body-position data from the Xbox Kinect into projections mapped onto the participants. Wekinator interprets the relationship between two people within the installation space, and that relationship is then illustrated through projection mapping.
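The link between the sketch and Wekinator runs over OSC. Below is a minimal Processing sketch of the send side, using the oscP5 library and Wekinator's default input port (6448) and input address (/wek/inputs); the three feature values are placeholders standing in for the real body-position features.

```java
import oscP5.*;
import netP5.*;

OscP5 oscP5;
NetAddress wekinator;

void setup() {
  size(640, 480);
  // Listen on 12000 (Wekinator's default output port), send to 6448 (its default input port).
  oscP5 = new OscP5(this, 12000);
  wekinator = new NetAddress("127.0.0.1", 6448);
}

void draw() {
  // Placeholder features; in the installation these came from Kinect joint data.
  OscMessage msg = new OscMessage("/wek/inputs");
  msg.add(mouseX / (float) width);   // feature 1
  msg.add(mouseY / (float) height);  // feature 2
  msg.add((frameCount % 100) / 100.0f); // feature 3
  oscP5.send(msg, wekinator);
}
```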
Our team wanted to examine and reveal human relationships through the lens of machine learning. We wanted our output to be some kind of interactive installation that was inviting but not intrusive.
After the group concept ideation, I took on the role of coding the Processing sketch that would receive data from the Kinect, push it to Wekinator, and then project ‘auras’ according to the resulting classification.
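On the receive side, Wekinator's classifier output comes back over OSC at /wek/outputs (default port 12000). The fragment below is a hedged sketch of how that output can be stored and handed to an aura renderer each frame; currentClass and drawAura() are illustrative names (drawAura() is sketched further down), not necessarily those used in the actual code.

```java
int currentClass = 0;  // latest classification received from Wekinator

// oscP5 calls this whenever an OSC message arrives on the listening port.
void oscEvent(OscMessage m) {
  if (m.checkAddrPattern("/wek/outputs")) {
    // A Wekinator classifier sends the class label as a float.
    currentClass = (int) m.get(0).floatValue();
  }
}

void draw() {
  background(0);
  drawAura(currentClass, width / 2, height / 2);  // render the aura for the current class
}
```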
The first step was understanding what kind of information we could pull from the Xbox Kinect.
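The Kinect exposes a depth image plus per-user skeleton tracking. The source does not name the Kinect library used; assuming SimpleOpenNI (a common choice for the Kinect in Processing), pulling a few joints per tracked user looks roughly like this.

```java
import SimpleOpenNI.*;

SimpleOpenNI kinect;

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
  kinect.enableUser();   // turn on skeleton tracking
}

void draw() {
  kinect.update();
  for (int id : kinect.getUsers()) {
    if (!kinect.isTrackingSkeleton(id)) continue;
    PVector torso = new PVector();
    PVector hand  = new PVector();
    kinect.getJointPositionSkeleton(id, SimpleOpenNI.SKEL_TORSO, torso);
    kinect.getJointPositionSkeleton(id, SimpleOpenNI.SKEL_RIGHT_HAND, hand);
    // torso and hand are now 3D positions, ready to be turned into
    // features (distances, velocities) for Wekinator.
  }
}
```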
We then tested mapping joint data and motion first to simple shapes, and later to more humanoid forms.
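As a first test, each joint can simply be drawn as a circle whose size follows how fast it is moving. A rough fragment, assuming the joint arrays are filled from the Kinect in screen coordinates each frame:

```java
PVector[] joints     = new PVector[15];  // current joint positions (screen space)
PVector[] prevJoints = new PVector[15];  // positions from the previous frame

void drawSkeletonShapes() {
  noStroke();
  fill(255, 180);
  for (int i = 0; i < joints.length; i++) {
    if (joints[i] == null || prevJoints[i] == null) continue;
    // Motion = distance travelled since the last frame; faster joints get bigger circles.
    float motion = PVector.dist(joints[i], prevJoints[i]);
    float d = map(constrain(motion, 0, 50), 0, 50, 10, 60);
    ellipse(joints[i].x, joints[i].y, d, d);
  }
}
```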
The visual and auditory aesthetics of each class were designed by Juliana and Surojit. Once those aesthetics were translated into code, we could train Wekinator to learn to “see” behaviors such as aggression.
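Once a class label comes back from Wekinator, it has to be turned into an aura. The palette and parameters below are placeholders only; the actual colors and sounds for each class were Juliana's and Surojit's designs.

```java
// Placeholder mapping from Wekinator class label to aura color and energy.
void drawAura(int classLabel, float x, float y) {
  color c;
  float jitter;
  switch (classLabel) {
    case 1:  c = color(80, 160, 255);  jitter = 2;  break;  // e.g. calm
    case 2:  c = color(255, 60, 40);   jitter = 15; break;  // e.g. aggression
    default: c = color(200);           jitter = 5;  break;  // untrained / neutral
  }
  noStroke();
  // Layered translucent circles, offset by the class's "energy", read as a glow.
  for (int i = 0; i < 40; i++) {
    fill(c, 20);
    ellipse(x + random(-jitter, jitter), y + random(-jitter, jitter), 200 - i * 3, 200 - i * 3);
  }
}
```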
The sketch was then calibrated to the dimensions of the projection surface so that each aura would mirror the corresponding participant.
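Calibration comes down to mapping the Kinect's coordinate range onto the pixel dimensions of the projection surface. A hedged sketch using Processing's map(); the Kinect range values here are placeholders that would be measured on site.

```java
// Placeholder calibration bounds, measured on site for the actual surface.
float kinectMinX = -1200, kinectMaxX = 1200;   // Kinect x-range covering the space
float kinectMinY =  -900, kinectMaxY =  900;   // Kinect y-range

// Convert a Kinect-space position into projector pixel coordinates,
// flipping x so the aura mirrors the participant.
PVector toProjection(PVector kinectPos) {
  float px = map(kinectPos.x, kinectMinX, kinectMaxX, width, 0);
  float py = map(kinectPos.y, kinectMinY, kinectMaxY, 0, height);
  return new PVector(px, py);
}
```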
The Aura installation was shown at CIID in May 2018. The code can be reviewed here.