The ‘Love Has No Labels’ PSA, now a viral YouTube video, employs
a new pipeline using wireless motion capture and puppeteering to
animate 3D characters with live motion data in real time.
Mindride & Xsens Deliver Live Mocap Performances to the Screen
‘Love Has No Labels’ is a program created by The Ad Council in the US to expose subconscious biases people hold about couples, related to colour, age, gender, religion, sexuality and disability. A PSA was staged and filmed live on the street, using a large LED screen on which viewers see pairs of skeletons, rendered X-ray-style, hugging, kissing and dancing together, before the performers step out from behind the screen and reveal themselves to the audience. Variations include same-sex, elderly and interracial couples, prompting the audience to reconsider the validity of their biases.
Yehuda Duenyas, Chief Creative Officer at Mindride, said, “Our primary work lay in achieving the most human performance while animating the skeleton avatars in a live experiential setting. Our goal was an uninterrupted, integrated real-time experience in which the actors can perform and interact naturally whether they are behind or in front of the screen.”
“In short, how can we sell the idea of an X-ray machine?” Yehuda said. “We wanted to define the steps required to sell that human quality without any post production, live on location.”
Another important aspect of the Xsens MVN system is its use of inertial motion capture, as opposed to optical systems that rely on camera data. Rather than triangulating markers seen by cameras, inertial mocap measures motion directly on the body: miniature accelerometers record linear acceleration, the rate at which velocity changes over time, while gyroscopes record angular velocity, and the system integrates these readings into a position and orientation for each body segment.
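To make that idea concrete, the Python sketch below shows the core strap-down integration principle: a gyroscope reading is integrated into orientation, and an accelerometer reading, with gravity removed, is double-integrated into velocity and position. This is a minimal illustration of the technique, not Xsens's actual algorithm; the 240 Hz sample rate and all names are assumptions.

```python
# Minimal strap-down inertial integration sketch (illustrative only --
# not Xsens's algorithm). Shows how a pose can be recovered from
# body-worn sensors alone, with no cameras in the loop.
import numpy as np

DT = 1.0 / 240.0                       # assumed sensor sample period
GRAVITY = np.array([0.0, 0.0, -9.81])  # world-frame gravity (m/s^2)

def skew(w):
    """Cross-product matrix of an angular velocity vector (rad/s)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def step(R, v, p, gyro, accel):
    """Advance one sensor segment by one sample.

    R     -- 3x3 sensor-to-world rotation matrix
    v, p  -- world-frame velocity and position
    gyro  -- angular velocity in the sensor frame (rad/s)
    accel -- specific force in the sensor frame (m/s^2)
    """
    # First-order integration of angular velocity into orientation
    # (a real system would use quaternions and renormalise).
    R = R @ (np.eye(3) + skew(gyro) * DT)
    # Rotate the specific force into the world frame, cancel gravity,
    # then integrate acceleration -> velocity -> position.
    a_world = R @ accel + GRAVITY
    v = v + a_world * DT
    p = p + v * DT
    return R, v, p
```

Double integration like this drifts within seconds, which is why commercial suits fuse the inertial readings with magnetometers and a biomechanical skeleton model to keep segments consistent over a long performance.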
For Brian Emerson, lead CG artist and animator, the main challenge was taking imperfectly aligned data and finding a way of making two characters appear to interact and believably move about in the same space. “Our technique was plugging a MIDI controller into Maya using a program called Max/MSP. Our animator could more or less puppeteer the characters and reposition them in real time during a live performance. When characters had to hold hands or hug, we could re-position them into the right spot at the right time.”
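The article doesn't detail the patch itself, but a minimal sketch of the same idea might read MIDI control-change messages in Maya's Python interpreter and apply them as live offsets on a character root. The `mido` library, the port name and the node name here are hypothetical stand-ins for Mindride's actual Max/MSP-to-Maya setup.

```python
# Hypothetical MIDI puppeteering sketch for Maya's Python interpreter.
# The `mido` library, the port name and the node name are assumptions;
# Mindride's actual rig routed the controller through Max/MSP.
import mido
import maya.cmds as cmds

RANGE = 2.0  # full knob travel maps to +/- 1 unit of offset

def cc_to_offset(value, span=RANGE):
    """Map a 7-bit MIDI CC value (0-127) to an offset centred on zero."""
    return (value / 127.0 - 0.5) * span

def puppeteer(port_name='nanoKONTROL', node='skeletonA_root'):
    """Drive the character root's X/Y/Z translation from CC knobs 0-2."""
    axis = {0: 'translateX', 1: 'translateY', 2: 'translateZ'}
    with mido.open_input(port_name) as port:
        for msg in port:  # blocks; see note below
            if msg.type == 'control_change' and msg.control in axis:
                cmds.setAttr('%s.%s' % (node, axis[msg.control]),
                             cc_to_offset(msg.value))
```

A blocking loop like this would freeze Maya's UI; in a live rig the polling would run from a scriptJob or background thread, but the mapping from a 7-bit controller value to a corrective translation offset is the essential trick.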
Considering the implications of this project for motion capture in the future, Michael Todd said, “I’m anticipating a lot more integration with virtual reality and augmented reality equipment. As development continues, inertial sensor-based mocap systems are getting cheaper and can be deployed in a wide range of environments, to the point where even a small project can afford to buy its own system and integrate it into a VR experience, a game or a real-time event – like this one.”
A making-of short about the PSA can be seen here.