The ‘Love Has No Labels’ PSA, now a viral YouTube video, employs a new pipeline using wireless motion capture and puppeteering to animate 3D characters with live motion data in real time.


Mindride & Xsens Deliver Live Mocap Performances to the Screen

‘Love Has No Labels’ is a program created by The Ad Council in the US to expose subconscious biases people have about couples, related to colour, age, gender, religion, sexuality and disability. A PSA was staged and filmed live on the street, employing a large LED screen on which viewers see pairs of skeletons, rendered X-ray-style, hugging, kissing and dancing together – before stepping out from behind the screen and revealing themselves as people to the audience. Variations include same-sex, elderly and interracial couples, provoking the audience to reconsider the validity of their biases.


The LED screen is not, as viewers are left to imagine, an X-Ray machine. The skeletons are 3D computer-graphic characters, animated on the fly with live motion data captured and processed in real time by experiential design firm Mindride. Their team took the opportunity to develop a new pipeline using the Xsens MVN motion capture system, Max/MSP software and a MIDI controller. The team devised a method to stream the actors’ movements in real time into a rendering computer that animated the 3D skeletons.
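
To make the pipeline concrete, here is a minimal sketch (in Python, and not Mindride’s actual code) of the receiving end of such a setup: it listens for pose packets streamed over the network, as the MVN software can be configured to do, and hands each frame to the renderer. The port number, parse_pose() and apply_to_skeleton() are illustrative placeholders, not the real MVN streaming protocol.

```python
# Minimal sketch of the live end of a streaming mocap pipeline.
# Assumes the mocap software is streaming one pose packet per frame
# over UDP to this machine; the decoding and rendering hooks below
# are hypothetical stand-ins, not the actual MVN packet format.
import socket

MVN_PORT = 9763  # assumed port; MVN's network streamer is configurable

def parse_pose(packet: bytes) -> dict:
    """Hypothetical decoder: a real pipeline would follow the MVN
    network-streamer spec and return per-joint positions/rotations."""
    return {"payload_bytes": len(packet)}

def apply_to_skeleton(pose: dict) -> None:
    """Hypothetical hook: forward the decoded pose to the rendering
    application (e.g. a live link into Maya) so the on-screen
    skeleton updates every frame."""
    print("frame received:", pose)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", MVN_PORT))
while True:
    packet, _addr = sock.recvfrom(4096)  # one datagram per mocap frame
    apply_to_skeleton(parse_pose(packet))
```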

Yehuda Duenyas, Chief Creative Officer at Mindride, said, “Our primary work lay in achieving the most human performance while animating the skeleton avatars in a live experiential setting. Our goal was an uninterrupted, integrated real-time experience in which the actors can perform and interact naturally whether they are behind or in front of the screen.”



To prepare the models adequately for animation and help the artists understand how the suits behave during capture, the team experimented and rehearsed with motion capture actors. Meanwhile, the skeleton models were created and rigged in Maya. In the software, the artists placed the models on top of the mocap data, which was rendered out as a visualisation, to figure out the scope for achieving highly naturalistic performances.

“In short, how can we sell the idea of an X-Ray machine?” Yehuda said. “We wanted to define the steps required to sell that human quality without any post production, live on location.”



Mindride’s technical lead Michael Todd explained that the Xsens MVN system worked especially well for this project because of its convenience for the performers and crew. To get the results they needed for the animation, they only had to use 17 body-worn trackers, which are small enough to hide under clothes and, most importantly, are wireless, functioning without cables attaching the performer to a computer. “There were no camera set-ups to calibrate and tune, and the MVN Studio software integrates well with Maya and other software we use to render the skeletons,” he said.

But another important aspect of the MVN system is its use of inertial motion capture, as opposed to optical systems that rely on camera data. Inertial mocap measures both linear and angular acceleration – that is, the rate at which velocity changes over time.



While optical systems derive motion from the positions of markers in each captured frame, and often need to filter the data to remove jitter, the inertial system is concerned only with the tracked changes in the speed and direction of motion, generally producing a cleaner result. In the most recent MVN release, these measurements have been refined enough to allow the Xsens Mocap Engine to capture height data as well as horizontal movement.
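
A toy, one-dimensional example helps show why inertial capture needs no per-frame marker solve: position falls straight out of integrating the measured acceleration. This is an illustration only, not Xsens’s actual fusion engine, and the sample rate is an assumption.

```python
# Toy 1-D illustration of inertial tracking: integrating acceleration
# once gives velocity, twice gives position. Real systems fuse this
# with gyroscope data and biomechanical constraints to control drift.
DT = 1.0 / 240.0  # assumed sample interval (240 Hz inertial rate)

def integrate(accel_samples, v0=0.0, x0=0.0):
    """Integrate acceleration samples into a position trajectory."""
    v, x = v0, x0
    trajectory = []
    for a in accel_samples:
        v += a * DT          # velocity: first integral of acceleration
        x += v * DT          # position: second integral
        trajectory.append(x)
    return trajectory

# Constant 1 m/s^2 acceleration for one second: the final position
# approaches the analytic 0.5 * a * t^2 = 0.5 m.
print(integrate([1.0] * 240)[-1])
```

Because only changes in speed and direction are measured, there is no frame-to-frame marker jitter to filter out – though small sensor errors accumulate over time, which is why the refined measurements in the newer MVN releases matter for stable height tracking.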

For Brian Emerson, lead CG artist and animator, the main challenge was taking imperfectly aligned data and finding a way of making two characters appear to interact and believably move about in the same space. “Our technique was plugging a MIDI controller into Maya using a program called MaxMSP. Our animator could more or less puppeteer the characters and reposition them in real time during a live performance. When characters had to hold hands or hug, we could re-position them into the right spot at the right time.”
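
A rough sketch of that puppeteering idea, written with the mido MIDI library in Python rather than as a Max/MSP patch: knob turns on a controller become live offsets applied to a character’s root position. The CC numbers, the range and the set_root_offset() hook are all assumptions for illustration.

```python
# Sketch of MIDI puppeteering (not Mindride's actual Max/MSP patch):
# read control-change messages from a MIDI controller and map them to
# live position offsets on a character's root joint.
import mido  # requires a MIDI backend such as python-rtmidi

CC_X, CC_Y = 1, 2   # assumed CC numbers for two knobs on the controller
RANGE_M = 2.0       # assumed metres of travel mapped across a knob's range

def cc_to_offset(value: int) -> float:
    """Map a 0-127 MIDI CC value onto a -RANGE_M/2 .. +RANGE_M/2 offset."""
    return (value / 127.0 - 0.5) * RANGE_M

def set_root_offset(axis: str, offset: float) -> None:
    """Hypothetical hook: in the real pipeline this would nudge the
    character's root transform inside the renderer."""
    print(f"root {axis} offset -> {offset:+.3f} m")

with mido.open_input() as port:  # opens the default MIDI input device
    for msg in port:
        if msg.type == "control_change":
            if msg.control == CC_X:
                set_root_offset("x", cc_to_offset(msg.value))
            elif msg.control == CC_Y:
                set_root_offset("y", cc_to_offset(msg.value))
```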



MaxMSP is a visual programming language that helps users without code-writing experience to build complex, interactive programs – such as audio, MIDI, video and graphics applications – especially those in which user interaction is needed. MaxMSP is made up of three parts. ‘Max’ handles discrete operations and MIDI. ‘MSP’ deals with signal processing and audio. ‘Jitter’ is for graphics rendering and video manipulation. If an object can be tracked with something like the Xsens sensors or a camera, the MaxMSP software can follow its x and y position and rotation.
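
As a plain-Python stand-in for the kind of tracking described above – again an illustration, not an actual Max patch – deriving an object’s x/y position and planar rotation from two tracked points might look like this:

```python
# Given two tracked points on an object (e.g. front and back sensors),
# derive its planar position (centroid) and rotation (heading angle).
import math

def track(p_front, p_back):
    """Return (x, y, rotation_degrees) for an object defined by two points."""
    x = (p_front[0] + p_back[0]) / 2.0  # centroid x
    y = (p_front[1] + p_back[1]) / 2.0  # centroid y
    heading = math.degrees(math.atan2(p_front[1] - p_back[1],
                                      p_front[0] - p_back[0]))
    return x, y, heading

print(track((2.0, 1.0), (1.0, 1.0)))  # (1.5, 1.0, 0.0): facing along +x
```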

Considering the implications of this project for motion capture in the future, Michael Todd said, “I’m anticipating a lot more integration with virtual reality and augmented reality equipment. As development continues, inertial sensor-based mocap systems are getting cheaper and can be deployed in a wide range of environments to the point where even a small project can afford to buy its own system and integrate it into a VR experience or a game or a real time event – like this one.”

Yehuda said he especially sees potential for live applications. “What we are doing is digitally translating human behaviour and action, using it to drive these beautiful skeletons. It could apply to any avatar. If we can deliver that in a live situation – that is meaningful.”

www.xsens.com

A making-of short about the PSA can be seen here.