Mocap Innovation Captures the Spirit of Ariel in ‘The Tempest’

The Royal Shakespeare Company (RSC) has been collaborating with The Imaginarium Studios and Intel on a new production of Shakespeare’s play ‘The Tempest’, staged to mark the 400th anniversary of Shakespeare’s death, which opened on 18 November 2016 and runs until 17 January 2017. It also features some of the first live motion capture performances to appear in a major stage production.

On a remote island, the exiled sorcerer Prospero, Duke of Milan, plots to restore his daughter Miranda to her rightful place using magic and cunning, and the help of his magical servant Ariel. He conjures up a storm to lure his treacherous brother Antonio and the conniving King Alonso of Naples to the island. There, Prospero reveals Antonio's scheming, pardons the King, and sees Miranda happily married to Alonso's son, Prince Ferdinand.

Changeable Spirit

In his play, Shakespeare wrote more stage directions - describing stage actions, movements of performers or production requirements - for Ariel than almost any of his other characters, making this character, a spirit, one of the most complicated to stage.

“For this performance to work, Ariel needed to fly, walk in space and interact with other performers spontaneously,” said Ben Lumsden, head of studio at The Imaginarium Studios. “Without unrestricted performance capture, Ariel would have been just another earthbound cast member in a costume.”

Several times during the performance, Ariel morphs from a spirit to a water nymph to a harpy, a vicious, mythical creature combining a woman’s figure and head with the wings and claws of a bird of prey. RSC and The Imaginarium achieved the transformations by capturing the movements of actor Mark Quartley through Xsens motion capture sensors on an Xsens MVN Link suit worn beneath the actor’s costume. His movements are transferred to a digital avatar, which is then projected onto various props on stage and in the air.

Tracking Acceleration

The untethered nature of the Xsens system makes it possible for the actor to interact normally with cast members in his human form, while the transformations occur live as he performs on stage every night. No part of his performance needs to be recorded first. The MVN system uses inertial motion capture, as opposed to optical systems that depend on data from an array of cameras. Inertial mocap measures acceleration - that is, the rate at which velocity changes over time - in terms of both linear and angular values.

While optical systems derive motion from positions of markers per captured frame and may need to filter the data to remove jitter, the inertial system is only concerned with the tracked changes in the speed and direction of motion, generally producing a cleaner result than camera data. In the most recent MVN release, these measurements have been refined enough to allow the Xsens Mocap Engine to capture height data from the sensors as well as horizontal movement.
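
For illustration only, below is a minimal dead-reckoning step of the kind an inertial tracker performs, assuming idealised, bias-free accelerometer and gyroscope readings. The actual MVN engine fuses many sensors against a biomechanical body model and is far more robust against drift; this sketch just shows the basic integration.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, 9.81])  # world-frame gravity, m/s^2

def rotation_from_vector(phi):
    """Rodrigues' formula: rotation matrix for a rotation vector phi."""
    angle = np.linalg.norm(phi)
    if angle < 1e-12:
        return np.eye(3)
    k = phi / angle
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

def imu_step(R, v, p, accel_body, gyro_body, dt):
    """One integration step: orientation from the gyro, position from the
    accelerometer (gravity removed), integrated twice over timestep dt."""
    R = R @ rotation_from_vector(gyro_body * dt)   # update orientation
    a_world = R @ accel_body - GRAVITY             # remove gravity from the reading
    v = v + a_world * dt                           # integrate once: velocity
    p = p + v * dt                                 # integrate twice: position
    return R, v, p
```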

As the actor’s performance is tracked, the data is processed moment-by-moment through Autodesk MotionBuilder software and imported into the Epic Unreal Engine 4. The video output is then sent as 1080p 60fps video data to d3 servers powered by Intel Xeon processors and connected to the RSC lighting desk, which in turn controls 27 projectors located around the stage. “The UE4 project is replicated on two machines in case of failure, and on each machine two different camera views are stored to enable various effects,” said Ben Lumsden.
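
In rough pseudocode, that per-frame chain looks something like the sketch below. None of these calls are the real APIs of Xsens, MotionBuilder, Unreal or d3; they are placeholders for the stages Ben describes.

```python
def run_frame(xsens_link, mobu, ue4, d3_servers):
    """Hypothetical per-frame flow of the live pipeline described above."""
    skeleton = xsens_link.read_frame()        # inertial skeleton data from the suit
    retargeted = mobu.retarget(skeleton)      # moment-by-moment processing in MotionBuilder
    views = ue4.render_camera_views(retargeted, count=2)  # two camera views per machine
    for server in d3_servers:                 # project replicated on two machines
        server.send_video(views, resolution=(1920, 1080), fps=60)
```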

The d3 media server and its integrated video production application are built around a real-time 3D stage simulator. Productions use it to design, sequence and play back stage shows, working with props, venues, LED screens, projection, lighting and moving stage elements. It generally runs on a laptop or on dedicated d3 hardware.

Logistics on the Stage

Because the screens and props onto which the video is projected are mobile and of unusual dimensions, a separate optical motion capture system based on Vicon cameras and software - described in more detail further on in this article - is set up at the theatre to continuously supply the projectors with precise, real-time location data at every moment of the show. In effect, the production treats the video and images of Ariel as augmented reality content that must be positioned accurately in 3D space.

Ben talked about some of the other technical implications of their choice of workflow and, especially, an inertial system. “In the Royal Shakespeare Theatre, we would be struggling to get mocap cameras behind the actor, looking back at the audience, and to keep them hidden in the set design. Also, although an optical system has the advantage of geographical accuracy, keeping a follow-camera on the virtual character in the game engine made this unnecessary,” he said. “Also, the design of the inertial sensors meant they could be incorporated into Ariel’s specially constructed costume as well. So, from an aesthetic point of view, we don’t have to place distracting reflective or active markers onto his costume, which an optical system needs.” 

Magic Mirror

Early on, as the production worked out how to handle the physical and virtual iterations of Ariel’s character, motion capture was also a part of the design process, using optical and inertial systems in equal measure. “Though he doesn’t monitor his avatar’s performance during the show, Mark spent a lot of time looking at a ‘magic mirror’ set-up during the development phase of the project. He would play around on our stage in Ealing and we’d render the different forms of Ariel’s avatar on a TV screen in real time. Mark would be able to experiment with how his movements fed back into the virtual puppet, and we would make rigging changes to better enable the avatar to move realistically,” said Ben.

Fortunately, perhaps, the play does not require the actor to disappear, so actor and avatar are seen simultaneously. Ben explained that this is a creatively driven decision because the director wanted to show both puppeteer and marionette. “There is a creative logic to Ariel appearing as an avatar – be it an emotional explosion, or a retelling of a story illustrated by computer graphics,” he said. “The transitional elements as the transformations happen are orchestrated by careful choreography between the lighting desk’s GrandMA cueing system, Mark’s movements and the display in the Unreal Engine we are using to render the graphics.”

Shape-shifting Workflow

During performances every night, a number of small events have to take place almost simultaneously. Ben talked through them. “We have Xsens data driving a virtual Ariel in MotionBuilder. That skeletal data is retargeted to the various forms that Ariel takes in the show - a humanoid, a sea-nymph with cloth simulation, and a terrifying harpy. In turn this retarget drives the rendered mesh [3D character models] of the avatars using the real-time graphics of the Unreal Engine.

“The avatar is projected on a variety of surfaces in the theatre, including a moving silk cylinder that gives the appearance that Ariel is floating in mid-air. The key here is very much in the other actors and where their attention is. If Prospero looks at the avatar, we are more focused on that relationship. When Prospero addresses the physical form of Ariel, our attention shifts to him rather than the avatar.”
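
The retargeting step Ben describes - one stream of skeletal data driving three different rigs - can be sketched in simplified form. This toy version just copies matched joint rotations through a bind-pose offset; MotionBuilder's actual retargeting also handles differing bone lengths, IK and floor contact, and all names below are illustrative.

```python
import numpy as np

def retarget_frame(source_rotations, joint_map, bind_offsets):
    """Transfer captured joint rotations onto an avatar rig.

    source_rotations: {source_joint: 3x3 world rotation} from the solver.
    joint_map: {source_joint: target_joint}, e.g. 'RightHand' -> 'harpy_wing'.
    bind_offsets: {target_joint: 3x3 rotation} aligning the two rest poses.
    """
    target = {}
    for src, tgt in joint_map.items():
        # Apply the bind-pose correction so differing rest poses line up.
        target[tgt] = source_rotations[src] @ bind_offsets[tgt]
    return target
```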

As for the production notes from the author himself, Ben noted that Shakespeare typically uses very few stage directions - except in this play. Consequently, the production has used every one of them happily – for example, “Thunder and lightning. Enter ARIEL, like a harpy; claps his wings upon the table; and, with a quaint device, the banquet vanishes."

The text includes numerous opportunities for Ariel to shape-shift. At various points, Prospero asks Ariel to 'Go make thyself like a nymph o' the sea’, 'Thy shape invisible retain thou still’ or praises him: 'Bravely the figure of this harpy hast thou perform'd, my Ariel'. Each one becomes inspiration for an ambitious cast and production crew.

Augmenting Reality

The augmented reality projections of Ariel’s avatar similarly rely on motion capture throughout the performance. The production team is using an optical camera system made by Vicon to track the precise locations of moving objects on stage, some of which are screens of different sizes, plus other props and some objects that the actors hold while performing. All of these objects are used as projection surfaces for the show’s imagery.

The Vicon camera data and software inform the production team of these objects’ movements on and around the stage during the performance. The cameras in use are a combination of Vicon’s Bonita and T-Series cameras, about 26 in total, spaced around the auditorium and clamped to rails on the upper and lower circles.

Due to the camera system’s low-latency tracking, the animation team has comprehensive, very accurate visibility of the objects on stage in real time and can also manage interactions with the cast. The data is first processed by Vicon Tracker software, installed on computers using Intel Xeon and Intel Core i7 processors, and allows the stage crew to project and align images onto the objects, following them precisely as they move. All of the objects are pre-calibrated, a quick process that only needs to be done once unless the markers actually move.
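
As a hedged sketch of what aligning images to a tracked object involves: each screen's 6-DoF pose can be packed into a 4x4 transform that a media server applies to its content every frame. The function below is purely illustrative, not Tracker's or d3's actual interface.

```python
import numpy as np

def model_matrix(position, quaternion):
    """4x4 world transform of a tracked screen from its 6-DoF pose.

    position: (x, y, z) in metres; quaternion: (w, x, y, z), unit length.
    A media server can use this matrix to pin projected content to the
    moving surface each frame.
    """
    w, x, y, z = quaternion
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    M = np.eye(4)
    M[:3, :3] = R          # orientation of the surface
    M[:3, 3] = position    # location of the surface
    return M
```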

Tracking, Blending, Morphing

Warren Lester, product specialist at Vicon, said the scenes that use tracking vary, sometimes involving just one screen or object, sometimes four or five. “While the projectors themselves don’t move, the d3 software allows a video or other animation to be projected anywhere within the field of view of the projector. The software also manages all the blending and morphing of the imagery to present an unbroken image on flat or curved surfaces.”
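
Blending overlapping projectors is commonly done with a smooth intensity ramp across the overlap zone, gamma-corrected for the projector's response. A toy version of such a ramp, for illustration only and not d3's actual method:

```python
import math

def blend_weight(u, overlap_start=0.8, gamma=2.2):
    """Toy edge-blend ramp for the right edge of one projector's image.

    u: horizontal texture coordinate in [0, 1]. Pixels past overlap_start
    fade out smoothly so the neighbouring projector can fade in; the ramp
    is raised to 1/gamma to compensate for the projector's response curve.
    """
    if u <= overlap_start:
        return 1.0
    t = (u - overlap_start) / (1.0 - overlap_start)   # 0..1 across the overlap
    ramp = 0.5 * (1.0 + math.cos(math.pi * t))        # smooth cosine falloff
    return ramp ** (1.0 / gamma)
```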

One object is the hanging silk cylinder, or ‘chimney’, Ben mentions, which has markers on a weighted ring at the bottom. The complete arrangement, onto which an avatar version of Ariel is projected, is moved around the stage with motorized hoists. Other objects are stretched fabric screens with markers around the edges, which the spirits move very quickly around the stage. Individual animations are projected onto each from one or more of the 27 projectors in the theatre.

“The tracking needs to be smooth with low latency in order to allow the cast to interact naturally with the projected imagery,” Warren said. “These qualities also make it possible for the real Ariel to modify his performance to make his avatar move exactly as it should, even as it is being projected onto the silk chimney.”

Custom Software Plug-in

Vicon and d3 Technologies have developed custom software that lets the data Tracker generates drive d3’s software, using the PosiStage.Net protocol - an open industry standard usually used to drive interactive sound, lighting and motor-driven practical effects on stage.

“We’ve been working with Vicon and d3 Technologies to develop a plug-in application of PosiStage.Net that allows both the video projection and the software driving the stage lights to share the same tracking data protocol from the Vicon camera system,” said Pete Griffin, RSC production manager on 'The Tempest'.

Vicon’s PosiStage.Net plug-in is mainly about precision. It makes the precision and accuracy of Vicon mocap systems - typically seen in video and feature film projects - available to theatre, music and other live events. Tracking many objects accurately, at high speed and without encumbering performers with equipment, remains a major challenge in most situations.
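
PosiStage.Net is an open, chunk-based UDP multicast protocol (documented at posistage.net). The sketch below conveys only the general idea of multicasting tracked positions each frame: the payload is a simplified stand-in, not the real PSN chunk format, and the group and port are the conventional PSN defaults rather than anything specific to this production.

```python
import socket
import struct

# Conventional PosiStage.Net multicast group and port; the payload below
# is a simplified stand-in, NOT the real chunk-based PSN packet format.
PSN_GROUP, PSN_PORT = "236.10.10.10", 56565

def send_tracker_positions(objects):
    """objects: {tracker_id: (x, y, z)} in metres, e.g. from a tracking system."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    payload = b"".join(
        struct.pack("<Hfff", tracker_id, x, y, z)   # id + position per object
        for tracker_id, (x, y, z) in objects.items()
    )
    sock.sendto(payload, (PSN_GROUP, PSN_PORT))
```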

Warren said, “Because the Vicon system can measure the X,Y,Z position and orientation of many objects at once and delivers all this data in the same coordinate system, it can replace many of the different systems traditionally used on stage for different applications. Integrating all the tracking requirements into one single system brings many benefits to the production, mainly simplifying things and demystifying the coordination of all processes within the projection mapping or lighting software.”

Tracking Rigid Clusters

Warren also described what gave Vicon’s Tracker software an edge for this particular project in terms of speed and scalability. A typical motion capture application is optimized to track jointed bodies with markers on humans or animals. “In contrast, Tracker focuses on one thing - tracking rigid clusters of markers,” he said. “Consequently, it works in a completely different way to normal mocap software, which collects the images from the cameras, then computes lots of 3D reconstructions and then tries to fit the skeletal model.
 
“Instead, Tracker collects the images from the cameras and then attempts to fit 3D models of the rigid clusters to each 2D camera view. Knowing where each camera is, it can produce a set of estimates of the 6 degrees-of-freedom pose of each object. Once it has a decent set of these estimates, plus any 3D reconstructions, it can calculate the precise position and orientation. In fact, even with a single camera, it can still produce a good estimate. The continuity of the resulting data is second to none, even if many of the markers are occluded from some of the cameras.
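
Tracker's real solver fits the rigid models directly to each 2D camera view, as Warren explains. As a simpler illustration of the underlying idea - recovering a rigid object's position and orientation from matched 3D marker positions - the classic Kabsch least-squares alignment looks like this:

```python
import numpy as np

def rigid_pose(model_pts, observed_pts):
    """Least-squares 6-DoF pose aligning a rigid marker model to observations.

    model_pts, observed_pts: Nx3 arrays of corresponding marker positions.
    Returns (R, t) such that observed ~= model @ R.T + t (Kabsch algorithm).
    """
    mc = model_pts.mean(axis=0)                     # centroids of both clouds
    oc = observed_pts.mean(axis=0)
    H = (model_pts - mc).T @ (observed_pts - oc)    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t
```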

“Because of this different mode of operation, it can provide data much more quickly than other optical tracking systems, and it is much better suited to large systems: the algorithms scale because the object tracking runs concurrently for many cameras.”

www.vicon.com   www.xsens.com