Framestore Brings Real-time Motion Data to Virtual Production


Real-time functionality, live visualisation and automation are pushing motion capture into more areas of production and virtual production, while continuing to improve its original purpose of generating realistic motion data for animators. From Framestore’s Capture Lab, Studio Manager Richard Graham and Senior Mocap TD Gerardo Corona Sorchini talked to Digital Media World about the new techniques and tools available to them, and how they are transforming the Lab’s work, now and in the near future.

Framestore used their Capture Lab to produce scenes in two of the biggest action films of 2017, 'Blade Runner 2049' and 'Thor: Ragnarok'. True set pieces, the sequences form major plot points for the audience, advancing and augmenting the story in each case. Conceiving the location and environments correctly was essential, and so was filling the scenes with enough people to give them weight, scope and excitement.

In the case of the Trash Mesa attack sequence in 'Blade Runner 2049', the lead character K is gravely injured after making a crash landing in his Spinner vehicle and faces a violent crowd, about to swarm. To create such a crowd, people were needed to populate the sequence's wide, aerial shots.

Into the Capture Lab

Framestore's award of 300 VFX shots for the movie was already putting the VFX team in Montreal under pressure, which led to thoughts of using motion capture for this crowd. Richard Graham said, “Our job was to produce realistic skeletal data and a variety of diverse motions for the animators to apply across their digital crowd. Part of the challenge was the fact that the Capture Lab is in London, and time was short.”


Deciding to use Vicon’s Shōgun motion data solving software with Epic’s Unreal Engine for production, Richard and Gerardo Corona Sorchini also designed and built a live link between Montreal and the Framestore Capture Lab in London. Through this link, motion captured by the Vicon cameras could be streamed in close to real time to Framestore Montreal’s cinema room, where the supervision team could watch, send back notes and request new moves.

The capture configuration consisted of two motion capture performers, 16 Vicon cameras and a 4m x 5m volume. The data from the cameras was reviewed in real time in Vicon Shōgun, where it was solved, output as skeletons and then streamed into Unreal Engine. The output from Unreal was then sent over a dedicated transcontinental connection on the internal network so it could be viewed in Montreal. This new set-up successfully streamed 1080p video at 60fps to Montreal, where the team could assess the fidelity of each motion with only around 100 milliseconds of latency between continents.
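To make the mechanics concrete, here is a minimal Python sketch of the receiving end of such a link. It is purely illustrative, not Framestore's or Vicon's actual protocol: the port, wire format and field layout are all assumptions. It simply reads timestamped frames off a socket and reports one-way latency.

```python
import socket
import struct
import time

# Hypothetical wire format: each datagram starts with an 8-byte send
# timestamp (seconds since the epoch), followed by the solved joint data.
HEADER = struct.Struct('!d')

def receive_frames(host='0.0.0.0', port=9870):
    """Listen for streamed skeleton frames and report one-way latency.

    Assumes the sender and receiver clocks are synchronised (e.g. via
    NTP); otherwise the figure is only a relative measure.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        packet, _addr = sock.recvfrom(65535)
        sent_at, = HEADER.unpack(packet[:HEADER.size])
        joint_payload = packet[HEADER.size:]  # opaque solved-skeleton bytes
        latency_ms = (time.time() - sent_at) * 1000.0
        print(f'frame: {len(joint_payload)} bytes, latency {latency_ms:.1f} ms')

if __name__ == '__main__':
    receive_frames()
```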

Live Data Transfer

The ability to function in real time is a core characteristic of Shōgun, which was designed to transfer live data onto film-quality assets on set for virtual production. Supporting multiple actors and props, and scaling between small and large systems, are also important, but the real-time operation is what makes virtual production possible.


The streaming link added still more speed to the operation. In Montreal, the team watched the performances applied to digital characters that had already been placed in the Trash Mesa CG environment. A second view showed live video of the actual performers, with a linked audio call so the team could speak to them directly. Using these two feeds, the project leads could make their own decisions about movement variations. These would later be cut up and delivered to the animation team as FBX files on a target skeleton, per shot per character, to tweak selectively and precisely as the story required.
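As a rough illustration of that cutting step, a long take can be sliced into per-shot, per-character clips from an edit list. The shot names, frame ranges and data layout below are invented for the sketch, not Framestore's actual tooling.

```python
# Hypothetical cutting step: slice one long captured take into per-shot,
# per-character clips from an edit list.
EDIT_LIST = [
    # (shot, character, first_frame, last_frame)
    ('tm_0120', 'scavenger_a', 110, 245),
    ('tm_0120', 'scavenger_b', 110, 245),
    ('tm_0140', 'scavenger_a', 300, 412),
]

def cut_take(take_frames):
    """take_frames: a list of per-frame skeleton poses for the whole take."""
    clips = {}
    for shot, character, start, end in EDIT_LIST:
        # Each clip would then be retargeted and exported as one FBX file.
        clips[(shot, character)] = take_frames[start:end + 1]
    return clips
```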

Once capture and solving were complete, Shōgun could batch process all of the data and return the finalised results to Montreal in under 24 hours, so no extra time was needed to convert the files into formats the animators could work with.
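A pipeline step of that shape might look like the following sketch. The 'mocap-export' command is a hypothetical stand-in for whatever exporter a studio wires in, and both paths are invented; Shōgun's own batch interface is not shown here.

```python
import pathlib
import subprocess

# Hypothetical overnight batch: convert every recorded take in a folder
# to FBX for the animators.
TAKES_DIR = pathlib.Path('/captures/trash_mesa')
OUT_DIR = pathlib.Path('/deliveries/montreal')

def batch_export():
    OUT_DIR.mkdir(parents=True, exist_ok=True)
    for take in sorted(TAKES_DIR.glob('*.mcp')):
        out_file = OUT_DIR / take.with_suffix('.fbx').name
        # One exporter call per take; a failure is logged but does not
        # stop the rest of the batch.
        result = subprocess.run(['mocap-export', str(take), '-o', str(out_file)])
        if result.returncode != 0:
            print(f'FAILED: {take.name}')

if __name__ == '__main__':
    batch_export()
```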

Extending Virtual Production

Shōgun was released in April 2017, and 'Blade Runner 2049' was the first project Framestore used it on, in place of Vicon Blade, the system the Lab had relied on for some time. “Virtual production is becoming more important to the bigger visual effects houses,” Richard said. “The ability to create alongside our colleagues, even when they are across the ocean, is very powerful for us to be able to offer as a service.”


Gerardo remarked, “Before Shōgun, Vicon's Blade was among the top skeleton solvers for working online or offline. Nevertheless, while the software was powerful and robust, less experienced users could find it overwhelming, especially when just going through really basic setups.

“The fact that Shōgun now comes in two sections makes it easier for users to focus on their project. Shōgun Live, for use on the shoot, presents itself in a straightforward, easy-to-use fashion. It gets the user up and running straightaway and gives access to the core tools needed while surveying and accessing an online session. Then at the end of the day, Shōgun Post is there for the stage when you have time for more functionality and need power for intense processing.”

Soldiers and Citizens for ‘Thor: Ragnarok’

For its work on Marvel Studios’ most recent superhero film, 'Thor: Ragnarok', the Framestore Capture Lab took on an elaborate, complex battle sequence. It was different from the 'Blade Runner' project in several ways – its story is a complete fantasy, there are separate crowds each composed of distinct character types, and motion was also required for a main character, the Hulk.

The scene is a high point of the movie, calling for dozens of characters to appear on screen at once. The god of thunder battles Hela’s army of undead soldiers high above the water on the narrow Rainbow Bridge as Asgard's population watches in fear. When Marvel asked Framestore for ideas and assistance, the Capture Lab had only a few days to produce performance motion for armies and citizens, using a small fleet of cameras and Vicon’s Shōgun software.
 
Populating Asgard involved citizens and soldiers, and showing how both groups respond to the battle. The undead army had to feature creatures with the grace of a warrior and the attributes of a corpse, brought to life with animation. Framestore also depicted the Hulk, in two contrasting situations. In one, he clings to a cliff after a brutal fight with the wolf Fenrir, and in the other he sulks after being told not to fight.

Working within a short timeframe, and mainly aiming to reduce their workload, the animators weighed the pros and cons of capturing and working with motion data against using crowd simulation software. Opting for mocap, the Lab captured and processed motion for both the military and civilian crowds using just two actors, who were shot in the studio performing an array of diverse actions that could be applied to individually moving characters.
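The underlying idea, reusing a small clip library across a much larger crowd, can be sketched in a few lines of Python. Everything here is illustrative: the clip names, roles and offsets are invented, not Framestore's data.

```python
import random

# Spread a small library of captured clips across many crowd agents,
# with random clips and start offsets so agents sharing a clip never
# move in unison.
CLIPS = {
    'soldier': ['advance', 'swing_sword', 'block', 'fall'],
    'citizen': ['flee', 'cower', 'point', 'look_around'],
}

def cast_crowd(n_agents, role, seed=7):
    rng = random.Random(seed)  # seeded, so a shot re-renders identically
    crowd = []
    for agent_id in range(n_agents):
        crowd.append({
            'agent': agent_id,
            'clip': rng.choice(CLIPS[role]),
            'offset': rng.uniform(0.0, 2.0),  # seconds into the clip
        })
    return crowd

soldiers = cast_crowd(200, 'soldier')
citizens = cast_crowd(500, 'citizen')
print(soldiers[:2], citizens[:2])
```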

Real-time Review, Direct-to-Disk Recording

To help speed up the process, Framestore made use of Shōgun’s output in real time, while also recording it direct to disk. The team used Shōgun to view the performances with a mesh overlay surrounding each actor, a function that creates a fluid silhouette alongside the skeletal data and gives a more complete sense of how the movements would read once retargeted, and how each performance would sit in the finished shot.
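The "stream it and record it at the same time" pattern is simple to express. Below is a minimal, hypothetical Python sketch of that tee: each solved frame goes to a live viewer callback and is appended to an on-disk take file, so the reviewed feed and the recording never diverge. The frame format and viewer callback are assumptions, not Shōgun internals.

```python
import json
import time

def record_and_stream(frames, take_path, send_live):
    """Tee every solved frame to a live review feed and a take file.

    frames: an iterable of per-frame dicts from the solver.
    send_live: a callable that pushes one frame to the review feed.
    """
    with open(take_path, 'w') as take:
        for frame in frames:
            frame['recorded_at'] = time.time()
            send_live(frame)                      # real-time review copy
            take.write(json.dumps(frame) + '\n')  # direct-to-disk copy
```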

Framestore thor ragnarok4

The overlaid mesh is mainly used to validate the solve during the Range of Motion stage, a set of exercises that helps make sure the rig is being driven correctly - for example, that the shoulders and clavicles move realistically. The earlier software, Blade, displayed the skeleton along with the tracking markers, but Shōgun's 3D mesh gives animators a clearer, more decisive way to review the data before moving on to the next shot.
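A crude version of that kind of sanity check can be written directly against solved joint angles. The joint names, limits and frame format below are rough, invented example values, not Vicon's.

```python
# Illustrative range-of-motion check: flag frames where a solved joint
# angle falls outside a plausible human range.
PLAUSIBLE_DEGREES = {
    'clavicle_elevation': (-10.0, 45.0),
    'shoulder_abduction': (0.0, 180.0),
}

def rom_violations(frames):
    """frames: a list of {joint_name: angle_in_degrees} dicts."""
    problems = []
    for index, frame in enumerate(frames):
        for joint, (low, high) in PLAUSIBLE_DEGREES.items():
            angle = frame.get(joint)
            if angle is not None and not low <= angle <= high:
                problems.append((index, joint, angle))
    return problems

print(rom_violations([{'clavicle_elevation': 60.0}]))  # [(0, 'clavicle_elevation', 60.0)]
```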

“Our performers were running around, jumping off of decks, dying, getting shot in order to build up a complete collection of animations,” said Richard. “The motion was all recorded so it could be ready to use in our pipeline, and Shōgun helped by allowing us to create animation vignettes that could be placed directly into shots.”

Automated Marker Labelling

The software also automated the entire marker labelling process. “Better automated labelling massively reduces our need to track data manually,” Richard said. “The step that has actually been automated is the one that generates and correctly labels each mocap marker’s position in 3D space. From this marker cloud we transfer the motion to the character skeleton.” The entire process, from Live Subject Calibration to outputting clean data, can be completed within one day of the shoot.
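To give a feel for what automated labelling involves, here is a toy frame-to-frame version in Python: each unlabelled 3D point inherits the label of the nearest labelled marker from the previous frame. Shōgun's actual solver uses far more robust constraints, as described below; this only illustrates the bare concept, and all names and thresholds are invented.

```python
import math

def propagate_labels(prev_labelled, new_points, max_jump=0.05):
    """Assign labels to new points by nearest previous-frame marker.

    prev_labelled: {label: (x, y, z)} from the last frame.
    new_points: unlabelled (x, y, z) tuples from the current frame.
    max_jump: largest plausible per-frame marker move, in metres.
    """
    labelled = {}
    for point in new_points:
        best_label, best_dist = None, max_jump
        for label, prev_pos in prev_labelled.items():
            dist = math.dist(point, prev_pos)
            if dist < best_dist and label not in labelled:
                best_label, best_dist = label, dist
        if best_label is not None:
            labelled[best_label] = point
    return labelled  # points left unmatched would go to manual clean-up
```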


Tim Doubleday, VFX Product Manager for Vicon, described this update in more detail. “Shōgun Live can record the real-time data direct to disk as an .mcp, or mocap, file. In the past, you’d have to run time-consuming batch processes to reconstruct and label the data. Now, what you see is what you get, so all but the most complex shots can be delivered without any manual clean-up. This saves hours, if not days, of time in post.

“Previously, the software would get confused and mislabel markers during complex interactions. For a demonstration capture session, I recently took part in an eight-person pile-up. I was at the bottom of the pile, but Shōgun still managed to label me correctly. This is all down to a robust set of constraints that maintains plausible results.

“Shōgun’s focus on real-time processing extends to subject calibration. In the past, the performer would go through their range of motion, and then the technician would have to run a series of manual processes to calibrate them so that the system would recognise them. Shōgun Live carries out the subject calibration in real time as the range of motion is taking place. Once it is complete, the performer is ready to use and automatically appears in the system.”
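One way to picture what a live calibration can compute during the range of motion: markers on the same rigid segment stay a near-constant distance apart while the performer moves, so segment lengths can be accumulated frame by frame. The sketch below is a conceptual illustration under that assumption, not Vicon's method; all names and structures are invented.

```python
import math
import statistics

def calibrate_segments(frames, segment_pairs):
    """Estimate segment lengths from marker-pair distances during a ROM.

    frames: a list of {marker_name: (x, y, z)} dicts.
    segment_pairs: {segment_name: (marker_a, marker_b)}.
    """
    samples = {segment: [] for segment in segment_pairs}
    for frame in frames:
        for segment, (a, b) in segment_pairs.items():
            if a in frame and b in frame:
                samples[segment].append(math.dist(frame[a], frame[b]))
    # The median is robust to occasional occluded or mislabelled frames.
    return {seg: statistics.median(d) for seg, d in samples.items() if d}
```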

In total, Framestore completed three separate orders for 'Thor: Ragnarok'. Each order, from the time Marvel Studios called to the delivery of the completed data, took less than five days.

Visualising a Fire Demon


When Marvel Studios asked Framestore to create the giant fire demon Surtur, the team used real-time motion capture in the planning stages. Setting up a configuration of Vicon optical cameras, they shot at 240fps, twice the standard rate of 120fps, to capture as much detail as possible. Shōgun was then used for visualisation, displaying a rough impression of how the gigantic monster, clothed in flames, would look as it smashed through digital terrain, helping the team make creative decisions as they worked through the shots.

“On any big budget film that is heavy in effects, the schedule is going to be tight,” said Gerardo. “Shōgun gave us the confidence to deliver motion capture into our animation pipeline quicker than we’ve been able to before. That creates more options for us, and by extension, for filmmakers.”