StarTrackerVFX Visualises Composites with Dual-Render Workflow

[Image: Mo-Sys demo at IBC2018. The screen on the right shows the real-time render of the Xsens motion capture data animating an avatar.]

Mo-Sys designed a virtual production workflow to use on set that combines its own camera tracking system StarTrackerVFX with live motion capture, a real-time game engine and ray-tracing render software. The resulting system makes it possible to visualise 3D scenes built in a game engine composited with 2D green screen elements, or 3D characters animated with motion capture data, with a high level of accuracy.

The motion of an actor performing in an Xsens motion capture suit on a green screen stage is captured into solving software, while the camera tracking data is simultaneously captured with Mo-Sys StarTrackerVFX. The camera and performance data are fed into Unreal Engine 4 in real time, and the shot is automatically re-rendered in real time or near real time in V-Ray ray-tracing render software from Chaos Group.
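
To make that data flow concrete, here is a minimal sketch of what one combined frame might look like. All of the class, field and function names are hypothetical illustrations; in practice the data arrives through the Mo-Sys and Xsens plug-ins for Unreal, not through an interface like this.

```python
# Hypothetical sketch of the per-frame data flow: one camera tracking sample
# and one mocap frame are bundled into a single update for the engine.
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class CameraTrackingSample:                      # illustrative StarTrackerVFX-style fields
    position: Tuple[float, float, float]         # studio-space translation, metres
    rotation: Tuple[float, float, float]         # pan, tilt, roll in degrees
    focal_length_mm: float
    focus_distance_m: float

@dataclass
class MocapFrame:                                # illustrative solved-mocap fields
    root_position: Tuple[float, float, float]
    joint_rotations: Dict[str, Tuple[float, float, float]]  # per-joint Euler angles

def update_engine_frame(camera: CameraTrackingSample, mocap: MocapFrame) -> dict:
    """Bundle one frame of camera tracking and performance data for the engine."""
    return {
        "virtual_camera": {
            "transform": (camera.position, camera.rotation),
            "lens": (camera.focal_length_mm, camera.focus_distance_m),
        },
        "avatar_pose": {
            "root": mocap.root_position,
            "joints": mocap.joint_rotations,
        },
    }

# Example: one synthetic frame.
frame = update_engine_frame(
    CameraTrackingSample((1.2, 0.0, 1.6), (15.0, -3.0, 0.0), 35.0, 4.5),
    MocapFrame((0.0, 0.0, 0.0), {"spine": (0.0, 5.0, 0.0)}),
)
print(frame["virtual_camera"]["lens"])           # (35.0, 4.5)
```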

Live Compositing

Each component of this workflow helps productions to make pro-active decisions about using CG elements in their projects while they are still on set, instead of waiting to find out what is possible during post-production.

Based on the original StarTracker camera tracking system, Mo-Sys' newer StarTrackerVFX used in this demo includes a direct plug-in for Unreal Engine that allows you to place live-action elements or animated CG into photo-realistic environments, and has a built-in chroma-keyer for pre-visualisation. Specifically modified for virtual production and VFX-heavy shooting, it is useful for on-set visualisation, scene blocking in virtual worlds and data recording and, as described in this article, it supports mocap integration.
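
As an illustration of the pre-visualisation keying step, the sketch below pulls a crude green-dominance matte and composites the foreground over a CG background. It only shows the general technique; the keyer built into StarTrackerVFX is Mo-Sys' own and its method is not documented here.

```python
# Minimal green-screen composite for illustration only.
import numpy as np

def green_screen_composite(fg: np.ndarray, bg: np.ndarray,
                           dominance: float = 1.15) -> np.ndarray:
    """Composite a green-screen foreground (H, W, 3 float RGB in 0..1) over a
    CG background of the same shape, using a crude green-dominance matte."""
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # A pixel is treated as 'screen' where green clearly dominates red and blue.
    is_green = (g > dominance * r) & (g > dominance * b)
    alpha = np.where(is_green, 0.0, 1.0)[..., None]      # 1 = keep foreground
    return alpha * fg + (1.0 - alpha) * bg

# Tiny synthetic example: a red 'subject' on green, composited over grey CG.
fg = np.zeros((4, 4, 3)); fg[..., 1] = 1.0               # all green screen
fg[1:3, 1:3] = [1.0, 0.0, 0.0]                           # red subject
bg = np.full((4, 4, 3), 0.5)
out = green_screen_composite(fg, bg)
print(out[0, 0], out[1, 1])                              # grey background, red subject
```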

[Image: Mo-Sys StarTrackerVFX]

Mo-Sys’ integration with V-Ray also makes the workflow more practical for virtual production by producing a render that sits much closer to the final image. Projects with smaller budgets – such as TV series, commercials, corporate videos, architectural visualisations and independent films – can view a superior-quality render in real time or near real time, in one simple workflow.

Real-time Camera Tracking

The StarTracker hardware, which is fitted to the top of your camera, references small, identical retro-reflective stickers called 'stars' that are attached to the studio ceiling or overhead structures in a random pattern, at any height. A small LED light and sensor, mounted on the studio camera, shines light on the stars, each of which directs light back only to the sensor due to its retro-reflectivity. Taken together, these reflections define a very accurate star map that the StarTracker constantly refers to as it reports the position and orientation of the studio camera in real time to the external rendering engine. The stars' retro-reflectivity also prevents the mapping process from being affected by studio lighting.
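
The geometry behind this kind of tracking can be sketched as a standard perspective-n-point solve: given the known 3D star map and the 2D positions of the stars seen by the upward-facing sensor, the camera pose follows. The example below uses OpenCV on synthetic data; StarTracker's actual star identification and solver are proprietary, so this only illustrates the principle.

```python
# Recover a camera pose from a known 'star map' using a PnP solve on
# synthetic observations (illustrative only).
import numpy as np
import cv2

# Hypothetical star map: ceiling stickers in metres (x, y, height).
star_map = np.array([[0.0, 0.0, 4.0], [1.5, 0.2, 4.1], [0.3, 2.0, 3.9],
                     [2.2, 1.8, 4.0], [1.0, 1.0, 4.2], [2.0, 0.5, 3.95]])

K = np.array([[800.0, 0.0, 320.0],        # assumed intrinsics of the tracking sensor
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# Simulate observations from a known 'true' pose, then recover that pose.
rvec_true = np.array([0.05, -0.02, 0.1])
tvec_true = np.array([0.4, -0.3, 2.0])
observed, _ = cv2.projectPoints(star_map, rvec_true, tvec_true, K, dist)

ok, rvec, tvec = cv2.solvePnP(star_map, observed, K, dist)
print(ok, np.round(tvec.ravel(), 3))      # ~ [ 0.4 -0.3  2. ]
```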

StarTracker supports free camera movement and outputs accurate position, rotation and lens data in real time. The camera operator can move to any position in the studio, as long as enough stars are in sight of the tracking camera. Since the tracking sensor points up instead of into the scene, tracking is unaffected by studio conditions such as moving objects, set or lighting changes, reconfigurations, reflections and green screen sets.

[Image: Xsens motion capture suit and data]

Mapping and calibration need to be performed only once – other cameras in the studio can track using the same star map – and from there tracking proceeds automatically. Because StarTracker is always referencing itself against its star map, its position is absolute and does not drift.

Performance Tracking

Mo-Sys chose Xsens for this workflow because it is an inertial motion capture system, which they wanted to use instead of an optical system for several reasons. The Xsens IMUs (Inertial Measurement Units) have an accelerometer, gyroscope and magnetometer that, once configured to match the performer's body, accurately track and record body movements without the markers and cameras that optical systems rely on to capture relative motion from frame to frame. Inertial systems only require a few sensors that function wirelessly without cables attaching users to a computer.
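
A minimal sensor-fusion sketch for a single tilt angle is shown below, assuming one idealised IMU: the gyroscope rate is integrated for responsiveness and blended with the accelerometer's gravity reference so the estimate does not wander. Xsens' full-body solver is far more sophisticated, with per-segment sensors, magnetometer heading and a biomechanical model; this only shows the basic idea of fusing the two signals.

```python
# Complementary filter for one tilt angle (illustrative, not the Xsens solver).
import math

def complementary_filter(gyro_rate, accel, angle, dt, k=0.98):
    """Update a tilt angle (radians) from one gyro axis (rad/s) and a 2D
    accelerometer reading (ax, az) that observes gravity."""
    angle_gyro = angle + gyro_rate * dt           # fast but drifts
    angle_accel = math.atan2(accel[0], accel[1])  # absolute but noisy
    return k * angle_gyro + (1.0 - k) * angle_accel

# Example: stationary sensor, gyro reporting a small bias of 0.01 rad/s.
angle = 0.0
for _ in range(1000):                             # 10 s at 100 Hz
    angle = complementary_filter(0.01, (0.0, 9.81), angle, dt=0.01)
print(round(angle, 4))   # stays near 0 instead of drifting towards ~0.1 rad
```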

Inertial mocap also measures linear and angular acceleration, that is, the rate at which velocity changes over time. While optical systems derive motion from the marker positions in each captured frame, the inertial system is concerned only with the tracked changes in the speed and direction of motion, producing a clean result, and it can capture height data as well as horizontal movement.
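
In practice this means displacement, including height, is obtained by integrating the measured acceleration – once to get velocity and again to get position. The sketch below shows that integration in its simplest form, assuming ideal, noise-free sensors; real solvers such as Xsens' add sensor fusion and a biomechanical model on top of it.

```python
# Double integration of acceleration into displacement (ideal-sensor sketch).
def integrate_motion(accels, dt):
    """accels: list of (ax, ay, az) in m/s^2; returns final displacement (m)."""
    vx = vy = vz = 0.0
    x = y = z = 0.0
    for ax, ay, az in accels:
        vx += ax * dt; vy += ay * dt; vz += az * dt   # acceleration -> velocity
        x += vx * dt;  y += vy * dt;  z += vz * dt    # velocity -> displacement
    return x, y, z

# Example: 0.5 s of upward acceleration at 4 m/s^2, then 0.5 s of braking,
# sampled at 100 Hz.
samples = [(0.0, 0.0, 4.0)] * 50 + [(0.0, 0.0, -4.0)] * 50
print([round(v, 2) for v in integrate_motion(samples, dt=0.01)])  # ends ~1 m higher
```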

When Digital Media World met technical director James Uren at the Mo-Sys demonstration of the workflow at IBC2018, he said, “One problem with inertial suits is that they drift over time from their initial calibration and they need to be recalibrated. They don’t have any concept of where they are in the real world because all data is measured from a start point.

[Image: Ikinema plug-in for Unreal Engine]

“To overcome this problem, we’ve created a prototype version of the StarTracker device, which sits on the belt of the performer, to calibrate the suit with StarTracker and correct the drift. The arrangement relocates the motion capture data into the same 3D space as the tracked cameras, which not only solves the drift issue but also reduces the need to recalibrate during shooting."
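
Conceptually, the correction amounts to re-anchoring the mocap skeleton to the absolute position reported by the belt-mounted tracker. The sketch below illustrates that re-anchoring with hypothetical function and field names; it is not Mo-Sys' actual interface.

```python
# Re-anchor a drifted mocap skeleton to an absolute, tracked root position
# (hypothetical names, illustrative only).
def correct_drift(joint_positions, mocap_root, tracked_root):
    """Shift every joint by the offset between the drifted mocap root and the
    absolute root position reported by the belt-mounted tracker."""
    offset = tuple(t - m for t, m in zip(tracked_root, mocap_root))
    return {name: tuple(p + o for p, o in zip(pos, offset))
            for name, pos in joint_positions.items()}

# Example: the suit believes the hips are at (0.3, 0.0, 1.0) but the tracker
# places them at (2.1, 0.5, 1.0) in studio space.
pose = {"hips": (0.3, 0.0, 1.0), "head": (0.3, 0.0, 1.7)}
print(correct_drift(pose, pose["hips"], (2.1, 0.5, 1.0)))
# {'hips': (2.1, 0.5, 1.0), 'head': (2.1, 0.5, 1.7)}
```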

Post Production Rendering

The Ikinema plug-in was used to bring the Xsens data into Unreal Engine and render in real time. Ikinema's LiveAction software retargets the Xsens performance data onto 3D avatars to animate them in the virtual world, producing matched character performances during live takes, and is integrated with Unreal Engine to support live virtual production pipelines.
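
At its simplest, retargeting means mapping the captured per-bone motion onto the corresponding bones of the target avatar. The sketch below shows only that name-mapping step, with illustrative bone names; Ikinema LiveAction performs full-body IK retargeting that also compensates for differing skeleton proportions.

```python
# Copy per-bone rotations from a source skeleton to a target avatar via a
# name map (illustrative names, not the Ikinema or Xsens naming).
XSENS_TO_AVATAR = {
    "Pelvis": "hips",
    "T8": "spine_03",
    "RightUpperArm": "upperarm_r",
    "RightForeArm": "lowerarm_r",
}

def retarget(source_rotations):
    """Map per-bone rotations (e.g. Euler angles) from source to target names."""
    return {XSENS_TO_AVATAR[bone]: rot
            for bone, rot in source_rotations.items()
            if bone in XSENS_TO_AVATAR}

frame = {"Pelvis": (0.0, 0.0, 5.0), "RightUpperArm": (30.0, 10.0, 0.0)}
print(retarget(frame))  # {'hips': (0.0, 0.0, 5.0), 'upperarm_r': (30.0, 10.0, 0.0)}
```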

James said, “However once this data is in Unreal, you can also export it as an FBX file and use it in post-production workflows, which saves time and money. V-Ray also has a direct plugin for Unreal, which means all of our data that we recorded using StarTracker with our own Unreal plugin can be exported into V-Ray, where the FBX information can either be re-rendered or imported directly into other compositing packages such as Maya or Nuke. The 're-keying' process refers to pulling the key again in post.”   

[Image: V-Ray for Unreal]

The V-Ray for Unreal plug-in is now in version 2 of its public beta, and you can use it both to bring V-Ray scenes from 3ds Max, Maya and SketchUp directly into the Unreal Editor and to render ray-traced, photorealistic images with V-Ray directly from Unreal. It does this by automatically converting V-Ray materials to approximate Unreal materials, then using the original V-Ray materials at render time. V-Ray is recognised for its handling of light, textures and materials in 3D scenes for visualisation and visual effects workflows.
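
The sketch below gives a loose idea of what such a material approximation can look like; the parameter names and the glossiness-to-roughness conversion are simplified assumptions for illustration, not the plug-in's actual conversion rules.

```python
# Approximate a V-Ray-style material with Unreal-style parameters
# (simplified, illustrative mapping only).
def approximate_unreal_material(vray_mtl: dict) -> dict:
    return {
        "BaseColor": vray_mtl.get("diffuse", (0.5, 0.5, 0.5)),
        # V-Ray describes how sharp reflections are (glossiness);
        # Unreal's physically based shading uses roughness, its inverse.
        "Roughness": 1.0 - vray_mtl.get("reflection_glossiness", 1.0),
        "Metallic": vray_mtl.get("metalness", 0.0),
    }

print(approximate_unreal_material(
    {"diffuse": (0.8, 0.1, 0.1), "reflection_glossiness": 0.75}))
# {'BaseColor': (0.8, 0.1, 0.1), 'Roughness': 0.25, 'Metallic': 0.0}
```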

Chaos Group’s V-Ray for Unreal product manager Simeon Balabanov commented, “StarTrackerVFX and V-Ray for the Unreal Engine gives users the best of both worlds by simplifying and speeding up a usually complex process. It allows users to render ray-traced, photorealistic images with V-Ray directly from the Unreal Engine, which cuts out the additional time and cost normally associated with filming on a green screen.”   www.mo-sys.com