Dimension and MRMC put their skills together to bring intriguing holographic effects to life in Season 2 of HBO’s Avenue 5, starting with volumetric capture inside MRMC’s Polymotion Stage.

Dimension Studio and MRMC have combined their skills to bring to life, and add realism to, some of the most interesting visual effects seen in Season 2 of HBO’s TV series Avenue 5.

As a fast-paced comedy taking place on board a luxury interplanetary cruise ship, the series is already full of futuristic effects. But Season 2 also introduced an intriguing kind of person-to-person communication involving small holograms of the characters, projected from wristwatches, holographic screens and desks.

Mobile 3D Production

The holograms were created and animated by transforming the live-action actors into realistic 3D holograms using volumetric capture techniques. The pipeline team at VFX company UNIT then worked with the volumetric assets, designing a robust pipeline to integrate the projected 3D characters into the TV footage.

Dimension, who specialise in volumetric content and virtual production, began asset creation inside MRMC’s Polymotion Stage. The stage is equipped with an array of 106 cameras that simultaneously capture every minute detail of the actor from multiple angles, as they stand in the centre and perform as required.

Polymotion Stage is, in fact, a mobile studio housed in a vehicle and purpose-designed for the creation of volumetric video. The vehicle travels to wherever the talent are located and expands into a capture space. The resulting outputs are MP4 video and OBJ files, which means they are small, flexible and can be used in various traditional media including broadcast and 2D video, or in augmented and virtual reality applications.
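
Since the deliverables are standard OBJ files paired with video textures, they can be read by almost any tool. As a minimal sketch, and assuming nothing about the actual Polymotion Stage naming conventions, a single frame of such an OBJ sequence could be inspected in Python like this (the file names frame_0001.obj and frame_0001.png are purely illustrative):

```python
# Minimal sketch: reading one frame of a hypothetical OBJ sequence produced by
# a volumetric capture pipeline. File names are illustrative only, not the
# actual Polymotion Stage output naming.

def load_obj(path):
    """Parse vertex positions (v), texture coordinates (vt) and faces (f) from an OBJ file."""
    vertices, uvs, faces = [], [], []
    with open(path, "r") as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == "v":          # geometric vertex: x y z
                vertices.append(tuple(float(x) for x in parts[1:4]))
            elif parts[0] == "vt":       # texture coordinate: u v
                uvs.append(tuple(float(x) for x in parts[1:3]))
            elif parts[0] == "f":        # face: v/vt[/vn] index triples, 1-based
                faces.append([int(token.split("/")[0]) for token in parts[1:]])
    return vertices, uvs, faces

if __name__ == "__main__":
    verts, uvs, faces = load_obj("frame_0001.obj")
    print(f"{len(verts)} vertices, {len(uvs)} UVs, {len(faces)} faces")
    # The matching texture (e.g. frame_0001.png) would be applied via the UVs
    # in whichever engine or DCC tool consumes the asset.
```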

MRMC, best known for their motion control gear, robotics, camera slides, arms and tripods, are the developers of Polymotion Stage. They work with Dimension who use the stage to serve clients that need volumetric characters for crowd FX, holograms and other realistic 3D work.

Inside Polymotion

Due to the number of cameras involved, the setup generates over 10 GB of footage per second for the team to process, but it gives VFX teams a huge amount of control over angles and camera paths, and ultimately produces characters that are true 360° virtual humans. The approach not only widened the role the communication devices could play in the story but also gave the production crew more creative freedom in presenting the holograms in the various scenes.
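
A back-of-the-envelope calculation shows how quickly 106 cameras reach that data rate. The resolution, bit depth and frame rate below are assumptions for illustration only, not published Polymotion Stage specifications:

```python
# Back-of-the-envelope raw data rate for a 106-camera array.
# Resolution, bit depth and frame rate are assumed values, not published specs.

cameras = 106
width, height = 2448, 2048      # assumed ~5 MP sensors
bits_per_pixel = 10             # assumed raw bit depth
fps = 30                        # assumed capture frame rate

bytes_per_frame = width * height * bits_per_pixel / 8
bytes_per_second = bytes_per_frame * fps * cameras

print(f"~{bytes_per_second / 1e9:.1f} GB of raw footage per second")
# With these assumptions the array produces roughly 20 GB/s, so a figure of
# "over 10 GB per second" is plausible even at more modest settings.
```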

The 106 video cameras that make up the array inside the stage are made by IO Industries – 53 RGB models and 53 infrared. Of the total, 96 cameras are positioned around the capture volume facing inwards, and 10 are above pointing down. The RGB cameras read and record the colour required for the .png texture map, and the infrared cameras record depth and position in space for creating the mesh.
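
One way to picture how those two streams fit together per frame is sketched below. The class and field names are hypothetical; the real pipeline's data layout is not public, and this only mirrors the camera counts and roles described above:

```python
# Schematic sketch of how per-frame data from the camera array might be grouped
# before reconstruction. Class and field names are hypothetical.

from dataclasses import dataclass, field
from typing import List

@dataclass
class CameraSample:
    camera_id: int
    modality: str        # "rgb" (colour for the .png texture) or "ir" (depth/position for the mesh)
    placement: str       # "ring" (facing inwards) or "overhead" (pointing down)
    image_path: str      # path to the recorded frame on disk

@dataclass
class CaptureFrame:
    frame_index: int
    samples: List[CameraSample] = field(default_factory=list)

    def by_modality(self, modality: str) -> List[CameraSample]:
        return [s for s in self.samples if s.modality == modality]

# Example layout matching the article's description: 106 cameras in total,
# 96 around the volume and 10 overhead, split evenly between RGB and infrared.
frame = CaptureFrame(frame_index=0)
for cam_id in range(106):
    frame.samples.append(CameraSample(
        camera_id=cam_id,
        modality="rgb" if cam_id % 2 == 0 else "ir",
        placement="ring" if cam_id < 96 else "overhead",
        image_path=f"frame_0000/cam_{cam_id:03d}.raw",
    ))

print(len(frame.by_modality("rgb")), "RGB views,", len(frame.by_modality("ir")), "IR views")
```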

The actors were filmed in the volumetric stage with global illumination over green screen. Once the volumetric assets were processed, the post-production team worked with the show's creators to select angles and camera paths according to the scene requirements. They then composited the assets into the footage, handling the lighting effects during compositing so that the hologram could feature in the final image from any perspective.

Data Processing to Final Render – a Volumetric Transformation

Because working with Polymotion Stage and the volumetric assets it produces involves a number of steps carried out either simultaneously or in a specific order, MRMC and Dimension shared further details about their work.

Dimension has an on-set team that operates the stage and leads the volumetric capture. Dimension’s technical artists process small frame ranges of data to produce a 3D virtual human that can be rigged and animated for authentic movement matching the actor’s performance. A second team creates character templates and performs in-depth optimisation of the volumetric assets.

To do this work, the teams use Microsoft’s Mixed Reality Capture solver and proprietary Microsoft software, which processes the 2D content captured on set into a 3D model. This model was then handed off to UNIT and HBO to use in their VFX workflow in post.
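
The solver itself is proprietary and its interface is not public, so the sketch below only illustrates the overall shape of this step (multi-view 2D frames in, a textured mesh per frame out) and how a small frame range might be batched. Every function, pattern and file name here is a hypothetical stand-in, not the actual Microsoft or Dimension tooling:

```python
# Illustrative only: the Mixed Reality Capture solver's interface is not public,
# so solve_frame() is a hypothetical stand-in for the reconstruction step that
# turns multi-view 2D captures into a textured 3D mesh.

from pathlib import Path

def solve_frame(rgb_views, ir_views, out_dir: Path, frame_index: int):
    """Hypothetical per-frame reconstruction: multi-view images in, OBJ + PNG out."""
    out_dir.mkdir(parents=True, exist_ok=True)
    mesh_path = out_dir / f"frame_{frame_index:04d}.obj"
    texture_path = out_dir / f"frame_{frame_index:04d}.png"
    # ... the proprietary solver would reconstruct geometry and texture here ...
    return mesh_path, texture_path

def process_frame_range(capture_root: Path, start: int, end: int, out_dir: Path):
    """Batch a small frame range, as the article describes the technical artists doing."""
    results = []
    for i in range(start, end):
        frame_dir = capture_root / f"frame_{i:04d}"
        rgb_views = sorted(frame_dir.glob("rgb_*.png"))   # hypothetical naming
        ir_views = sorted(frame_dir.glob("ir_*.raw"))     # hypothetical naming
        results.append(solve_frame(rgb_views, ir_views, out_dir, i))
    return results
```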

The optimisation step needs to be done carefully to make sure the client receives the best possible assets in the final file formats they require. Some rendering can be done by the technical art team on set so the client can review performances, but the final rendering is done in post in the weeks following the shoot.

These renders were supplied to UNIT and HBO for Avenue 5 and delivered as a 3D virtual human, a replica of the actor’s performance in the volumetric studio that allowed the VFX team to choose the point of view and show the holographic performance from any angle.   www.mrmoco.com