Mirada Turns FX’s ‘The Strain’ into Shared VR Experience
Mirada, the design, post and production studio, and Headcase, virtual reality specialists, recently created an immersive VR experience to promote season two of the FX Networks horror drama series ‘The Strain’, originally developed by Mirada’s co-founder Guillermo del Toro and writer Chuck Hogan.
‘The Strain’ VR experience follows the story’s character Vasiliy Fet, a former Ukrainian rat exterminator, as he leads viewers on a 360º journey through an abandoned warehouse, while under constant threat of attack from vampires. Mirada built a custom VR application to run the project wirelessly across six Samsung GearVR headsets, allowing several viewers to enter the VR experience at the same time. Their application can also play back real-time, vision-distorting effects inside the VR headset, simulating blinking eyes and tunnel vision to intensify the suspense and realism.
Adapting VR Production
Mirada monitored the VR pipeline for the entire project and worked alongside Headcase from the start of pre-production. Andy, Interactive and New Media Director and TD at Mirada, said that the studio has been working hard to make VR as familiar and similar to traditional productions as possible. “For this project we had a script writing and pre-production phase that was identical to a normal production,” he said.
“We didn’t storyboard or previz the shoot, but the network and creatives from the existing TV series were involved with the creative development and visual directives. In fact, the production was nearly indistinguishable from a normal shoot right up until it was time to roll cameras. Staging action and production design for a 360º view is obviously much more involved than normal – for example, everyone had to leave the set or find something to hide behind – but it’s pretty easy to understand and work with, and the crew and the actor Kevin Durand adapted to it immediately.”
Live action was captured using a cinema-grade spherical 17-camera rig developed by Headcase, while Andy worked on set as technical and creative supervisor. “The Headcase rig is on a motorized robotic platform,” he said. “It was certainly an odd sight to see the camera driving itself around the set, but otherwise everything was the same as if the camera only had one lens instead of 17.”
This version of the Headcase rig uses 17 Codex Action Cams, which are small but powerful. They shot this experience at 30fps due to smartphone resolution and frame rate limitations, although the cameras can shoot at 60fps if desired. However, what they favoured most were the Action Cam’s latitude of up to 11 stops, its global shutter, the high data rate of the Codex recorders and the quality of their .cdx file format.
As a combined package, this provided a lot more range and detail than other, equally compact, spherical camera arrays, and especially proved its worth during the compositing and final DI stages. “We were able to push the footage much further in post than we have previously been able to with spherical video. It was much closer to what we are used to working with when shooting on high-end digital cinema cameras than what we get from lower-quality POV cameras,” said Andy.
He feels that the most important aspect of using any camera array to capture 360º video is understanding the rig’s limitations, and helping everyone to work within them. For instance, physical consequences for the viewer arise if the operator moves the camera too quickly, rolls the horizon, or doesn’t move in a straight line. Likewise, every camera array has a distance limit for how close things can approach the lenses without requiring heavy cleanup in post to remove stitching artifacts.
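That close-approach limit comes from parallax: adjacent lenses in the array see a nearby subject from slightly different positions, and the resulting pixel disparity is what the stitcher has to hide. The relationship can be sketched with a small calculation – the function and all numbers below are illustrative assumptions for a generic spherical rig, not Headcase specifications:

```python
def min_subject_distance(baseline_m, focal_px, max_disparity_px):
    """Approximate closest subject distance before stitching parallax
    exceeds the pixel disparity a stitcher can blend invisibly.
    Uses the small-angle parallax relation:
        disparity_px ~= focal_px * baseline_m / distance_m
    """
    return focal_px * baseline_m / max_disparity_px

# Illustrative numbers only: 6 cm between adjacent lens centers,
# ~1000 px effective focal length, and a 20 px disparity budget.
d = min_subject_distance(0.06, 1000, 20.0)
print(f"keep subjects beyond roughly {d:.1f} m of the rig")
```

Doubling the lens baseline doubles the safe distance, which is why compact arrays like this one can let action come closer to camera than larger rigs.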
“These were my primary concerns on set, as well as keeping an eye out for production design or lighting issues that could be solved during production to avoid a lot of clean up work in post. The goal of this experience was to make it as practical as possible – that is, every final shot is from a single take, which meant we had to absolutely nail every aspect of each shot,” Andy said.
Wireless Sync – Single POV, Many Users
As far as they are aware, this is the first VR app to wirelessly sync the experience among multiple headsets. While viewers were not visually inserted into each other’s POV, the fact that they were sharing the same experience at the same time had an effect similar to what an audience experiences in a movie theatre or at a live event – human presence can amplify emotions in a crowd. Mirada considers that creating a group experience with a VR app is going to be a powerful way to get an audience on board for this new medium, and the more they can do to make it a less isolating or ‘one at a time’ experience, the more people will enjoy it.
Only one POV was created for this experience – each headset was showing the same content to viewers. The multiple-viewer aspect came from the fact that each person was watching the same thing at the same time, and their experience of it was in sync. Syncing the headsets was also important for enabling the rumble packs in each seat to hit at the exact right moments to add a little ‘jump’ to specific scares.
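One plausible way to keep headsets and seat rumble packs on a shared timeline – our guess at the general approach, since the article doesn’t describe Mirada’s actual protocol – is for a master device to broadcast a common start time, with each client deriving its playhead from that clock rather than streaming video between devices:

```python
import time

class SyncedPlayback:
    """Each client derives its playhead from a shared wall-clock start
    time announced by a master device, so all headsets (and the seat
    rumble packs) hit the same frame at the same moment."""

    def __init__(self, start_epoch, fps=30):
        self.start_epoch = start_epoch  # master's announced start time
        self.fps = fps                  # the experience was shot at 30fps

    def current_frame(self, now=None):
        now = time.time() if now is None else now
        elapsed = max(0.0, now - self.start_epoch)
        return int(elapsed * self.fps)

    def drift_frames(self, reported_frame, now=None):
        """How far a headset has drifted from the shared timeline;
        a client would nudge its decoder to close this gap."""
        return reported_frame - self.current_frame(now)

# All clients that received the same start time agree on the frame:
p = SyncedPlayback(start_epoch=1000.0)
print(p.current_frame(now=1002.5))     # frame index at t+2.5s
print(p.drift_frames(73, now=1002.5))  # negative = running behind
```

In practice the start time would be broadcast over local Wi-Fi and clock offsets measured per client, but the core idea is the same: sync the clock once, then let every device play deterministically.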
The real-time image effects, heaviest at the beginning and end of the experience, were pre-animated to match the timings in the video content. Andy said, “We primarily added tunnel vision with vignettes on each eye, so that they track with the viewer’s head, and a blurred vision effect using motion blurs that are triggered by the viewer’s head movement.
“We also added eyelids that close and open using animated geometry to create a blurred ‘coming in and out of consciousness’ effect, further simulating a subjective POV for the viewer. We see a huge potential for effects like this on future projects, and have tried and like several of them. Although we didn’t use them this time, we certainly will whenever they fit the story.”
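The two driving signals described above – head motion triggering blur, and a pre-animated timeline driving the vignette – can be sketched as simple per-frame functions. The function names and constants here are our own illustration, not Mirada’s implementation or the GearVR SDK:

```python
def blur_strength(yaw_prev_deg, yaw_now_deg, dt, gain=0.05, cap=1.0):
    """Map angular head speed (degrees/sec) to a 0..1 blur amount,
    so fast head turns produce the 'blurred vision' effect."""
    speed = abs(yaw_now_deg - yaw_prev_deg) / dt
    return min(cap, gain * speed)

def vignette_radius(t, keyframes):
    """Linearly interpolate a pre-animated vignette radius between two
    keyframes, matching the note that effects were keyed to the video
    timeline rather than computed on the fly."""
    (t0, r0), (t1, r1) = keyframes
    if t <= t0:
        return r0
    if t >= t1:
        return r1
    u = (t - t0) / (t1 - t0)
    return r0 + u * (r1 - r0)

print(blur_strength(0.0, 9.0, dt=0.033))               # fast turn, capped blur
print(vignette_radius(5.0, [(0.0, 0.2), (10.0, 1.0)]))  # tunnel opening up
```

A real implementation would evaluate these per eye in a shader, but the control logic – sample head pose, sample the timeline, feed both into the effect parameters – is this simple.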
Mirada stitched the footage from the various cameras together into 360º scenes for dailies that were sent back to Headcase for editorial. Once they had locked off the edit with FX Network, Mirada proceeded to final spherical stitching, coherent clean-up and compositing, custom spatial audio playback for the project and designing the final delivery application for users.
“The experience also starts with heavy real-time vignetting and blurred vision. Most viewers don’t even realize that the downward pole is blacked out by the time they can see this effect fully because they are beyond the ‘looking everywhere’ stage of VR acclimatization.”
The VR pipeline for ‘The Strain’ was developed using a suite of tools that Mirada initially designed and customized to power the Google Shop VR experience, the studio’s first immersive cinema project completed in April 2015. This pipeline hands off directly between the Headcase Codex cameras and Mirada’s stitching and cleanup pipelines.
“It is an evolving workflow. We use a lot of different pieces of software to create live-action content for VR, spanning the gamut of stitching, compression, warping, painting/patching/compositing, retiming, stabilizing, grading, denoising and so on,” Andy said. “We worked specifically to make sure that the same techniques and systems that we use to make commercials and films can be applied to this sort of work. Consequently, we can do anything in VR that we can do in traditional VFX or motion graphics.
“The interactive playback tools are less of a ‘be-all-end-all’ single VR platform, and more a wide suite of capabilities that can be combined to fit the creative needs of each project. For example, the Google Shop project required stereo spherical video playback, spatial audio and real-time navigation and interactivity, whereas ‘The Strain’ required stereo spherical video playback, spatial audio, real-time image effects and wireless sync. Our approach of creating modular systems means we can take the pieces we need ‘off the shelf’, and combine them to make each experience unique and able to perfectly support its narrative.”
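The modular approach Andy describes can be illustrated with a toy composition step: each project pulls the capabilities it needs from a shared shelf. The module names follow the article’s own list; the data structure is our sketch, not Mirada’s tooling:

```python
# Shelf of playback capabilities, per the article's descriptions.
MODULES = {
    "stereo_spherical_video": "decode and render stereo 360 video",
    "spatial_audio": "positional 3D audio playback",
    "realtime_navigation": "viewer-driven movement and interactivity",
    "realtime_image_fx": "vignette, blur and eyelid effects",
    "wireless_sync": "shared playback clock across headsets",
}

def build_experience(name, needed):
    """Assemble a project from off-the-shelf modules, failing loudly
    if a requested capability doesn't exist on the shelf."""
    missing = [m for m in needed if m not in MODULES]
    if missing:
        raise ValueError(f"unknown modules: {missing}")
    return {"name": name, "modules": sorted(needed)}

google_shop = build_experience("Google Shop VR",
    ["stereo_spherical_video", "spatial_audio", "realtime_navigation"])
the_strain = build_experience("The Strain VR",
    ["stereo_spherical_video", "spatial_audio",
     "realtime_image_fx", "wireless_sync"])
print(the_strain["modules"])
```

The payoff of this pattern is exactly what the quote claims: the two projects share most of their code while differing in the one or two modules that serve each narrative.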
To combine with the visuals, Mirada has been working with several spatial audio systems for recording, editing/mixing and playing back dimensional audio. For this project, they again joined Two Big Ears, interactive audio design specialists, aiming to push some new functions in their 3D audio engine, 3Dception, into production.
“Meanwhile, King Sound collaborated on The Strain VR’s score and sound effects, and successfully got 3Dception’s mixing and mastering tools working pretty quickly, allowing them to control the audio positioning and spatialization predictably and easily, similar to the experience that they would have on a film or TV project. The end result is a fully spatialized audio experience with sound effects, dialogue and music properly located in 3D space so you can hear them all around you as you look around.”
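The core of that spatialization – sounds staying fixed in world space as the viewer turns – can be shown with a minimal equal-power pan. This is a toy illustration of the principle, not 3Dception’s actual algorithm, which also handles elevation, distance and room modelling:

```python
import math

def stereo_gains(source_azimuth_deg, head_yaw_deg):
    """Re-pan a world-fixed sound source relative to the listener's
    head yaw using an equal-power pan law.
    0 deg = straight ahead, +90 = hard right, -90 = hard left."""
    rel = math.radians(source_azimuth_deg - head_yaw_deg)
    pan = max(-1.0, min(1.0, math.sin(rel)))  # -1 left .. +1 right
    angle = (pan + 1.0) * math.pi / 4.0       # map to 0 .. pi/2
    return math.cos(angle), math.sin(angle)   # (left_gain, right_gain)

# A source 90 deg to the viewer's right; turning toward it centers it:
print(stereo_gains(90.0, 0.0))   # almost entirely right channel
print(stereo_gains(90.0, 90.0))  # equal left/right once facing it
```

Recomputing these gains every frame from head tracking is what makes dialogue, effects and music feel “properly located in 3D space” as the viewer looks around.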
Andy reflected that on every project, Mirada takes on at least one aspect that is brand new. For The Strain VR, there were several. “Because no one had used the Headcase Action Cam array before, we were happy to be working with it - and were especially pleased with the data we got from it. It was a big leap in resolution, clarity and latitude compared to other arrays we have used, and has some really nice features like global shutters and synced timecode. It is the first truly professional spherical camera available, and it felt good to be able to push it the same way you can push a digital cinema camera.
“On the post production side, the wireless sync of all of the headsets was a technical but important innovation that has led to some interesting ideas that we’re eager to expand on in future productions. We also used some new software for stitching and cleanup, and helped push some of their new tools into production by giving feedback on making them even better for the next projects. The real-time image effects were something we had played with but hadn’t actually used yet. It was exciting to use them to create a more subjective POV.”

www.mirada.com