The Foundry’s Jon Starck Tells Stories in VR

Virtual reality is a hot topic in content production and consumption at the moment. CES 2016 was dominated by VR display systems like the Samsung Gear VR and Oculus Rift, with more on their way soon. Producers and audiences alike have been talking about VR for decades, and producers finally have the processing and graphics power to make it a reality.


Justin Lin’s ‘Help’ used just four cameras, which gave Lin the freedom to move the rig on set, but resulted in many challenges when stitching the images together in post.

But while the hardware and software systems are still coming together, Jon Starck, head of research at The Foundry, says there is arguably an even bigger challenge ahead - what are we going to do with them? What content will we be seeing on these devices? In particular, what are the opportunities for narrative content? Can we use VR to tell stories? Jon talked to Digital Media World about The Foundry’s ongoing efforts to support post production artists working on virtual reality projects.

Cinematic VR

“Using virtual reality for storytelling is what some people are calling ‘cinematic VR’,” Jon said. “It will probably be a combination of live action filming and CGI, but it will have a story running through it. We know the grammar of narrative movie-making - we have been doing it for more than 100 years. Within the field of view we use visual cues, such as focus, lighting and movement, to draw the eye and ensure our viewers are watching the critical points of the action.
“VR is different. The only point of going to a VR environment is to allow the viewer to explore a wider, perhaps a 360˚, field of view. But we do not want to risk letting the audience miss a key part of the action because they were looking the wrong way at the time.”
While the creative people work through these and many more decisions to develop content audiences will want to watch, the developers building the tools to support them have to follow closely to resolve the issues that could limit their creativity.


Jon Starck, head of research at The Foundry.

What camera rig is best, and what field of view? For narrative content, do you need a 360˚ x 180˚ view? If so, how many cameras do you need to capture it? Obviously, the more complex the camera rig, the more post production you will need to join all the camera outputs into an uninterrupted scene, and the more rig removal will be needed to clean it up.

Stitched Up

Jon said, “One of the most striking cinematic VR productions to date is Justin Lin’s ‘Help’, produced under the Google ATAP project. This production chose to use just four cameras, with fish-eye lenses. It gave Lin the freedom to move the rig around smoothly, but it generated a lot of challenges to sort out the geometries to stitch the pictures together.
“Stitching is the number one issue in post for VR. It is far from simply chopping out the right fields of view from the cameras and pushing them together edge to edge. It is impossibly distracting for viewers moving around a scene to see misalignments, ghost images or shifts in colour or focus. Parallax between lenses is a particular challenge because it is virtually impossible to align the entrance pupils of multiple lenses on a rig, and so this adjustment has to be done in post.”
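The seam problem Jon describes can be illustrated with a minimal sketch: where two adjacent camera views overlap, a hard cut at the seam turns any exposure mismatch into a visible line, so stitchers cross-fade ("feather") the pixel weights across the overlap instead. This is a toy illustration of the blending step only - it assumes the images have already been warped into a shared projection, and all names are illustrative, not The Foundry's implementation.

```python
def feather_blend(left, right, overlap):
    """Blend two 1-D scanlines of pixel values that share `overlap` pixels.

    `left` ends with the overlap region; `right` begins with it.
    The blend weight ramps linearly across the seam, so an exposure
    mismatch between the two cameras fades in rather than appearing
    as a hard edge.
    """
    if overlap == 0:
        return left + right
    body_left = left[:-overlap]
    body_right = right[overlap:]
    seam = []
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)          # weight ramps towards `right`
        a = left[len(left) - overlap + i]    # pixel from the left camera
        b = right[i]                         # same scene point, right camera
        seam.append((1 - w) * a + w * b)
    return body_left + seam + body_right


# Two scanlines with a slight exposure mismatch in the shared region:
cam_a = [100, 100, 100, 110, 110]   # last 2 pixels overlap
cam_b = [120, 120, 100, 100, 100]   # first 2 pixels overlap
print(feather_blend(cam_a, cam_b, overlap=2))
```

The same ramp applied in two dimensions, after geometric alignment, is the basis of the feathered seams most panorama stitchers produce.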
Jon also reminded artists that they may have to do all of these computations on high frame rate and high dynamic range content. While the display refresh rate must be high to be responsive, some technical commentators suggest that high frame rate capture may also be required. This becomes another part of the creative exploration of the medium. “The general message is that data files are going to be huge,” he said.

On-set Playback

“On the other hand, the director and DoP need instant feedback and verification that what they have shot will work. On a conventional shoot, they will look at a playback monitor. Directors and camera operators will be looking for some form of instant playback for VR production that gives them at least that much reassurance.

“What does this mean for companies developing post production equipment? You need to take the standard toolkit we have today - including CGI elements, set extensions and creative colour control - and add to it the new toolsets handling stitching and geometry correction.
“At the Foundry we have identified four key areas of development to pursue. First is what we call the camera solver. This figures out how the rig is laid out, calculating the precise distances and angles between the lenses, and the geometry of those lenses, to make sense of the data.
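The rig geometry the camera solver has to recover can be illustrated with a back-of-envelope calculation. For a ring rig with N evenly spaced cameras, each lens is yawed 360/N degrees from its neighbour, and each lens's horizontal field of view must exceed that spacing or the views will not overlap at all. This is a geometric sketch only - the lens angle used below is an assumed example, and a real solver estimates angles, translations and lens distortion from the footage itself rather than from the nominal layout.

```python
import math

def ring_rig_overlap(num_cameras, lens_fov_deg):
    """Per-seam angular overlap, in degrees, for an evenly spaced ring rig.

    A negative result means there is a gap between adjacent views that
    no amount of stitching can fill.
    """
    spacing = 360.0 / num_cameras    # yaw between neighbouring lenses
    return lens_fov_deg - spacing

# A four-camera rig with wide fish-eye lenses (185 degrees is an
# assumed figure, not a documented spec of any particular rig):
print(ring_rig_overlap(4, 185.0))   # generous overlap per seam

# Six cameras with 50-degree lenses would leave gaps:
print(ring_rig_overlap(6, 50.0))    # negative: the views never meet
```

More overlap gives the stitcher more shared pixels to align and blend, at the cost of more cameras or wider (and more distorted) lenses - one reason the four-camera fish-eye trade-off pushed so much work into post.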
“The second key tool is the stitcher. As already noted, this has to build a convincing immersive environment, including a sense of depth. Even in a nominally 2D environment the viewer must feel that there is real depth to the scene, with closer objects appearing more ‘real’ and more involving. There has to be a reason for the viewer to explore the scene, as well as being visually directed to the key action.”

The Compositing Environment

Next comes the compositing. The company’s NUKE platform is already highly regarded as a compositor, and it is natural to extend this platform to VR. In this environment, the requirement is for a tool that has the flexibility to correct the stitching, selectively grade parts of the immersive image, and paint out unwanted elements in rig removal, as well as placing virtual elements into the real environment. For such diverse capabilities, the team needs to focus on sheer processing power to make post fast enough to be economical, while working to even more exacting technical standards.
Finally, the research team is addressing the need for live preview with an interface to the Oculus Rift headset.
Jon emphasised that this research remains a work in progress. “We all have to learn what real applications are going to require. While our toolset is still under development, we are working in collaboration with many of our customers who have also had to learn on the job and are using NUKE to create great VR content - like ‘Help’. Their experience is an enormous asset to us as we finalise a generalised toolset,” he said.
“2016 is without doubt going to be the year that makes or breaks VR. While creative directors are seizing the opportunities it brings, we have to have tools to support them.”