Students at AIE, a 3D animation, game design and VFX college in Sydney, produced their end-of-year short film project on a demonstration virtual set and LED Wall designed and built by local systems integrator and supplier Intraware. On this set, using professional virtual production equipment, filmmakers can shoot live action productions featuring background environments built in 3D software and displayed on the LED Walls. The performances and backgrounds are recorded together, in real time.
The team of students from AIE developed the story, designed and created the 3D environments in Maya, operated the virtual set and shot the film on a RED Epic Dragon camera. They were excited about the opportunity to gain experience using virtual production techniques so early in their training. It gave them a chance to test its possibilities and limitations regarding real-time lighting and looks.
Intraware’s virtual set consists of a disguise media server, a Mo-Sys StarTracker camera tracker, ROE LED Walls and a Brompton video processor. The central hub is the disguise gx 2 server, which takes in all 3D inputs and position information for the camera, LED Walls and on-set lighting, and delivers the images to the walls through the Brompton processor.
Real-time Storytelling
The AIE team’s first task was to develop a story that would take advantage of the set’s potential to take the character and viewer anywhere, to any location the students could create a background for. In the film, titled ‘Abyss’, the viewer sees a young man on a train deciding what movie to watch on his device on his way to work. As he checks out various titles, his environment changes, reflecting the environment of each movie.
An open-ended story like this sparked lots of concept ideas, which had to be narrowed down to four environments, including the initial train carriage setting, all of which were created with a photoreal, live-action appearance.
Project Lead Antonio Ayala said that lighting was a primary focus of this project, including controlling and managing the in-camera lighting alongside the real-time rendered lighting and texture effects to achieve the desired result in the final recorded project. Executing real-time environmental changes such as lighting opens up new storytelling possibilities, which can be seen in the film, and which the team wanted to learn about.
Render Pipeline
Early on, the team had to decide how to manage the render pipeline, which modeller/texturer Byron Boyd-Morgan said was a combination of Unreal Engine and RenderMan. They had originally planned to use RenderMan for the foreground assets, for more control over looks, and go with Unreal for the background assets to take advantage of the real-time rendering.
Later, they decided to group the assets per scene and render the whole group through Unreal. However, in some cases, depending on how close assets were to the camera, they needed to use more advanced shading networks that had to be rendered with RenderMan.
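As a rough illustration of that kind of split, the sketch below routes assets to a renderer based on distance from camera and shading complexity. It is written in Python purely for illustration; the threshold, field names and routing rule are assumptions, not details of the AIE pipeline.

```python
from dataclasses import dataclass

# Hypothetical asset record; the fields are illustrative only.
@dataclass
class Asset:
    name: str
    distance_from_camera_m: float   # taken from the scene layout
    complex_shading: bool           # needs an advanced shading network

def choose_renderer(asset: Asset, near_threshold_m: float = 3.0) -> str:
    """Send close-up or shading-heavy assets to the offline renderer and
    everything else to the real-time engine (an assumed policy)."""
    if asset.complex_shading or asset.distance_from_camera_m < near_threshold_m:
        return "RenderMan"   # offline render for foreground detail
    return "Unreal"          # real-time render for backgrounds

if __name__ == "__main__":
    scene = [
        Asset("radio_prop", 1.2, True),
        Asset("carriage_background", 12.0, False),
    ]
    for asset in scene:
        print(f"{asset.name} -> {choose_renderer(asset)}")
```

In practice a rule like this would sit alongside artistic judgement; the point is only that the choice between real-time and offline rendering can be made per asset rather than per shot.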
Compositor/Concept Artists Bronte Zhang and Samantha Walsh said that the interest in lighting and the photoreal look of the film meant all textures were referenced from real objects, and checked for continuity within the era the scene represented. Concepts were drawn up from there.
The on-set lighting presented a challenge. Lighting that made the actor look good on camera didn’t always match or blend well with the 3D lighting in the backgrounds, which meant creating textures for assets, such as the character’s radio in the war office set, that look right in all kinds of lighting – not just in a CG render but in-camera as well.
Thinking and working this way was a challenge, but the benefit is the ability to respond to a director’s request for changes. The background file can be edited more or less in real time as the video comes off the camera and is reviewed.
Houdini FX
The project involves a limited number of key FX such as sand and bubbles, for example, created in Houdini. FX artist Solomon Ning said that they weren’t difficult to produce – the challenge was getting them to render correctly in Unreal. None of the FX artists had previous experience of creating FX in Houdini for Unreal, but having a chance to experiment with the software showed them that keeping the effects on a small scale was the most successful approach.
Documentation is available for the process of moving the assets to Unreal with Houdini Engine, which processes the assets and delivers the results back to the Unreal editor. These procedural assets work within the editor for content creation, and are baked out. Animation proceeded pretty much as usual for backgrounds in a regular CG film, except that the export destination had to be changed to the Unreal Engine.
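As a hedged example of what changing the export destination can involve, the snippet below uses Houdini’s Python module (hou, available only inside Houdini) to create an FBX output driver and write a file that Unreal can import. The node paths, file path and parameter names are assumptions for illustration, not the students’ actual setup, and should be checked against your Houdini version.

```python
# Run from Houdini's Python shell; the hou module only exists inside Houdini.
import hou

ANIM_SOP = "/obj/train_carriage/OUT_anim"        # hypothetical animated geometry
EXPORT_FILE = "$HIP/export/train_carriage.fbx"   # hypothetical output path

# Create an FBX output driver in the /out context.
out_context = hou.node("/out")
fbx_rop = out_context.createNode("filmboxfbx", "export_to_unreal")

# Point the ROP at the geometry and the target file. The parameter names
# ("startnode", "sopoutput") are the usual FBX ROP parms, but verify them
# on the node you actually create.
fbx_rop.parm("startnode").set(ANIM_SOP)
fbx_rop.parm("sopoutput").set(EXPORT_FILE)

# Write the file. The resulting .fbx can be imported into Unreal in the
# usual way, or the asset can instead be packaged as an HDA and loaded
# through the Houdini Engine plug-in, as described above.
fbx_rop.render()
```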
Entire Environment
Possibly the biggest hurdle for the team was the need to take responsibility for the entire environment in 3D. Up until then, they had been creating set extensions for a physical set, 3D props, window replacements and so on. Therefore, learning to use the Wall itself and understanding how all of the elements – the live action set and performance, the 3D set, the disguise server, camera position – meshed together was critical.
disguise develops software called Designer in which users create files that describe the complete 3D scene for the server, including the position in 3D space of the LED screens, the cameras, set lighting, talent and so on. By combining that information with the 3D render streamed from the Unreal Engine, the server can composite and deliver the backgrounds to the LED screens in real time, as required. From there, it outputs the final mix for the production.
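To make the idea of a complete scene description more concrete, here is a deliberately simplified, hypothetical version written as Python data structures. It is not disguise’s Designer file format; it only illustrates the kind of spatial information the server needs before it can map the Unreal render onto the physical walls.

```python
from dataclasses import dataclass, field

# Hypothetical stage description; none of these types or values come from
# disguise Designer. They only show what the server has to know about.

@dataclass
class Transform:
    position: tuple   # metres in stage space (x, y, z)
    rotation: tuple   # degrees (pitch, yaw, roll)

@dataclass
class LedScreen:
    name: str
    transform: Transform
    width_m: float
    height_m: float
    pixel_pitch_mm: float

@dataclass
class Stage:
    screens: list = field(default_factory=list)
    camera: Transform = None                      # updated live from tracking
    content_source: str = "Unreal render stream"  # assumed label

stage = Stage(
    screens=[LedScreen("wall_main",
                       Transform((0.0, 1.5, 3.0), (0.0, 180.0, 0.0)),
                       6.0, 3.0, 2.6)],
    camera=Transform((0.0, 1.6, -4.0), (0.0, 0.0, 0.0)),
)

# Knowing where each screen sits and where the tracked camera is, the server
# can work out which slice of the rendered environment each panel should show.
```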
Post Production
Real-time benefits notwithstanding, post-production clean-up still represented a considerable chunk of time in the project’s schedule – two to three weeks within the total five-month concept-to-publishing cycle. Just like any other type of filmmaking, virtual production is subject to practical limitations.
For example, the camera frequently recorded moiré patterns in the LED backgrounds, which depended partly on the distance of the camera from the set. However, because the size of the LED screens was fairly limited, the camera could not be pulled back too far. Bronte Zhang in compositing found that slightly defocussing the image, or changing to a YCbCr colour space and blurring all channels except the green channel, helped soften the moiré effects.
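The actual fix was done in Nuke, but the channel-selective blur can be sketched outside a compositing package. The Python/OpenCV snippet below converts a frame to a YCbCr-style space (OpenCV names it YCrCb) and blurs the chroma channels while leaving luma sharp. Keeping luma untouched is an assumption; the comp described above left the channel labelled green unblurred after the conversion, but the principle of softening selected channels after the colour-space change is the same.

```python
import cv2
import numpy as np

def soften_moire(frame_bgr: np.ndarray, blur_size: int = 7) -> np.ndarray:
    """Blur the chroma channels of a frame while leaving luma untouched.

    A sketch of the general approach only: convert to a YCbCr-type space,
    blur selected channels, convert back. Which channel stays sharp is an
    assumption, not the exact Nuke recipe used on the film.
    """
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    luma, cr, cb = cv2.split(ycrcb)

    # Moiré on LED walls often shows up as coloured fringing, so blur only
    # the colour-difference channels and keep the luma detail.
    cr = cv2.GaussianBlur(cr, (blur_size, blur_size), 0)
    cb = cv2.GaussianBlur(cb, (blur_size, blur_size), 0)

    softened = cv2.merge([luma, cr, cb])
    return cv2.cvtColor(softened, cv2.COLOR_YCrCb2BGR)

if __name__ == "__main__":
    frame = cv2.imread("plate_frame.png")   # hypothetical exported plate frame
    if frame is not None:
        cv2.imwrite("plate_frame_softened.png", soften_moire(frame))
```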
Another task was relighting the actor in certain shots where the LED lighting and on-set lighting were not working well together; this involved first rotoscoping the actor out of the shot in Nuke so that the lighting could then be corrected.
The team is already thinking ahead to their next chance to work on a virtual production set. One experience they missed out on was real-time, moving camera tracking, which is an important element of most virtual productions. Unfortunately, the rail system needed to operate the camera motion was not available at the time of their shoot. However, a moving camera would have called for another substantial learning curve and divided the focus they were able to devote, this time, to real-time lighting, textures and looks. www.intraware.com.au