Filmmakers Tame Ultra-wide Video for Taronga Zoo’s Wild Screen


Taronga Zoo in Sydney has installed a 270° screen in its purpose-built Centenary Theatre. The screen displays 5.4K immersive video, producing an effect similar to a virtual reality experience or an IMAX screen, but without relying on a headset, glasses or any other dedicated device.

Inside the theatre, the screen curves a full-size 270° image around the audience, extending past the peripheral vision of people sitting at the centre of the arc. By contrast, the imagery inside a VR headset is only as immersive as the viewer's willingness to physically turn and 'see' what is around them. The Zoo’s screen has a 5:1 aspect ratio, which is twice as wide as Cinemascope, and measures 5m high by 25m wide, as wide as an IMAX screen.
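As a rough check on those numbers, the geometry works out as below. The sketch simply restates the figures quoted above; the 1080-pixel frame height is an inference from the 5:1 ratio, not a published spec.

```python
# Rough arithmetic on the screen specs quoted above (5 m x 25 m, 5.4K, 5:1).
# Illustrative only; the figures come from the article, not from measurement.

screen_width_m, screen_height_m = 25.0, 5.0
aspect = screen_width_m / screen_height_m          # 5.0 -> the 5:1 ratio
cinemascope = 2.39                                  # 'Scope is roughly 2.39:1
print(f"Aspect ratio: {aspect:.1f}:1 (~{aspect / cinemascope:.1f}x Cinemascope)")

# A 5.4K-wide delivery frame at 5:1 implies roughly this pixel raster
# (the height is an assumption derived from the ratio):
delivery_width_px = 5400
delivery_height_px = round(delivery_width_px / aspect)   # ~1080 px tall
print(f"Delivery frame: {delivery_width_px} x {delivery_height_px} px")
```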

Taronga Zoo’s goal for the screen is to use emotional storytelling, against a visual backdrop that feels like a blockbuster film, to teach kids about animal conservation and inspire them to take action. Director of photography and producer Ben Allan, who runs Main Course Films in Sydney, and writer-director Clara Chong are behind Taronga Zoo’s first short film production for the screen, a 12-minute film with a serious conservation message. It debuted in June 2017 and is now used to welcome the Zoo’s 1.7 million annual visitors.

Under the Zoo

In the story Clara conceived for ‘Wild Squad Adventures’, a family trip to Taronga Zoo turns into an action adventure when the family discovers a secret world hidden beneath the Zoo. There they meet special agents ‘for the wild’ who take them on a global mission to protect and care for all animals. The project is entirely live action, filmed in real locations with no green screen work and only two major CG elements.


The theatre screen requires imagery to be projected onto its curved surface from three laser phosphor projectors, in a configuration similar to the Cinerama widescreen system of the 1950s and ’60s, but the Taronga screen is much wider and more curved. This film plays out in one continuous image across the screen, which is different to other wraparound projection systems such as Barco Escape or ScreenX that use three separate screens for the front and sides.

Furthermore, until now films have only incorporated certain scenes, mostly computer generated, that use these extended screens and immersive cinema systems, rather than playing out the complete story on them. For example, seven minutes of running time in ‘The Maze Runner’, and 20 minutes of ‘Star Trek Beyond’ scattered through the film, played out across three separate screens. In comparison, the Zoo’s 12-minute film uses the full 5:1 screen for its entire length, with the complete drama playing out in a single, ultra-high resolution image.

Cameras and Rigs

Ben undertook a considerable testing period with cameras, lenses, lighting and workflow processes to make sure that data management and delivery could be carried through for projection on the theatre screen. He said, “Most of the film was shot on Blackmagic URSA Mini 4.6K cameras, which I chose both for image quality and for the flexibility of their design. Then in post, we cropped the frame to the 5:1 aspect ratio. However, to create a continuous image at such a high resolution, we used both triple-camera and single-camera coverage.”


Because ‘Wild Squad Adventures’ was produced as a live action drama, shot on location with as much as possible captured in camera, Ben focussed on creating different looks for different scenes, partly by using specialist cinema lenses from Angenieux and Carl Zeiss, but also with footage from a custom-built camera rig.  

“To augment the URSA Mini video, my company Main Course Films and drone operators XM2 combined our expertise to build a 3-camera system we named the Trident, which was specially mounted in an arc configuration to match the theatre screen and flown on a modified Octocopter drone to capture immersive aerial views for our big establishing shots. Conventional drone gimbal mounts are not capable of shooting an unobstructed 270° view. We used it on the ground as well, both at the zoo and in the Wild Squad HQ location,” Ben said.

Ben chose three Blackmagic Micro Cinema cameras for the Trident rig due to their combination of size and image quality, as well as the fact that they are based on the same colour science as the URSA Mini 4.6K. Unlike many other immersive camera rigs, the system uses non-fisheye 10mm lenses with minimal distortion, made by SLR Magic in Hong Kong.

Re-writing the Rules

Ben found that working toward playout in this screening format results in a new canvas for making a film. “Its format and aspect ratio meant re-writing many of the standard rules of filmmaking, and had an impact on everything from storyboarding to cameras and lighting, and from directing talent to data management, editing and overall workflow,” he said.


“A major challenge was that, like filming for virtual reality, introducing too much movement or too quick an editing style can have a real, physical impact on the viewer, causing nausea. Like IMAX, rather than being driven by eye lines, this film is driven by horizon lines. So we needed to think about not only what’s front and centre, but what’s happening all around us - and, rather than storyboarding by frames using conventional shot design, we had to build the film with scenes.”

For example, a close-up usually means a single character is featured in frame, but within the 5:1 aspect ratio, even an extreme close-up will almost always include most other people on set. This required careful blocking of the action so that shots would allow the key characters both to be featured and to cut together in the edit. Another example is dealing with the 180° line when the audience has a 270° view. This was particularly relevant in any dialogue scenes where characters are looking across at each other. What felt right on screen was often completely at odds with the required shooting technique.

Big Vision

“Consequently, every shot in the film had to be carefully storyboarded in the 5:1 frame, and was pre-visualised as an animatic to determine the creative direction for the film, including music and sound design,” Ben noted. “Directing the actors, for example, required working out blocking and movement across the massive width of the screen and identifying which elements would be featured centre of camera and, from there, left and right of frame. Rehearsing and blocking each scene was also critical because all actors were on screen significantly longer than on conventional screens. If they fell out of character or flubbed a line, we couldn’t cut back-and-forth or away from the performance to the same extent we normally could.”


The project’s scriptwriter and director Clara Chong created the first animatics as very simple, graphical representations of shots from a short series of options, using the Placeholder Generator in Final Cut Pro X. Her process then evolved by adding visual reference stills and video, and eventually she could draw the actual storyboards ‘by hand’ in Photoshop on a Wacom tablet. Due to the resolution-independent nature of Final Cut Pro, all of this work could be done in a 5:1 timeline.

On-set Logistics

Once on set, they found that the 5:1 format required the use of much bigger lights at much greater distances from the action to create the same effect. In itself, the lighting style was not radical. But while a close-up, for example, can normally be lit with a small light just out of shot, the width of the frame often meant that they had to position the lights much further back to keep them out of frame.

“In turn, this required bigger units to get the light level and softness required. For example, a shot that I might be tempted to light with an LED or Kino Flo on a conventional format could easily end up needing a 6000W HMI through a 6x6 frame. The gaffer Steve Schofield did a fantastic job of making this work,” said Ben.


“Regarding the Trident, while we could use it for key shots planned carefully during preproduction, its limitations of a fixed focal length plus the 3m minimum convergence distance meant that it really only worked for the very big, wide establishing shots, the aerials and the ultra wide shots in the HQ location where we staged the reveals of the project’s two main visual effects, a large hologram and a rock wall. Once those shots were done we switched to a much more conventional shooting process, even though we had to frame for the 5:1 ratio.”

Data Wrangling

Data management required full-time data wrangling. During the main shoot, the production generated around 1.5 TB of data per day, which had to be backed up in duplicate. The data wrangler on set, Meredith Calthorpe, moved directly into the Assistant Editor’s position when post production started, saving a lot of time because she was already familiar with all the footage and where the clips were located.

Ben said, “I usually like to have data management done away from set in a more stable environment like an edit suite, but because of the speed we were shooting at, the amount of data we were generating and the limited supply of adequately spec’d memory cards, we needed Meredith to be close to set. So we set her up with an area both at the zoo and the HQ location where she had reasonable desk space and AC power. Our camera department intern ran the cards to and from set for her. They were first copied to a very fast external solid state drive that could handle the footage in close to real time. Once the card was freed up, it was returned to set for re-use.


“Meanwhile, the footage would be copied to two regular hard drives of different brands for transport to post, and the SSD was cleared to make space for the next material. Simultaneously using two MacBook Pros, each with an SSD and two hard drives attached, meant that Meredith was able to keep on top of the data flowing in. We never had to stop the shoot to wait for cards, and not a single frame was lost or damaged.”
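The offload pattern Ben describes, copying each card to a fast SSD first and then duplicating onto two hard drives of different brands before the card is wiped, maps onto a fairly standard checksummed offload routine. The sketch below is a generic illustration of that pattern, not the production's actual tooling; the paths, function names and checksum choice are assumptions made for the example.

```python
# Minimal sketch of a checksummed card offload: card -> fast SSD, then SSD -> two
# backup drives. Illustrative only; paths and names are hypothetical, and this is
# not the tooling actually used on 'Wild Squad Adventures'.
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file in chunks so large camera files don't have to fit in RAM."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8 * 1024 * 1024), b""):
            h.update(chunk)
    return h.hexdigest()

def offload(card: Path, ssd: Path, backups: list[Path]) -> None:
    for src in card.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(card)
        staged = ssd / rel
        staged.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, staged)
        if sha256(staged) != sha256(src):
            raise IOError(f"Checksum mismatch staging {rel}; do not wipe the card")
        for drive in backups:
            dest = drive / rel
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(staged, dest)
            if sha256(dest) != sha256(staged):
                raise IOError(f"Checksum mismatch backing up {rel} to {drive}")

# Hypothetical usage: one SSD staging drive, two hard drives of different brands.
# offload(Path("/Volumes/CFAST_01"), Path("/Volumes/SSD_STAGE"),
#         [Path("/Volumes/BACKUP_A"), Path("/Volumes/BACKUP_B")])
```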

Stitched Up for Post

Back at Main Course Films, the Cinema DNG RAW footage was loaded into their DaVinci Resolve workstation and a custom LUT applied before rendering the proxy files to go to editorial. These files were at half-resolution and in the ProRes Proxy format - exactly as FCPX would have done internally, but by generating them in Resolve, they could both use the custom LUT and also have a Resolve project with all the media available and ready to conform.
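To approximate that proxy pass outside Resolve, the same idea of applying a show LUT and scaling to half resolution while encoding ProRes Proxy can be sketched with ffmpeg driven from Python. This is only a stand-in for the Resolve render the team actually used, and it assumes the CinemaDNG RAW has already been debayered into something ffmpeg can read; the file and LUT names are placeholders.

```python
# Rough illustration of the proxy pass: apply a LUT and scale to half resolution,
# encoding ProRes Proxy. Not the production's actual Resolve workflow; names are
# hypothetical and the input is assumed to be already debayered.
import subprocess
from pathlib import Path

def make_proxy(src: Path, lut: Path, out_dir: Path) -> None:
    out_dir.mkdir(parents=True, exist_ok=True)
    out = out_dir / (src.stem + "_proxy.mov")
    subprocess.run([
        "ffmpeg", "-i", str(src),
        # Apply the show LUT, then halve both dimensions (the 5:1 framing is kept).
        "-vf", f"lut3d={lut},scale=iw/2:ih/2",
        "-c:v", "prores_ks", "-profile:v", "0",   # profile 0 = ProRes Proxy
        str(out),
    ], check=True)

# Hypothetical usage:
# make_proxy(Path("A001_C012.mov"), Path("taronga_show.cube"), Path("proxies/"))
```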

Ben considered the challenges of working with the footage in post. “We had more than half a million individual files in the raw footage and our source footage plays back at over 20GB per minute. The ‘low resolution’ files for editing the film were still higher resolution than most of the world’s cinemas. In fact, no existing cinema in the world can screen the film at full size and full resolution. We screened the work-in-progress edits at the biggest auditorium available to us at the Hayden Orpheum Picture Palace, which has a DCI 4K projector, but for playback the image was still less than half the size of the Taronga theatre and had to be down-converted to Cinemascope 4K anamorphic for the screen there.”

Footage from the Trident camera rig had to be digitally stitched together to create a continuous image. First, Meredith roughly stitched it in Fusion in a shot-by-shot warping process without any fine-tuning, as part of her prep, so that the files could be used for editing.


Once these shots had been locked off, the three sets of Cinema DNG shots were de-bayered in Resolve and rendered out as 16-bit EXR files before they were sent to the post studio at Holy Cow, where Graham Davidson completed the fine stitching. Because of the size of the projected image and the proximity of the audience to the screen, automated stitching processes weren’t an option.

Warping the edges of the three images had to be done manually, frame by frame, for every shot to achieve the level of accuracy required. Fusion has a variety of tools for this, but Grid Warping was used for most of the work. The result was invisible blends that could withstand being pulled apart and joined together again for the three-projector system.
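The per-frame Grid Warp work in Fusion can't be reduced to a few lines, but the underlying idea, warping the side cameras into the centre camera's space and feathering the overlaps so the seams disappear, can be sketched with OpenCV. The control points, overlap width and usage below are invented for illustration; this is not the workflow Holy Cow used.

```python
# Minimal sketch of the three-camera stitch: warp a side frame toward the centre
# frame using manually chosen control points, then cross-fade the overlap band.
# Stands in for the manual, per-frame Grid Warp work done in Fusion; the control
# points, overlap width and canvas layout here are hypothetical.
import cv2
import numpy as np

def warp_to_centre(frame, src_pts, dst_pts, out_size):
    """Warp a side camera's frame into the centre camera's coordinate space.
    src_pts/dst_pts are matching control points picked by eye (at least four)."""
    H, _ = cv2.findHomography(np.float32(src_pts), np.float32(dst_pts))
    return cv2.warpPerspective(frame, H, out_size)   # out_size = (width, height)

def feather_blend(base, overlay, x0, width):
    """Linearly cross-fade 'overlay' onto 'base' across a horizontal band starting
    at column x0, then take 'overlay' outright to the right of the band."""
    out = base.copy()
    ramp = np.linspace(0.0, 1.0, width)[None, :, None]        # 0 -> 1 across band
    band_base = base[:, x0:x0 + width].astype(np.float32)
    band_over = overlay[:, x0:x0 + width].astype(np.float32)
    out[:, x0:x0 + width] = (band_base * (1 - ramp) + band_over * ramp).astype(base.dtype)
    out[:, x0 + width:] = overlay[:, x0 + width:]
    return out

# Hypothetical usage for one seam, assuming both frames already sit on a shared
# panoramic canvas:
# canvas = feather_blend(canvas_left_and_centre, warped_right, x0=3500, width=120)
```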

Offline Edit

The initial rough stitch took about three days, but had to be done before the offline edit could begin. In the meantime, Clara was already editing the single camera footage. Because she had developed her animatic edit by then to the point where it had temporary music and a fair number of sound effects, she was able to start cutting sequences even while the shoot was still underway.

From an editing point of view, the process was straightforward. All footage from both camera systems came into Final Cut Pro X as ProRes Proxy material at half of its source resolution. Nevertheless, the bulk of the footage was still higher than 2K cinema resolution and held up well enough in the 5.4K 5:1 edit timeline. FCPX generated XML files to conform in Resolve, and all of the matching was dealt with there.


Ben said, “We decided not to crop the single camera footage at this stage because, due to its extreme aspect ratio, it gave Clara the option to rack the image up or down with what was essentially a complete 5:1 frame above and below the composed image. This was particularly useful in many scenes featuring adults and children at the same time. A shot of the adults often included a ‘free’ shot of the kids below the frame markers. We monitored and discussed the possibilities on set and made good use of the shots in the edit.”

Colour Grade

“Using cameras from the same developer and manufacturer gave us a major head start when matching the looks in the grade. The rough-stitched proxy shots had the custom LUT, mentioned earlier, applied for editing, which we kept for the fine stitching later. But we turned it off for the final renders so that we could begin grading the Trident shots in their original LOG format.”

Unsurprisingly, the 5.4K 5:1 format also created a special challenge for grading. Although they were monitoring at 4K, it still meant constantly choosing between full resolution and full frame, switching back and forth between examining details pixel by pixel and zooming out to see the whole frame. The different lenses created an interesting challenge because Ben had chosen to use Angenieux lenses above ground and Zeiss down in HQ. They then had to match both of those looks with the SLR Magic lenses on the Trident, as well as with the Canon, Samyang and GL Optics lenses used for B-camera and pickups.


“Despite all of these different lenses, matching them in Resolve wasn’t really a problem. I started with the Zeiss and Angenieux looks as a reference point. Although modern grading techniques can take you a lot further than the differences between these lenses, it’s still nice to have appropriate looks as a starting point even if it is only for some of the footage,” Ben said.

“We also needed to grade differently for the curved screen because light from one part of the screen would fall on other parts, causing a loss of contrast. This was handled through very heavy vignetting that varied shot by shot, depending on the emphasis at each moment.”
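That per-shot vignetting is essentially a darkening of the frame away from the point of emphasis, clawing back apparent contrast lost to light bouncing between parts of the curved screen. The sketch below is a crude stand-in for the idea; the falloff shape and strength are invented, and the real grade was dialled in shot by shot in Resolve.

```python
# Crude illustration of shot-by-shot vignetting: darken the frame away from a
# chosen centre of emphasis to recover apparent contrast on the curved screen.
# The falloff shape and strength are invented; the actual grade was set per shot.
import numpy as np

def vignette(frame: np.ndarray, centre_x: float, strength: float = 0.6) -> np.ndarray:
    """frame: float image in [0, 1], shape (H, W, 3). centre_x in [0, 1] picks the
    horizontal point of emphasis; pixels further from it are darkened more."""
    h, w = frame.shape[:2]
    xs = np.linspace(0.0, 1.0, w)[None, :]            # horizontal position, 0..1
    ys = np.linspace(-0.5, 0.5, h)[:, None]           # vertical offset from centre
    dist = np.sqrt((xs - centre_x) ** 2 + ys ** 2)    # distance from emphasis point
    gain = 1.0 - strength * np.clip(dist / dist.max(), 0.0, 1.0) ** 2
    return frame * gain[..., None]
```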

Visual Effects

The ‘showpiece’ visual effects in the film include the Wild Headquarters’ massive holograms and the rock wall revealing the fictional HQ entrance. Flame artist Phil Stuart-Jones at Vandal designed and composited these and many invisible VFX shots. For the holograms, he built up many layers of elements in Flame to make them feel busy and complex, incorporating 2D graphical elements, 3D animal wireframes and live footage shot at the zoo and other locations. Much of this live footage was handheld HD video, shot as pre-production reference, which he stabilised and balanced to fit within the hologram.


The cleanup shots were divided between the two vendors, Vandal and Holy Cow. For example, while working on the stitching, Graham painstakingly rotoscoped out a number of yachts and other boats from the opening shot that distracted viewers from the voice-over’s history of the Cammaraygal people and their connection to the land. The fact that the footage had been shot on the Trident from the drone, that is, on three cameras moving together in three dimensions, and included layers of changing colours in the water below, made this task challenging. Even so, audiences find the finished shot especially moving.