'The Mandalorian' follows the journey of a lone bounty hunter in the outer reaches of the galaxy, beyond the influence of the New Republic, and builds on the bounty hunter lore of the Star Wars saga.
Season 2 production wrapped in early February 2020. FotoKem delivered automated VFX pulls to the various vendors as post-production got underway, and final finishing was done at Company 3. Release is anticipated in October 2020 on Disney+, where the series helped launch the streaming service in 2019.
Eduardo Eguia has filled the role of DIT since season 1, working with Directors of Photography Greig Fraser, ASC, ACS and Baz Idoine and, joining on season 2, Matthew Jensen, ASC.
Showrunner and series creator Jon Favreau has been interested in pursuing new techniques, similar to the VR methods developed for ‘The Jungle Book’ and ‘The Lion King’, to help the production visualise the final concept while still on set. Jon and ILM Visual Effects Supervisor Richard Bluff used the Unreal game engine to create virtual set extensions.
Automated Post Processing
Production for seasons 1 and 2 was based at MBS Studios in Manhattan Beach, California, on several stages, with some exterior shots captured outdoors on a built-up backlot. With plenty of room for stages at MBS Studios, the post-production teams could get to work near set and quickly generate the dailies for story and visual effects editorial.
On season 1, the near-set team had been led by Pinewood Post in support of DP Greig Fraser, with Scott Fox as the dailies colourist. In season 2, the dailies post and visual effects pull tasks were separated out and handled by FotoKem, who brought their proprietary nextLab system near-set to back up and encode the camera-native files captured on the ALEXA LF camera in ARRIRAW, and to generate the editorial deliverables.
In particular, FotoKem designed and installed a system that automates the visual effects pulls from the encoded ARRIRAW files and generates the OpenEXR plates for ILM and the other visual effects vendors. Greig Fraser initially chose the ARRI ALEXA LF camera system for ‘The Mandalorian’. Cinematographers Baz Idoine and Matthew Jensen, ASC continued with the same large-format system into season 2, both using ARRI’s new 4.5K ALEXA Mini LF.
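FotoKem's nextLab automation is proprietary, but the general shape of an automated pull step is easy to picture. The Python sketch below reads a simple pull list and queues one OpenEXR transcode job per shot; the pull-list columns, the handle count and the arriraw_to_exr command are hypothetical stand-ins for illustration, not nextLab's actual interface.

import csv
import subprocess
from pathlib import Path

HANDLES = 8  # assumed head/tail frame handles added to each pull

def run_pulls(pull_list_csv, source_root, out_root):
    # Pull-list columns assumed: clip, first_frame, last_frame
    with open(pull_list_csv, newline="") as f:
        for row in csv.DictReader(f):
            first = int(row["first_frame"]) - HANDLES
            last = int(row["last_frame"]) + HANDLES
            src = Path(source_root) / row["clip"]   # encoded ARRIRAW clip
            dst = Path(out_root) / row["clip"]
            dst.mkdir(parents=True, exist_ok=True)
            # 'arriraw_to_exr' is a hypothetical debayer/transcode tool
            subprocess.run(["arriraw_to_exr", str(src),
                            "--first", str(first), "--last", str(last),
                            "--dest", str(dst)], check=True)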
Real-time Colour Grading
For both seasons of the show, Eduardo set up his DIT cart to process up to six cameras simultaneously, managing data migration and the colour pipeline. “We applied custom real-time colour grading to each individual camera using Pomfort’s Livegrade Pro with an ACES workflow. From my cart I sent a colour corrected Rec 709 signal to the DP’s monitors, and to the Video Assist to distribute the colour corrected signal to the rest of set.
“In terms of looks, our starting point was a LUT, based on a film stock, that Greig Fraser created. From there, we did some adjustments to achieve his vision for the show and generated a base LUT. Whenever we needed to make changes, I created individual colour decision lists, or CDLs, on top of this LUT to achieve the desired look. These CDLs were sent to Scott Fox, the dailies colourist, to apply and balance the grades. The same process has continued with the two other DPs Baz Idoine and Matthew Jensen on season 2, except FotoKem applied the CDLs, and we use a show LUT from Company 3, who are supplying the finishing services for season 2.”
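A CDL is compact: three per-channel values (slope, offset, power) plus a single saturation. A minimal NumPy sketch of the standard ASC CDL transform shows how such a grade rides on top of a base LUT; the grade values here are illustrative, not the show's actual settings.

import numpy as np

def apply_cdl(rgb, slope, offset, power, sat):
    # Standard ASC CDL: out = (in * slope + offset) ^ power, then saturation
    out = np.clip(rgb * slope + offset, 0.0, None) ** power  # clip negatives before power
    luma = out @ np.array([0.2126, 0.7152, 0.0722])          # Rec 709 luma weights
    return luma[..., None] + sat * (out - luma[..., None])

grey = np.array([[0.18, 0.18, 0.18]])  # mid-grey test pixel
print(apply_cdl(grey,
                slope=np.array([1.05, 1.00, 0.98]),
                offset=np.array([0.00, 0.00, 0.01]),
                power=np.array([1.00, 1.00, 1.00]),
                sat=0.9))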
CODEX High Density Encoding
In setting up the workflow for season 2, FotoKem employed CODEX’s High Density Encoding (HDE) process. HDE generates a bit-exact, lossless copy of the original ARRIRAW native files but at 50 to 60% of the original size. Since this compression allowed dailies colourist Jon Rocke at FotoKem to store more of the ARRIRAW files near-set, he could rapidly deliver turnovers using their automated visual effects pull system. The HDE files were then carried through the entire post-production pipeline, including final colour with Charles Bunnag, a finishing colourist at Company 3.
“It was impressive to see the difference in media processing between seasons 1 and 2, when HDE became widely available,” Eduardo said. “HDE helped the workflow tremendously, not only to speed up the process of managing the media but by improving the turnaround time to get the CODEX capture drives back to the camera department for re-use.”
This reduced the storage required to back up the media that was shot the previous day. “Before season 2 started, we discussed the value of deploying HDE with James Blevins, the post-production supervisor. We felt confident that using HDE would be an important workflow improvement for production and post-production, and it turned out to be the right decision.”
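The arithmetic behind that improvement is simple. Taking the 50 to 60% figure at face value, and assuming a hypothetical 6TB day of multi-camera ARRIRAW (the shoot size is an assumption for illustration):

day_tb = 6.0  # assumed day of ARRIRAW footage, in TB (illustrative only)
for ratio in (0.50, 0.60):
    stored = day_tb * ratio
    print(f"at {ratio:.0%} of original size: {stored:.1f} TB stored, "
          f"{day_tb - stored:.1f} TB of backup storage saved")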
Inside the Volume
Industrial Light and Magic (ILM) was the lead VFX vendor. ILM’s Richard Bluff served as visual effects supervisor for the production, working with a team of six other VFX supervisors – James Porter, Hayden Jones, John Knoll, Alex Prichard, Steve Moncur and Jose Burgos – to guide the effects through the series. The production needed an enormous number of VFX shots, created not only by ILM but also by Base FX, Image Engine, Important Looking Pirates, Ghost VFX, Hybride, MPC and Pixomondo.
The production made use of a massive virtual set called Stagecraft, in which an LED volume measuring 20ft tall, 270 degrees around and 75ft across became the virtual filmmaking environment.
The Stagecraft background surrounding the volume is a set of enormous LED screens built with a very small pixel pitch between the LED elements on the panels. 3D background imagery is projected onto the screens, while light from the screens also falls on the actors and real props, and everything is captured in-camera with the live action. Thus, even the image-based lighting happens in-camera.
Pixel pitch refers to the density of pixels, measured as the distance between the centres of two adjacent pixels. A smaller pixel pitch means there is less empty space between pixels, yielding a higher pixel density that supports higher resolution images.
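As a worked example, treating the 75ft span of the volume as a straight run (the pitch value here is an assumption for illustration, not a published spec of the Stagecraft panels):

wall_width_m = 75 * 0.3048          # 75 ft span of the volume, in metres
pitch_mm = 2.8                      # assumed LED pixel pitch
pixels_across = wall_width_m / (pitch_mm / 1000)
print(f"{pixels_across:,.0f} pixels across")  # roughly 8,000 pixels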
Interactive Real-time Sets
Pixel density was a critical factor in making the images projected by the LED walls look photo-real, especially because these images were not static. The image shown on the LED walls was played back in real time, displaying a 3D scene directly affected by the movements and settings of the camera. If the camera moved to the right, the image altered accordingly, as if it were a real scene seen through real eyes.
The 3D background imagery was projected from an array of GPU-powered PCs running the Unreal game engine, controlled by a team of technicians, making it possible to sustain a high level of interactivity consistently in real time.
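The underlying geometry can be sketched simply: each virtual background point is re-projected onto the wall along the tracked camera's line of sight, so the wall always shows correct perspective for that one camera position. The Python below is a flat-wall simplification for illustration, not ILM's actual renderer.

import numpy as np

def project_to_wall(camera_pos, point):
    # Intersect the ray camera -> virtual point with a flat wall at z = 0
    cam, pt = np.asarray(camera_pos, float), np.asarray(point, float)
    t = cam[2] / (cam[2] - pt[2])   # ray parameter where z reaches 0
    return cam + t * (pt - cam)     # point on the wall plane

rock = (0.0, 1.0, -30.0)            # virtual point 30 m 'behind' the wall
print(project_to_wall((0.0, 1.5, 5.0), rock))  # camera centred
print(project_to_wall((2.0, 1.5, 5.0), rock))  # camera steps 2 m right:
                                               # the rock shifts on the wall, giving parallax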
“The virtual backgrounds were astonishing, marking a significant change in on-set processes for our industry. There was an initial learning curve to it, but the results speak for themselves,” said Eduardo. “The team running the projections was in constant communication with the DPs, in effect forming a tight integration of the virtual scenes with the real foreground elements. Their integration was also important for the ongoing coordination between the DP, the Unreal team, the gaffer and me, not only to achieve the best results, but for the steady blending of the looks. Any colour adjustments by anybody on either side affected the image universally.”
Where in the World
Eduardo had a professional 4K OLED monitor set up directly next to the DP monitors, so they could watch the signal coming out of the cameras at high resolution and then immediately apply a matching colour grade to the individual cameras. This allowed the team to see live, in full detail, the combination of the real foreground with the virtual background.
He said, “Using the virtual set with 3D backgrounds is what has made the biggest difference to my digital imaging process, compared to anything I've worked on in the past. One day, you could be working in space, or out in the desert, the woods, a tunnel. All these set-ups were on the same stage, sometimes happening on the same day.
“Advances in visualisation notwithstanding, a huge number of talented people were equally important in making the story look real, from the art department to the animators who brought the characters to life, the special effects teams and the team manipulating the backgrounds. We learned a lot on season 1, but Greig and Baz really understood the system from the very beginning, pushing it to achieve some amazing results. On season 2, the virtual volume was bigger, and the results were even more impressive.”