Led by VFX Supervisor Simon Stanley-Clamp, the team at Cinesite opens ‘Pirates of the Caribbean: On Stranger Tides’ with a thrilling chase through the streets of old London. At ILM Singapore, VFX Supervisor Mohen Leo’s team gave Blackbeard his ability to capture pirate ships in bottles, trapped forever on the open seas.
Of the over 300 stereoscopic visual effects shots Cinesite created for ‘On Stranger Tides’, roughly 200 were required for the daring and comical carriage chase scene through London. In fact, according to the production’s original storyboards, the scene was intended to be even longer, but a large chunk of it was cut before the shoot as director Rob Marshall continued to change his mind.
First Cut

“While awaiting the first edit, we went on set to gather what we needed to start designing CG buildings and environments to replace the very large blue screens in place at the three locations in and around London. We based the CG on existing architecture, often in side streets and mews. Well in advance of the plates arriving, we built 3D assets and tested our photogrammetry techniques in csPhotoMesh, the facility’s new software, to prove that we could capture stills of a building, model from the stills and create an effective environment.”

Blue Screen Streets

Street layout came from the stills, the storyboards and a few production designs, which Simon extrapolated. This helped the team anticipate where they would shoot from. He sent concept stills back to production showing the buildings he suggested to fill gaps and replace blue screens, sometimes with options. Few changes were made to these looks. The main criterion was that the buildings simply be completely unnoticeable and look ‘right’ for the situation.

Pre-Composites

“People walking through frame, alone or in groups, were shot on blue screen and supplied abundantly for us to randomise in the plates, all captured in stereo. Later on, fire and smoke elements were shot in stereo as well. One critical smoke element was commissioned and shot only days before the sequence delivery. The story needed extra smoke as a neat escape route for Jack Sparrow, allowing him to swing undetected from a pub sign, but we didn’t have enough time for CG smoke development. We match moved the plate and layered up multiple bits of the practical smoke.”

“The first 10 or 12 minutes of the film’s opening London shots, coming across the water and leading into the sequence in King George’s palace before the carriage chase, were part of our award; the film essentially begins with our work,” said Simon. “We were even responsible for modelling and animating Jack’s CG cream puff for the palace dining room shots, which gets tossed around until it gets stuck on the chandelier. This chandelier was a massive, weighty practical prop controlled by giant pulleys and motors that had to support Jack. But it didn’t swing naturally, so apart from Jack’s rig removal, we needed to carry out variable respeeds to give its motion the expected look.”

Variable Respeeds

Simon feels that, at this stage in stereo development, respeeds are best kept to a minimum, especially the variable type. “Sometimes they have to come back and be redone to make sure they look right and match the cut. They can even affect the audio because of the lipsync. The editors were usually quite helpful on this,” he said.
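The article does not describe which retime tools Cinesite used for the chandelier shots. As a rough illustration of what a variable respeed involves, the sketch below remaps each output frame onto a fractional source frame along a speed curve and cross-dissolves between neighbouring frames. The function name, the speed curve and the frame-blending approach are assumptions for illustration only; production retimes would typically use optical-flow interpolation rather than a simple dissolve.

# Illustrative only: a minimal variable-respeed (retime) sketch in Python/NumPy.
# This is not Cinesite's tooling; names and the blending method are hypothetical.
import numpy as np

def variable_respeed(frames, speed_curve):
    """Remap a clip onto a new timing.

    frames      -- list of same-sized NumPy image arrays, frames[i] is source frame i
    speed_curve -- per-output-frame playback speed (1.0 = original speed,
                   0.5 = half speed / slow motion, 2.0 = double speed)
    """
    # Integrate the speed curve to find the fractional source frame
    # that each output frame should sample.
    src_times = np.concatenate([[0.0], np.cumsum(speed_curve)[:-1]])
    last = len(frames) - 1

    output = []
    for t in src_times:
        t = min(float(t), float(last))          # clamp to the end of the clip
        lo = int(np.floor(t))
        hi = min(lo + 1, last)
        w = t - lo
        # Simple cross-dissolve between neighbouring source frames.
        blended = (1.0 - w) * frames[lo].astype(np.float32) \
                  + w * frames[hi].astype(np.float32)
        output.append(blended)
    return output

# Example speed curve: slow the middle of a move, then return to normal speed.
# speeds = np.concatenate([np.full(10, 1.0), np.full(20, 0.6), np.full(10, 1.0)])
# retimed = variable_respeed(plate_frames, speeds)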
The Stereo Hurdle

Creating CG elements for stereo 3D footage presented a significant learning and equipment hurdle for Cinesite. The production shot virtually parallel, only slightly toed-in, which allowed convergence to be set during compositing. In other words, convergence is not baked into the footage but can be pulled forward or back in post (a simple sketch of this horizontal offset appears below). The team extrapolated convergence data from the plate and, using 2D tracking, generated a stereo track and piped this into their Maya scenes.

“This way, what we rendered is an exact match to each eye. Then we can use tools in Nuke, a nudge tool, for example, to adjust and fine-tune shots so that all elements sit at their true stereo depth. But generally, rendering and tracking tasks simply take twice as long, which made the whole project take substantially longer than it would have as a 2D project. However, this was something that the production and vendors were all aware of from the outset.” Tracking was usually done with 3D Equalizer, or a Nuke track was used to finesse elements into place.

Stereographer

Equipping the facility was a major but essential step. All compositors now have their own 3D monitors and Cinesite’s main theatre has been converted to Dolby 3D. A dedicated suite with a stereo viewing system was built for ‘Pirates’. “You can’t guess about the images. They all have to be checked in the same way that they will be viewed in cinemas. The stereographer needed his own suite with monitors as well. Virtually every project now has a stereo agenda or deliverable, and now we are set up for it,” said Simon.

Changing Light

Slight rainfall at Greenwich wouldn’t have been an issue except for the stereo factor. The Pace rig the production were using has an exposed polarising mirror at the front. If rain falls on it, the drops appear in the footage as floating artefacts in the foreground. Cleaning these out in post is very time-consuming and expensive. At the shoot such problems were sometimes handled with eccentric measures, like driving the rig backwards down the road, which gave them reversed shots but kept the rain off the mirror.

A very talented Jack Sparrow double performed many of the trickier stunts for actor Johnny Depp, and other stunt men stood in for him on specific manoeuvres. “But whenever he is recognisable as himself, it really is him – there were no face replacements for him.” Instead, stunt rigs were often used that required substantial digital removal from the stereo footage. In one case the team had to rebuild a building digitally to make sure the cable was totally cleaned out.
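The ‘nudge’ adjustment Simon mentions rests on a simple principle: shifting one eye’s image horizontally relative to the other changes where an element appears to converge in depth. The sketch below is not Cinesite’s Nuke tool, just a minimal Python/NumPy illustration of that horizontal offset; the function name and the sign convention are assumptions.

# Illustrative only: nudging one eye horizontally moves an element's apparent depth.
import numpy as np

def nudge(image, pixels):
    """Shift an image horizontally by a whole number of pixels.

    image  -- NumPy array shaped (H, W) or (H, W, C)
    pixels -- positive shifts right, negative shifts left; which direction
              reads as 'closer' depends on the eye and the chosen convention.
    """
    shifted = np.zeros_like(image)
    if pixels > 0:
        shifted[:, pixels:] = image[:, :-pixels]
    elif pixels < 0:
        shifted[:, :pixels] = image[:, -pixels:]
    else:
        shifted[:] = image
    return shifted

# Converge a CG element: leave the left-eye render alone and nudge the
# right-eye render until the element sits at the same depth as the plate.
# right_eye_element = nudge(right_eye_element, pixels=-3)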
Frog Engineering

“The animation in particular took a long time to lock down. The frog drops onto the actor’s shoulder from overhead and jumps from one side of the frame to the other – a tiny nuance of animation, across seven tricky shots. Although we kept the glass jar in the plate for the composite, we still had to model a 3D jar, building the glass with depth so we could create passes to reproduce the refraction and reflection that would occur with a real glass jar.

“When you look through the jar at the frogs, they ripple and change shape accordingly. We spent months on it, starting before the shoot with about eight different designs in myriad variations, but the director wanted a simple, realistic frog. We completed four iterations, some with exaggerated limbs or other parts, but the result is virtually true to life, a real poison-dart frog with a puffed throat and lens movements in the eyes.”

Walk the Walk

“The cleanup was the tricky part, often needing a bit more than only the peg portion. The animation required a few trials to lock down while making sure that leg was perfectly straight! We weren’t match moving what the actor is doing but what it would look like if he weren’t bending his knee.”

No Magic

“We had all lighting scenarios to deal with as well. When Jack clashes with Angelica, who is posing as him, in the sword fight in the Captain’s Daughter pub, the fire-lit interior is nearly dark. As the pair fight in the rafters overhead, they are wearing complicated rigs against the smoky ceiling behind them. To clean out the wires, we replaced the roof in CG, and we also built the CG barrels you see in the background.”

Frozen in Time

Although Mohen’s team had no one on set, he spent a month in San Francisco before the shoot working with ILM’s supervisor there, Ben Snow. He took the opportunity to discuss technical approaches for the bottle sequences and what data they would need from set to complete the work in post. Their main sequence involved Blackbeard’s full collection of bottles and takes place below decks on his ship. To get started on looks and aesthetics, the team focused on live action props in the plates – some model ships and a cabinet holding empty bottles, which the production crew had supplied as practical elements for the wider shots. In CG, the ILM team replaced the models, built ships and environments for the existing bottles, and built a large number of CG bottles with ships and environments inside.

Magic Moment

In the end, the project went one step further when one compositor, Ben Warner, tried putting cannon fire and small explosions on one of the ships, as if it were trapped in an endless battle. The director really liked this idea – it wasn’t just capturing weather and time but also a narrative moment in the bottles, so that each ship possessed its own story.

Lighting Interaction

The representation of the bottles changed from shot to shot. For close-ups, and for reverse angles looking back at the actors as they discussed the ships, the practical bottles in the cabinet were removed to make way for fully CG assets. In these close shots, the lighting interaction between the ship and the bottle needed to be carefully designed and controlled, which was best achieved when both were built in CG.

Ocean Looks

In all, the team completed over 40 of these ship and bottle shots.
The shots were composed almost entirely of the plate material and their CG, with some rotoscope work from the plates and a few blue screen shots of Jack’s face shot later in the schedule. The CG pipeline at the facility comprises a combination of in-house and off-the-shelf software. Modelling is done in Maya; creature simulations, effects and lighting are proprietary. Texturing, rotoscoping and paint are all done with a combination of tools. Compositing is in Nuke.

Unforgiving

“One challenge was simply how unforgiving stereo really is for a VFX artist. The small cheats and inaccuracies are no longer viable. The viewer’s brain knows naturally how to perceive stereo vision, so every detail and mistake is instantly recognised. Even a pixel or two out of place between left and right eyes will spoil the depth of a shot. It demands complete accuracy, and the quality control is time-consuming.”

Independent Render

Compositing was all done in Nuke. Tracking was, again, done with proprietary tools, but the stereo made it quite complex, demanding absolute precision. The layout team found the stereo plates substantially harder to manage, but they have now advanced their pipeline and feel they can move onto other projects with more confidence.

“It’s not just the tools but the artists themselves who advance. We all learned how to judge stereo images with our own eyes – at first, we were all tempted to rely on 3D glasses for this, but it’s not always the best technique to identify errors. Once the eye and brain are trained, it can be much more effective to just flip back and forth between left and right images – without glasses. By detecting shifts between them, you can determine exactly where and why errors are occurring.”
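As a rough stand-in for the flip-check the artists describe, the sketch below compares the left and right frames numerically and flags the regions where they differ most. This is an illustrative Python/NumPy example, not an ILM tool; genuine horizontal parallax also registers as a difference, so an artist would still judge which shifts are intentional and which are errors.

# Illustrative only: a crude left/right comparison in the spirit of flipping
# between eyes. Vertical misalignment or colour mismatch between the eyes is
# a red flag; horizontal offsets may simply be parallax.
import numpy as np

def eye_difference(left, right, threshold=0.05):
    """Return a per-pixel difference image and a mask of notable differences.

    left, right -- float images in [0, 1], same shape (H, W, C)
    threshold   -- how much per-pixel difference counts as 'noticeable'
    """
    diff = np.abs(left.astype(np.float32) - right.astype(np.float32))
    mask = diff.max(axis=-1) > threshold
    return diff, mask

# In practice an artist flips the two frames in a viewer; a mask like this
# just points to the regions worth inspecting first.
# diff, mask = eye_difference(left_frame, right_frame)
# print("pixels flagged:", int(mask.sum()))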
Meshing & Matching

Head of visual effects technology, Michele Sciolette, led Cinesite’s efforts to build the stereo production pipeline and develop new tools to address specific challenges. These included csStereoColourMatcher, an automated tool to compensate for colour differences between stereoscopic image pairs. The environment artists used csPhotoMesh to rapidly build up the large CG sets.

“csPhotoMesh is photogrammetry and 3D scene reconstruction software, which was useful for building the carriage chase sequence environments,” said Michele. “It is a simple, flexible way to capture geometry. Given a set of digital images of a static scene, it produces a textured 3D mesh accurately representing the scene geometry, along with 3D cameras matching the original photos’ positions. The process is automatic – you just drop all the images in a directory and run the command. This kicks off a reconstruction process on our render farm, resulting in a 3D mesh and camera positions ready for texturing.

“csStereoColourMatcher is also fully automated. Colour differences between stereoscopic image pairs can be caused, for example, by the beam splitter in the camera rig, or by other factors that introduce significant colour shifts across the different stereo views. It is derived from vector-based analysis, and we built it into the front end of all our compositing work for the film. It requires no user supervision and is completely integrated into the Nuke compositing system. We used it to colour balance more than 300 shots for the film.”
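Cinesite has not published the algorithm behind csStereoColourMatcher beyond describing it as vector-based analysis. As a purely illustrative stand-in for the idea of automatically balancing one eye against the other, the sketch below transfers per-channel mean and standard deviation from a reference eye to the other in Python/NumPy. It is a common baseline technique for this kind of correction, not the actual tool; the function name is hypothetical.

# Illustrative only: generic per-channel statistics transfer for colour-matching
# one stereo eye to the other (e.g. after a beam-splitter colour shift).
import numpy as np

def match_colour(source_eye, reference_eye):
    """Shift and scale each channel of source_eye so its mean and standard
    deviation match reference_eye. Both are float arrays shaped (H, W, 3)."""
    src = source_eye.astype(np.float32)
    ref = reference_eye.astype(np.float32)
    matched = np.empty_like(src)
    for c in range(src.shape[-1]):
        s_mean, s_std = src[..., c].mean(), src[..., c].std()
        r_mean, r_std = ref[..., c].mean(), ref[..., c].std()
        scale = r_std / max(float(s_std), 1e-6)   # avoid divide-by-zero
        matched[..., c] = (src[..., c] - s_mean) * scale + r_mean
    return matched

# e.g. balance the eye shot through the mirror against the direct eye:
# right_balanced = match_colour(right_eye, left_eye)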
Words: Adriene Hurst
Images: Courtesy of Disney Enterprises