From ILM Singapore, Creature Simulation Supervisor Eric Wong and Compositing Supervisor Mark Hopkins describe their roles in giving ‘Rango’ the realistic, photographic look that tempts the audience to believe in the characters and environments on screen.
ILM Singapore devoted 100 artists, technical staff and production support to ‘Rango’ for over a year, covering modelling, rigging, creature simulation, animation, FX, digital matte painting, lighting and compositing. The facility needed to assemble a completely new team, most of whom had never worked together or on a feature film before, and completed about 25 per cent of the work on ‘Rango’. The studio ran a specialised program for creature simulation, led by Creature Sim Supervisor Eric Wong. Twelve apprentices were trained as the core team for this department, with supporting artists recruited from other parts of the studio.

Rango in Singapore
The time difference between the two facilities meant that the San Francisco team could hand off work at the end of their day as the Singapore artists arrived at the studio. Dailies sessions were held as video conferences via cineSync, which helped establish and maintain consistency across the sequences and let the team members share ideas with the production, especially VFX Supervisor Tim Alexander and director Gore Verbinski.

Creature Team
Lizard Skin
They studied the skin of lizards in photos, videos and, where possible, real-life examples, looking especially at how the scales work and how light reacts to the skin, creating iridescence and sheen. Eric said, “All images were stored in a library to make side-by-side comparisons with our work, especially at the turntable stage. We’d be looking for very specific details – how light strikes an eyeball, for example, or the membrane covering that eyeball and its specific characteristics under the lighting of a scene.” The team mainly used Maya and ILM’s in-house software Zeno.

Simulation Optimisation
As the animators worked and questions arose about achieving a certain pose the director wanted, Ken would work with both the animators and the simulation artists to get the pose looking right and the feathers streamlined correctly against the body. The teams held several back-and-forth sessions until the process worked and achieved what the director was looking for artistically.

Hair and cloth render times could be economised in certain ways. Pre-simulations could be run on distant and background characters, or quicker simulations could be used because those characters were much smaller in frame, leaving more time for foreground hero characters. “You have to budget your render and artistic capacity, often planning shot by shot where to put time and power,” Eric said. “Simulations can be optimised by lowering the resolution of a mesh, or running a lower-res mesh on clothing while using the same simulation engine. You may not get as much wrinkle detail or dynamic movement, but only in crowds or the background of the frame, where it matters less.”

Lived-in Look
The team had a lot of fun with Rango’s Hawaiian shirt, and kept some Hawaiian shirts in the office to study. To Gore Verbinski, the way the shirt fit Rango was crucial. He wanted the length of the sleeve to fall just past the elbow, and the collar to lie slightly asymmetrically, to make it look a little ‘lived in’.
“In one shot, Rango unbuttons his shirt, flourishes it like a cape, and then puts it on and buttons it up again. The creature team had to figure out a way for the animators to make him look as though he were buttoning his shirt, while creating the shirt as a simulation. It was a typical task requiring simulation/animation team collaboration. The riggers, animators and simulators work very closely together,” said Eric.
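Eric’s budgeting idea – coarse cloth meshes for crowd and background characters, full detail reserved for foreground heroes, all driven by the same solver – can be sketched in a few lines. This is a hypothetical illustration; the function, thresholds and character names are invented, not ILM’s pipeline.

```python
def pick_cloth_resolution(screen_coverage, is_hero):
    """Choose a subdivision level for a character's cloth mesh.

    screen_coverage: rough fraction of the frame the character occupies (0..1).
    is_hero: foreground hero characters always get the full-resolution mesh.
    The same simulation engine runs at every level; only the mesh density
    (and so wrinkle detail and solve time) changes.
    """
    if is_hero or screen_coverage > 0.25:
        return 3   # full-resolution mesh: maximum wrinkle detail
    if screen_coverage > 0.05:
        return 2   # mid-distance: fewer subdivisions, same solver
    return 1       # crowd/background: coarse mesh, candidate for pre-simulation

# Hypothetical shot plan: hero gets level 3, a distant townsperson gets level 1.
shots = [("Rango", 0.40, True), ("townsfolk_07", 0.02, False)]
levels = {name: pick_cloth_resolution(cov, hero) for name, cov, hero in shots}
```

The point of the sketch is the shot-by-shot planning Eric describes: detail is a budget spent where the audience is looking.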
The digital clothing job is rife with details. When working with a certain type of cloth, the first step is to establish the cloth’s properties, then run a simple walk cycle of a character wearing it to make sure it responds the way it should. Then it can be tested in actual shots, where the results are often less predictable and new conditions emerge. He said, “Sometimes the director wants a scene to be windier, or wants to alter the wind, causing an unexpected result. So we’ll do several runs of a simulation – first, straight out of the box and used as originally intended. Then we’ll push it to extremes and figure out what has to change and be updated for new situations.
“Once feathers and hair were duplicated and rendered, they generally became the work of the look development department, but the simulators had to make sure that feathers rendered as multiple pieces of geometry didn’t interpenetrate, and sometimes we were able to simulate long strands of hair and performance-based hair in a way that avoided intersecting with clothing. We do as much as we can to allow dynamic hair and feather performances, but always consult with look development to avoid penetrating limbs and clothing. Their team can also use some rendering techniques to make penetration less noticeable.”

In-Camera Effects

Pulling Focus
To gain enough flexibility to control the way the components work together in the frame, the finish of all elements was kept to a high standard. “This lets you pull focus on any part of a shot and gives a director a larger degree of freedom. Specific characters could always be the most important features of a shot, although the entire environment was always fully designed and built, detailed and textured,” said Mark.

In a sequence toward the end of the movie, Beans and Rango are trapped in an enormous water bottle in the background. While the Mayor speaks pompously in the foreground, the focus remains trained on the antics of the pair in the bottle, highlighting the comedy of the scene. The technique is evident in many shots but its use is subtle. The overall result lifts the characters from the photoreal background and helps them remain at the heart of the action.

Saloon Bar
Mark said, “Those bar scenes used a combination of techniques, as they might in a live-action film, depending on what was right for the shot and how much camera motion was involved. Where there was more than 40° of camera movement around the central axis, a tracked-in 2D element probably wouldn’t hold up and a 3D option would work better.”
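Mark’s rule of thumb for the bar scenes – past roughly 40° of camera arc, a tracked 2D card stops holding up and the shot needs real 3D geometry – is a simple threshold decision. A toy sketch, with the function name and threshold taken only from that quote:

```python
def element_type(camera_arc_degrees):
    """Pick between a tracked 2D card and a 3D element for a set-dressing
    prop, based on how far the camera moves around the central axis.
    Beyond ~40 degrees, parallax exposes a flat card, so build it in 3D.
    """
    return "3D" if camera_arc_degrees > 40 else "2D card"

# A locked-off or gently moving shot can get away with a card;
# a sweeping move around the saloon needs geometry.
choice_static = element_type(15)
choice_sweep = element_type(75)
```

In practice the cutoff would depend on the element’s distance, size and how much parallax the move reveals, which is why Mark frames it as “probably wouldn’t hold up” rather than a hard rule.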
Perfect Light
Because light was so critical to the look of the production, the team created various lens-effects tools to control it. Most of these tools were written within the Nuke framework – the flare tool they developed, for example. “When the camera is facing a strong light source, the lens flares across the whole shot, not just the area around the light. Our plug-in allowed us to dial in the amount of flare. Certain glints or hot spots might pick up a lot more light than others, and can be emphasised with glints plug-ins,” he said.

“Chromatic aberration is another look we sometimes cultivated. High-quality lenses allow less of it to happen, but it can be quite attractive. In the image, the colour records become separated in a distortion. To create the effect, we take the red channel and scale it slightly in one dimension, misaligning it relative to the blue and green channels and inducing a chromatic aberration. It’s similar to what a prism does, but a lens gives a different, more subtle result due to its design.

“Also, the effect shows up more at the edges because we scale from the centre – it’s only at the edges that you start to see the slight drift between red and blue. For a viewer the result is not quite identifiable, but the shot takes on a special look. Over the years, directors and cinematographers have used low-tech and even broken equipment to achieve specific results – a top-quality lens wouldn’t affect light in this way.”

Glass Distortions
Returning to the scene with Rango and Beans trapped inside the bottle, the shots include a number of interesting light and water effects: light reflects off the surface of the water, either straight into the camera or again off the sides of the glass in a rippling effect. Meanwhile, the viewer is also looking through the glass and then through the water at the submerged characters, as well as at the water itself.
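The chromatic-aberration trick Mark describes – scaling one colour channel from the image centre so the drift grows toward the edges – can be sketched with NumPy. This is a minimal nearest-neighbour version for illustration, not ILM’s Nuke plug-in; the function name and scale factor are invented.

```python
import numpy as np

def chromatic_aberration(img, red_scale=1.003):
    """Misalign the red channel relative to green and blue by scaling it
    about the image centre. Pixels at the centre are untouched; the red
    record drifts outward progressively toward the edges, mimicking the
    colour fringing of a lens.

    img: float array of shape (h, w, 3), channels in RGB order.
    """
    h, w, _ = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    # For each output pixel, sample red from a position pulled toward the
    # centre, which pushes red content outward in the result.
    src_y = np.clip((ys - cy) / red_scale + cy, 0, h - 1).round().astype(int)
    src_x = np.clip((xs - cx) / red_scale + cx, 0, w - 1).round().astype(int)
    out = img.copy()
    out[..., 0] = img[src_y, src_x, 0]
    return out
```

A real plug-in would resample with filtering and let the artist dial the scale per channel and per axis, but the principle – only the red record moves, and only noticeably near the edges – is the one described above.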
Making Eyes
‘Rango’ was Mark’s first animated feature, as well as ILM’s, but because it was handled in such a filmic, live-action style, he dealt with very similar issues and challenges and also used live-action elements in compositing. Element libraries, built on existing libraries, were cultivated to create Rango’s world, including a huge variety of dust, flames and different kinds of fire, caustics and volumetric light shafts, among others. Dust was a major desert element and had to be created with many variations, each with its own characteristics, so the team created an arsenal of dust ‘characters’ and then learned when to use each one.

Force Multiplier
Mark finds the collaborative, iterative development process very powerful – a force multiplier. “The director’s vision is communicated to the artists by the various supervisors, but the artists’ ideas have to reach the director as well. For example, one of the compositors working on a scene with animals standing quite close to a glass window decided that you should be able to see their breath on the window. She added it, Tim liked the look of it and showed Gore Verbinski, who liked it as well and kept it in the shot.”
Going Bats
“Among the props were the Gatling guns the rodents fired from atop the bats to shoot at Rango’s stagecoach. As the guns revolved, the shells and bullets flying up and then falling to the ground needed another simulation, as did the flutter of the bats’ wings as they descended into the canyon. Rango’s ‘dress’, which he wears through most of the sequence, required some interesting simulations to get him to fit inside it while keeping it moving dynamically as he dangles from a rope attached to one of the bats, or runs up against the wall. These shots took a whole series of simulations.”

The sequence took several months to complete as the different teams tackled different aspects or elements separately, working on shots that weren’t in production at any given time. “We generally do the simulations before the visual effects. For example, we did the pillar break-up, with the rocks, looks and timing, and got approval for that work. Then we moved the shots to the effects team to add dust, particles and debris. These would be rendered with the simulations and composited together,” said Eric.

The colour palette was specifically developed and directed throughout the sequence from start to finish. The director was conscious of making the earthy red he wanted for the rocky walls look different on each side of the canyon, to indicate which way the light was falling. Mark said, “This was a challenge that had to be managed from shot to shot. We sometimes had to cheat slightly and move the sun’s position to get the correct sort of shading on the walls. The sun might be in the correct position technically, but the resulting shadows may not work visually in the shot. Problems like this can occur on a film set as well. You have to light each scene as a hero shot and establish a flow between them as well.”

Looking Real
Words: Adriene Hurst
Images: Courtesy of Paramount Pictures/ILM Singapore