The Moving Picture Company post production studio in London has recently
created some astonishing digital visual effects and animation for ‘GI Joe: Rise
of Cobra’ and a multi-national ad campaign for Evian, ‘Skating Babies’.
Backing up their efforts is a research and development team devoted to
proprietary software and CG innovation.
Digital Media World Magazine issue 118 out now


GI Joe: Design Intensive
“Half of MPC’s ‘GI Joe’ work, and by far the most complicated, was the underwater, full-CG sequence. We had 70 to 80 shots with very few real, live-action plates in them, just a couple of inserts,” said VFX Supervisor Greg Butler. “Whenever your work approaches full CG, the number of things you need to develop concepts for and design is endless. Although I had supervised a handful of full-CG shots on previous films, this was my first experience with such a high number of shots that were entirely created in the computer.”
When MPC started on GI Joe, shooting was just wrapping up and the Art Department was nearly closed. As MPC started production, the Department handed over whatever it had designed, such as a few early designs for the Shark and Mantis fighter craft. For the MARS Cobra base and its underwater environment, all they had was a mood and general style. A lot of its detail – the actual number of structures, the scope, and the rocks it was based in – still had to be worked out, and this became their most design-and-build-intensive task. The main base was a series of roundish pods laid out at the front, with towers and other structures emerging from that.
After reviewing the previs turned over to them, Greg and the team realised that they would need a 3D build of the whole structure. The action was going to take place all around the base at various distances and deep within it. “I decided to go for a full 3D build at a resolution that would hold up for a majority of the shots we anticipated. According to the show schedule, by the time we had most of our animation approved, and therefore most cameras locked, the build would have to be ready for rendering. If we could have delayed the build process a few months, we could have gotten away with some sections of the build at a much lower resolution,” he said.

Asset Development

“A few shots that got very close to the base had additional projected matte painting work done to increase the resolution. The rocky terrain surrounding the base structures was modelled, but the shading and texturing were done using a series of matte-painted projections. All the work was done in our standard asset and environment pipeline, which is Maya and RenderMan based.”
They spent three to four months in the asset department on designing, modelling, texturing, look development and lighting tests. “One of the toughest challenges on GI Joe was getting all of the assets designed and built to a high level of detail early enough so that we still had enough time left to light, render and comp the shots. We also had a huge number of FX simulations to run, since every underwater shot required multiple FX elements and these could only be produced once the animation was approved.”

Underwater Environment

A major creative and R&D hurdle was establishing a believable underwater environment – specifically, what would happen to light and visibility, and what would float around to give a sense of volume? They began investigating as much real, scientific reference material as possible on light travelling through water, and on how the colour and value of light fall off or degrade as distance from a light source increases. They had to consider a number of factors.
For example, if the light source is above the water, light shining down into it rapidly loses its red values and intensity. Once enough water lies between you and the source, very little light may be left. At the same time, from the camera's point of view, any light shining down on an object will lose colour and value as it bounces off the object and travels toward the camera. Combining these factors gave them a complex shading algorithm that would determine – given the distance from the light to the object in one direction, and from the object to the camera in the other – what would be left in terms of colour and value.
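The two-path falloff described above can be sketched with a simple Beer-Lambert exponential per colour channel. This is only an illustrative model, not MPC's actual shader; the absorption coefficients here are hypothetical, chosen so that red attenuates fastest, as it does in real water.

```python
import math

# Hypothetical per-channel absorption coefficients (per metre).
# Red is absorbed fastest in water, blue the slowest; real values
# depend on water clarity and would be tuned artistically.
ABSORPTION = {"r": 0.30, "g": 0.07, "b": 0.03}

def attenuate(colour, d_light, d_cam):
    """Attenuate an RGB colour (0-1 floats) along both paths:
    light source -> object (d_light) and object -> camera (d_cam),
    using simple Beer-Lambert exponential falloff per channel."""
    total = d_light + d_cam  # total water the light travels through
    return tuple(
        c * math.exp(-ABSORPTION[ch] * total)
        for c, ch in zip(colour, "rgb")
    )

# A white object 5 m below the surface, seen from 10 m away:
r, g, b = attenuate((1.0, 1.0, 1.0), d_light=5.0, d_cam=10.0)
# Red falls off far more than blue, giving the familiar blue-green cast.
```

With both distances folded into one exponential, the same function can answer the shading question posed above: given the two path lengths, what colour and value remain.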

Epic Battles

“Director Stephen Sommers really wanted epic battles in this movie. He wanted to see 20 to 30 Sharks attacking a force of 30 to 40 Mantis craft, with explosions accompanying them and the base included in-frame,” said Greg. ‘Star Wars’ and ‘Thunderball’ were both key references for the climactic underwater battle sequence. The problem lay in incorporating what worked about both films into the same battle.
“‘Thunderball’ has great shots of scuba divers in hand-to-hand combat, with constant streams of air bubbles reminding the audience of the role that water itself plays in the building sense of threat and violence. The ‘Star Wars’ films are an archetype of epic battles between large numbers of futuristic craft, intercut with close-ups of actors to keep the audience emotionally connected. These two ideas don’t go together easily due to the difference in scales and the lack of visibility in real-life underwater photography.
“For example, we had some shots of 60-foot-long Sharks attacking the Cobra base, as seen from a camera placed a mile away. Just being able to see the ships attacking was a very big cheat of underwater lighting and visibility. Given that the clearest water anywhere only gives about 0.5 km of good visibility, realistic underwater lighting wasn’t going to be an option.”
New Tools
So, their R&D effort into the fall-off of light in water had to be redirected. Instead of rendering with all those constraints in place and facing the difficulty of controlling visibility, they stripped the procedure back and applied that lighting component during compositing, letting the compositors pull elements back into the depth of the water. “So, in the end, that work really just gave us a scientifically based tool to treat the level of visibility completely artistically.
“A couple of other new tools the R&D team created were a curl noise field that effects artists could use to affect particles and objects without the higher cost of a full fluid dynamics simulation, an adaptive method for fast rendering of volumetric shadows, which we presented at SIGGRAPH 2009, and a RenderMan shader to attenuate a light’s colour and intensity within a virtual water volume.”
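The curl noise field mentioned above is a well-known trick (popularised by Bridson's 2007 curl-noise paper): taking the curl of a smooth scalar potential yields a divergence-free velocity field, so advected particles swirl fluid-like without a full simulation. A minimal 2D sketch, with a hand-rolled potential standing in for real noise (MPC's actual implementation is not public):

```python
import math

def potential(x, y):
    """A smooth scalar potential standing in for a noise function
    (a production tool would use Perlin or simplex noise)."""
    return math.sin(1.3 * x) * math.cos(1.7 * y) + 0.5 * math.sin(2.1 * x + 0.9 * y)

def curl_velocity(x, y, eps=1e-4):
    """2D 'curl noise': v = (d(psi)/dy, -d(psi)/dx), evaluated with
    central finite differences. The curl of any smooth potential is
    divergence-free, so particles driven by it swirl without
    bunching up -- a cheap stand-in for fluid dynamics."""
    dpsi_dx = (potential(x + eps, y) - potential(x - eps, y)) / (2 * eps)
    dpsi_dy = (potential(x, y + eps) - potential(x, y - eps)) / (2 * eps)
    return (dpsi_dy, -dpsi_dx)

# Advect a particle through the field with a few Euler steps:
px, py, dt = 0.2, 0.3, 0.05
for _ in range(10):
    vx, vy = curl_velocity(px, py)
    px, py = px + vx * dt, py + vy * dt
```

Because the field is divergence-free by construction, effects artists get plausible swirling motion at a tiny fraction of the cost of solving the fluid equations.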

Underwater Explosions
Another issue for the team was the need to produce some dazzling ‘eye candy’. The battles gave them the opportunity to work on underwater explosions, where immense pressure creates a classic explosion-implosion followed by a slight drift. All the film references they looked at showed practical, real explosions shot in tanks, all at a much smaller scale than the events being depicted. However, shooting even a miniature explosion underwater is very dangerous, and the cost and resources required to stage explosions across 100 shots necessitated CG from the start.
“For fluid dynamics work, we tended to use three methods. For underwater explosions and water surfaces, seen in a handful of interior ‘docking bay’ shots, we used Scanline’s Flowline software. For large air releases we used Maya’s fluid solver. For most other cases, such as bubble trails, we drove particles with various fields in Maya. Flowline worked very well for underwater explosions. It has a fast simulation engine that could be manipulated to give a sense of external forces pushing against the explosion. It was also able to simulate a good transition from fire to smoke.”
They did some initial tests with Flowline, making a mass of water explode outward with gravity ‘faked’, so the blast neither rose nor sank but drifted as if in a vacuum. Some of these early tests were very successful. They developed those and combined custom Flowline explosions – used when a specific contact, such as the side of the sub blowing out, required simulating the collision across the surface at a critical moment – with general battle-scene explosions, for which they built a series of pre-rendered explosions from different angles that the compositors could drop in as required and track through. This simple card approach let them put many high-cost simulations into several shots.
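Dropping a pre-rendered card into a shot ultimately comes down to the standard Porter-Duff ‘over’ operation used by every compositing package. A minimal per-pixel sketch, assuming premultiplied-alpha RGBA values in the 0-1 range (not any specific package's API):

```python
def over(fg, bg):
    """Porter-Duff 'over': composite a premultiplied RGBA foreground
    pixel over a background pixel. This is the core operation behind
    dropping a pre-rendered explosion 'card' into a shot in comp."""
    fr, fg_g, fb, fa = fg
    br, bg_g, bb, ba = bg
    inv = 1.0 - fa  # how much of the background shows through
    return (fr + br * inv,
            fg_g + bg_g * inv,
            fb + bb * inv,
            fa + ba * inv)

# A half-transparent orange explosion pixel over a dark blue water pixel:
pixel = over((0.5, 0.25, 0.0, 0.5), (0.0, 0.05, 0.2, 1.0))
```

Because the explosion elements were rendered once from several angles, the compositors could reuse this one cheap operation per shot instead of re-running a simulation.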

Animating the Subs
The main underwater craft built for the film was the ‘GI Joe’ sub, a 900-foot, classically styled submarine with a launching bay at the back for 48 Sharks, the smaller fighter craft engaged in most of the battle shots. On the MARS side – the villains – the sleek, modern, stretched-out Yacht and the Mantis attack craft also represented significant design work. MPC had some starting designs for all but the ‘Joe’ sub. During the 1980s, the original Hasbro toy series had included Shark and Mantis fighter craft, which influenced MPC’s designs.
Animation Lead Julio del Rio Hernandez worked on animating the subs to look fast and dynamic while still carrying a certain mass and weight. They are very large vehicles and couldn’t look like toys or small cars. As they began to seek reference, the animators soon realised that aircraft engaged in jet fights and dogfights made more appropriate references, because their movements better resembled the way the subs needed to move.

Shot Library
Their most challenging shot, in which 25 smaller Shark fighter subs deployed from the main sub, presented a series of timing and compositing problems. First, as the main sub moved forward through space, the rig where the Sharks were gathered was rotating. Then the Sharks had to deploy into a very small space, following a pattern. The lead sub needed to be at a certain point in frame at a certain time, so that the animators could blend the CG subs with the live-action plate. Right after that moment, they had to blend back into the CG sub and reveal the huge Cobra base in the background.
The team built up an extensive library of flying cycles and fighting cycles, which they could use to complete several shots. For example, sequences taking place in the interior of the base that needed to show battles going on outside through the windows could be shot with green screen windows, and the library shots dropped in.

Evian’s Skating Babies

'Skating Babies’, a multi-national TV and online campaign for Evian, combined choreographed, roller-skating babies with the music of The Sugar Hill Gang's ‘Rapper's Delight’. After Director Michael Gracey initially described the project, MPC decided they would have to recreate a 3D baby in post-production, animated either with motion capture or with the in-house animation skills at MPC. The team created CG baby bodies in Maya and carried out live-action head replacement and compositing.
As a test, they filmed a baby on green screen performing the actions Gracey had specified – first bouncing down and doing the splits, and bouncing back up again. Next, they filmed a professional skater doing essentially the same moves, but performing while watching the film of the baby and trying to emulate the baby’s arm and leg movements.

Test Shot
These performances gave VFX Supervisor Dean Robinson material with which to connect the two images and form an amalgamation replicating the actions of both. The resulting wireframe baby body was tracked into the background shot, textured and lit. The live-action head was tracked in, and the figure was graded and given shadows. This test shot was shown to the agency and eventually won the job for MPC.

Creating Super Tools
For many projects, MPC’s VFX team integrates 'off the shelf' tools into their pipeline. But in a field that is constantly changing, projects arise in which they need to achieve specific goals. They now have a group of software developers and computer graphics specialists who have developed several in-house proprietary tools.
ALICE, Artificial Life Crowd Engineering, is a tool-set designed for positioning and choreographing crowds, from a handful of characters to a disciplined army or swarming aliens. First developed for the crowds in Wolfgang Petersen's 'Troy', ALICE was further developed for 'The Chronicles of Narnia: Prince Caspian', where it was used to populate wide shots and backgrounds with creatures, placing and animating characters right up into the foreground. Motion capture, keyframe animation and physics simulation can be used interchangeably as inputs, while individual ALICE agents can still be given fine-detail attention.
PAPI, the 'Physics API' based on Havok, helps in scenes involving falling, colliding and constrained rigid bodies. Initially developed as a scripting tool for 'Troy' and 'Kingdom of Heaven', it has since gained a user interface that makes setting up simulations faster, while complex scenarios involving breaking and shattering objects can be controlled by setting up 'events'.
Furtility is a surface-dressing tool, primarily for creating photorealistic fur and hair for CG characters, but also feathers and grass. Combined with MPC's in-house hair simulation software, it gives artists better control over fur movement and interactions, including stiffness, weight, and external and environmental effects such as wind, rain and movement. It was used to create the woolly mammoths for '10,000 BC' and characters in 'Narnia: Prince Caspian’.
ISIS comprises software for digital set reconstruction: a series of 2D still photos of a building can be built into a full 3D model. The results can be seen in the set extensions for 'Sweeney Todd'.

Words: Adriene Hurst

Featured in Digital Media World. Subscribe to the print edition of the magazine and receive the full story with all the images delivered to you. Only $77 per year.
PDF version only $27 per year
Copyright 2009 Digital Media World. For syndication please contact us.