VFX supervisor Kelly Port and digital FX supervisor Darren Hendler
share details of animating pixies, building a fairy’s wings and flying
through clouds for Disney’s fantasy, ‘Maleficent’.

Digital Domain Captures Pixie Dust for ‘Maleficent’

Digital Domain started work on ‘Maleficent’ late in 2012, shortly after principal photography had been completed from July to October that year. The film was shot mainly at Pinewood Studios, with some location work. Digital Domain’s visual effects supervisor Kelly Port was based in London for the project, and was able to spend time on set with Carey Villegas, the production's senior visual effects supervisor.

Digital Domain’s integration team also worked on set collecting set reference, HDR data and photogrammetry images for reconstruction of the sets and certain characters. It was important to record the set layout, location of the lights and positions of the actors. For exact measurements and spatial reference, a survey was taken with physical markers.

Three Pixies

One of Digital Domain’s main tasks on this movie, and certainly the most challenging, was creating a trio of tiny CG pixies – Flittle, Knotgrass and Thistletwit – based on three real actresses who also appear in their full-sized form in several sequences. The challenge lay in giving their pixie versions slightly altered, diminutive proportions and facial characteristics, animated to fly around and perform pixie behaviours, while remaining instantly recognizable as the actresses themselves.


“Their animation was based on motion capture and extremely precise facial capture sessions with the actresses, which were held in October after principal shooting wrapped,” Kelly Port said. “All three pixie characters were captured performing together inside one large volume to preserve genuine interaction between them. Digital Domain runs a complete motion capture department at its facility in LA, and was fortunately able to bring members of its own crew to work on the pixies, along with some of its specialized gear, hiring other equipment in London.

“Body and facial data were captured simultaneously, again to extract and preserve as natural a performance as possible from the actresses as they worked together, not isolated in a sound booth. Digital Domain’s custom facial camera set-ups are precisely adjusted to work for each actor, and are based on the company’s proprietary helmet cameras.”


Capturing Magic

Production VFX supervisor Carey Villegas took a special interest in having actors play a digital likeness of themselves. “He had previously been working with Digital Domain on ‘Paradise Lost’, where considerable development work had been devoted to similar processes, and he wanted to continue refining the systems,” said Darren Hendler, Digital Domain’s digital effects supervisor. “Much of the methodology focussed on providing the actors with as natural an environment as possible to play out their roles together, and then translating those performances to digital doubles.

“During motion capture, he felt the actors should all be together, playing off one another for timing, capturing face and body motion simultaneously as in live action, and also working with the director. They should be able to use the stunt rigs themselves wherever possible to let them fly around and add that true pixie quality to their own performance.”


To capture the actresses’ physical characteristics in extreme detail for their 3D models, Digital Domain accessed equipment at the Institute for Creative Technologies at the University of Southern California, with whom the company has kept a working relationship for many years. Using the ICT’s scanner, which produces very accurate photogrammetric imagery of surfaces, the fine details of each pixie were captured down to the pores of the skin, at very high resolution.

A separate texture map was also captured and created using polarized filters and a series of high-res images for colour and specular values. This kind of imagery is now standard for digital double production. The ICT also has the Light Stage dome, which records a range of lighting conditions that can be used as precise lighting reference in post production.

True Performance

Armed with this data, each pixie went through a controlled design process. Kelly said, “In the story, we first see them in their small pixie form, about one-third the size of an adult, before they are transformed by magic into their human versions and eventually return to pixies. Our artists strove to find an ideal balance between the real actors’ looks and the typical characteristics people associate with pixies – a slightly larger head, large eyes, childlike ‘cuteness’ and a body that is proportioned to look as if it could fly.


“At the same time, it was important not to lose their original looks but only to stylize them, altering proportions and changing the set of the eyes. Concept art from the movie’s director Robert Stromberg was stylized to the point of looking almost cartoonish, so Digital Domain’s initial designs looked this way too, but we later migrated back to something much closer to reality.”

They also struck a balance between reality and fantasy in the performance. The flying rigs attached at the waist, unlike a pixie’s wings, which attach at the shoulders, and the stunt coordinators controlling the rigs – on poles, wires or both – in some ways became a problem for animation. However, the performance capture did produce quantities of great reference for recreating each actor’s approach to her role. For example, Flittle, the blue pixie, liked to kick her feet with small bursts of energy, and other nuances came through very clearly in the data. Thistletwit liked to spin and twirl.


Preserving these traits from the motion capture was very valuable, even though the problems – weight distribution or the rig operators interfering with the performances – had to be addressed with keyframe animation and clean-up. Other shoots, such as pickups, could also be enhanced with the personality traits derived from the sessions plus keyframing. Some of the detailed previs, prepared mainly by The Third Floor, could be incorporated into the animation by enhancing it in this way.

Facial Action Coding

The most intense effort was applied to the pixies’ faces. The team used the Facial Action Coding System, FACS, in which a scan records exactly what a person’s face looks like at one precise moment. The process had to be repeated for every one of each pixie’s expressions, a demanding operation but necessary in order to break down each face into its different muscle groups and define what happens to them during each expression. This meant capturing each actress in 300 to 400 different facial moves. Some expressions related to emotions but others were purely technical – lifting a certain muscle or stretching an area. The team sent the actresses training DVDs to prepare them for the FACS recording sessions.


“The next step was equally challenging and also critical to what the audience sees,” said Darren. “We had to figure out how to wrap this huge collection of muscle and facial data into a software product the animators could readily use for the pixies. It had to be something quite simple to run on the facial capture itself, such as dialling in a smile that will instantly look just like that actor’s smile.

“A massive amount of work goes on under the hood on those tools, let alone on the animation they are used to create. We were in effect preparing a complete version of the actor’s face rig to encompass thousands of different expressions – plus controllers, skin jiggles and other flesh dynamics. Our aim throughout was to match the actor 100 per cent at every moment.”
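The article does not reveal Digital Domain’s rig internals, but the idea of ‘dialling in a smile’ from captured FACS shapes maps onto a standard blendshape setup, sketched minimally below. The function names and data are illustrative assumptions, not the studio’s actual tools.

```python
# Minimal sketch of a FACS-style blendshape rig: each captured expression is
# stored as per-vertex offsets (deltas) from the neutral scan, and a control
# value in [0, 1] dials how much of each expression is mixed in.
# All names and data here are illustrative, not Digital Domain's actual rig.

def apply_blendshapes(neutral, shapes, weights):
    """Combine weighted expression deltas on top of the neutral mesh.

    neutral: list of (x, y, z) vertex positions from the neutral scan
    shapes:  dict mapping shape name -> list of (dx, dy, dz) per-vertex deltas
    weights: dict mapping shape name -> dial value in [0, 1]
    """
    result = [list(v) for v in neutral]
    for name, delta in shapes.items():
        w = weights.get(name, 0.0)
        if w == 0.0:
            continue
        for i, (dx, dy, dz) in enumerate(delta):
            result[i][0] += w * dx
            result[i][1] += w * dy
            result[i][2] += w * dz
    return [tuple(v) for v in result]

# Toy example: a 2-vertex "face" with one smile shape dialled in halfway.
neutral = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
shapes = {"smile": [(0.0, 0.2, 0.0), (0.0, 0.4, 0.0)]}
posed = apply_blendshapes(neutral, shapes, {"smile": 0.5})
```

In a production rig, thousands of such shapes are combined, with corrective shapes, controllers and flesh dynamics layered on top, but the mixing principle stays the same.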

Pixie Hair

Thistletwit’s hair, a mass of curls, was an interesting challenge. This hair was created with Samson, Digital Domain’s proprietary hair software, which handles hair in much the same way as a person cuts and grooms real hair. In order for it to work and respond like physical hair, specialized artists groom and pose it prior to the simulation, which ensures that once the actress is in motion and moving her head, each hair flows naturally with her motions and interacts with other hairs and objects realistically.


As visual reference against which to compare their CG hair for Thistletwit, they studied videos of the live action character wearing her wig while jumping and twisting about. How does hair like hers move at different moments, in different contexts? How do the curls get their volume? Comparisons and adjustments were repeated through numerous iterations. When the team revisited Samson to decide what they needed for this project, they made it work in a more procedural way. They had never encountered such complex hair – a mass of coils that produced uneven clumping, flyaway strands and unexpected interactions between the hairs.
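Samson is proprietary, so the sketch below only illustrates the generic kind of strand dynamics such a groom-then-simulate workflow rests on: each hair is a chain of points pinned to the moving head, with distance constraints holding segment lengths so the groomed shape survives motion. This is a position-based-dynamics-style step under assumed names and data, not Digital Domain’s solver.

```python
# Generic sketch of hair-strand dynamics: pin the root to the moving head,
# then enforce fixed segment lengths down the strand so curls keep their
# groomed proportions. Everything here is illustrative.

def simulate_strand(points, root_target, iterations=10):
    """One relaxation step over a 2D strand (list of (x, y) points)."""
    rest_lengths = [
        ((points[i + 1][0] - points[i][0]) ** 2 +
         (points[i + 1][1] - points[i][1]) ** 2) ** 0.5
        for i in range(len(points) - 1)
    ]
    pts = [list(p) for p in points]
    pts[0] = list(root_target)          # root follows the head
    for _ in range(iterations):
        for i in range(len(pts) - 1):   # restore each segment's rest length
            dx = pts[i + 1][0] - pts[i][0]
            dy = pts[i + 1][1] - pts[i][1]
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            scale = rest_lengths[i] / dist
            pts[i + 1][0] = pts[i][0] + dx * scale
            pts[i + 1][1] = pts[i][1] + dy * scale
    return [tuple(p) for p in pts]

# A straight 3-point strand whose root is dragged sideways: the rest of the
# strand is pulled along while every segment keeps its original length.
strand = [(0.0, 0.0), (0.0, -1.0), (0.0, -2.0)]
moved = simulate_strand(strand, root_target=(0.5, 0.0))
```

A real hair system adds hair-to-hair and hair-to-object collisions, bending stiffness and per-curl clumping on top of this basic constraint step.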

To complete some of their shots, the Digital Domain team also built a digital double of Maleficent herself, requiring the same type of high-res scans as the pixies. Here, the primary challenge was not her face – which was always live action, or seen at a distance in non-speaking shots – but her wardrobe.

Costume Drama

The production had asked the artists early in preproduction to recommend a type of clothing and material that they could recreate digitally and work with effectively, without too many complex effects. In spite of their recommendations for something fairly close-fitting and controllable, the wardrobe was made of varied, thin, light materials in flowing layers that rippled in the wind, incorporating hair, bone and feathers. At least three full costumes were needed: one for Maleficent as an adult, another as a teenager and another as a young girl. The costumes were beautiful, but each one required simulation tests and careful fabric matching. As with the pixies’ hair, the character effects team took these simulations right through the animations, adjusting as they went.


“The fabric was sometimes a problem on set, getting tangled with rigs and wires, or the rigs might be visible through the material,” said Darren. “We had to build a hero CG wardrobe to replace parts of her body where rig clean-up wasn’t possible. We had detailed copies of all the materials, broke down the wardrobe panel by panel, and scanned Angelina in full costume at the character and texture shoot. Then the CG elements were animated to precisely follow her performance. From the very beginning, we made side-by-side tests of the animated wardrobe against Maleficent performing.”

The pixies’ clothes represented a more exotic set of tasks because they were made of flower petals, leaves, bark and other natural materials that had to move and function like clothing but keep their identifiable, organic traits like veins and ridges. The question here was how to design them to move dynamically with the characters but not break or look rigid. Darren said, “We actually tried making up pieces of a wardrobe with real plants to see what would happen in real life. It took a lot of experimenting, especially when the fuzzy, organic materials were interacting with Thistletwit’s curly hair style.”


Living Wings

Maleficent’s wings were critical elements of the story and involved the work of several teams at Digital Domain. As a help and guide to the CG and compositing teams, Angelina Jolie and the younger actress playing Maleficent as a girl had prosthetics attached to their backs in the position where the CG wings would be attached. These were photographed from many angles with tracking markers to suit any shot and body position. The actresses sometimes wore practical wings on set as well, but these served mainly as a guide for where and how the wings would be positioned in shots, and were not actually used in the final movie.

“The director wanted the wings to have personality and also exist as an extension to Maleficent’s own character, mirroring how she felt but retaining a life of their own,” Kelly said. “They consisted of hundreds of feathers, each one individually modelled and detailed. A key problem was collision of the feathers with each other and interpenetration. Many of the initial collision errors were prevented by the animation rig itself, but not entirely, so that individual adjustment was needed. We would first animate the wings with a lower resolution rig to define the moves, and then go back over the shots working at a higher resolution to look for feather errors and enhance their looks.”


The rig allowed the wings to bend, flex and perform freely, and also allowed the artists to control the shape, feel and look of the long flight feathers at all times, arranging them with the smaller feathers to avoid interpenetration. Darren said, “Unlike real birds’ wings they did not have a predictable set of moves, and Maleficent required complete control over every shape they assumed in any given pose. At times the animators had to make poses and positions that real wings of this design could never actually achieve – such as fitting into small spaces – look correct and natural.”

Ruffling System

Design reference for Maleficent’s wings included eagles, but the actress Angelina Jolie also liked the look of some slow motion footage of an owl in flight, especially at moments when the feathers ruffled in the wind. The artists were able to add a ruffling system to accommodate individual feather movements. “It’s an example of a controlled procedural system. Created in Maya, it isn’t a strictly physically accurate simulation because you can control how far it goes and give it different patterns or random noise patterns to simulate the wind blowing,” Kelly said.
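Kelly describes the ruffle as a controlled procedural system rather than a physical simulation. A minimal sketch of that idea, assuming a sine-based gust travelling along the wing plus per-feather random phase; all parameters and names are illustrative, not Digital Domain’s Maya setup:

```python
import math
import random

# Sketch of a controlled procedural ruffle: each feather gets a small rotation
# driven by a wind "gust" signal plus per-feather random phase, with an overall
# strength dial so the effect stays art-directable rather than simulated.

def ruffle_angles(num_feathers, time, strength=1.0, gust_speed=2.0, seed=7):
    """Per-feather ruffle angles (degrees) at a given time."""
    rng = random.Random(seed)
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(num_feathers)]
    angles = []
    for i in range(num_feathers):
        # A travelling sine wave gives a coherent gust moving along the wing;
        # the random phase breaks it up so feathers don't move in lockstep.
        gust = math.sin(gust_speed * time - 0.4 * i + phases[i])
        angles.append(strength * 5.0 * gust)   # max ruffle of ~5 degrees
    return angles

angles = ruffle_angles(num_feathers=50, time=1.25, strength=0.8)
# Dialling strength to zero switches the ruffle off entirely.
flat = ruffle_angles(num_feathers=50, time=1.25, strength=0.0)
```

The strength dial and the choice of noise pattern are what make such a system controllable: an artist can push the ruffle further than the wind would, or mute it, per shot.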


Lighting the wings started from the light Maleficent was shot in for key and fill, using HDR-based environment lighting for realism and tweaking the HDR map from that point. This approach allowed realistic anisotropic effects – light reactions on surfaces that change depending on the angle they are viewed from – for close-up detail of the feathers. Maya is the basis of the studio’s pipeline for modelling, rigging and animation, with lighting and rendering in V-Ray. Water and magical effects, waterfalls and debris were generally created in Houdini and rendered in Mantra. Compositing was carried out in Nuke.

New Territory

The environments we see when Maleficent is introduced to viewers as a young teenager were another of Digital Domain’s tasks. She flies over a lush rock and canyon landscape, over waterfalls, streams and trees, and up into voluminous cloudscapes – all rendered in full native stereo.

This entirely CG landscape had to match the art direction of MPC’s landscape in the opening sequence, although Maleficent is seen flying over different territory that demanded a lot of water effects and simulations. Although a dedicated environment team built the geometry and matte paintings using Maya and Nuke, the sequences involved elements from several different teams at Digital Domain. The water splashes, waterfalls and mist came from the effects team, mainly using Houdini. Matching the art direction meant adding extra CG flowers, trees and foliage, requiring diverse compositing techniques including deep compositing. Using the lighting as well to pull the look together, the environment team worked all elements into one coherent package.


The movie itself was shot monoscopically, but full CG sequences like this benefit from being rendered out with a second stereo eye so that they do not need to be dimensionalised later on. It meant, however, that 2D elements or painting were not options, and everything had to be created with CG imagery.
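As a rough sketch of what rendering a second stereo eye involves, the snippet below offsets the primary camera along its right axis by an interaxial distance and estimates screen parallax for parallel cameras converged by image shift. The formula is a textbook simplification; the values and names are illustrative, not the production’s actual settings.

```python
# Toy stereo-camera helpers. Assumed, simplified model: parallel cameras,
# convergence achieved by horizontal image translation.

def second_eye(position, right_axis, interaxial=0.065):
    """Offset the camera position along its right axis (metres)."""
    return tuple(p + interaxial * r for p, r in zip(position, right_axis))

def parallax(depth, interaxial=0.065, convergence=5.0):
    """Horizontal parallax (camera-space units) of a point at the given depth.

    Zero at the convergence distance; positive values appear behind the
    screen plane, negative values in front of it.
    """
    return interaxial * (1.0 - convergence / depth)

# Primary camera at the origin looking down -Z, so its right axis is +X:
right_cam = second_eye((0.0, 1.5, 0.0), (1.0, 0.0, 0.0))
```

Because the second eye is just another render from a shifted camera, flat 2D paint-overs have no depth to offer it, which is why everything in these sequences had to exist as true CG.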

Digital Domain has considerable experience in water effects, but having to share a shot in which delicate, fast-moving carriage faeries, created by the team at MPC, jump into and out of their complex CG water added an interesting challenge to the work, especially considering the stereo factor. Lighting, rendering, animation and compositing were all involved.

Darren said, “To start, we used previs to gain a basic idea of the animation they could expect, and then later on MPC used our cameras and environments to complete the animations, which they passed back to us as Alembic files that revealed the geometry and precise location of each fairy, frame-by-frame. This information was used to run the water simulations so that the water would flow and splash very accurately around the geometry. Knowing exactly where the characters are in 3D space allowed a perfect composite.”
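Alembic I/O itself is omitted here, but the downstream idea Darren describes can be sketched with a toy 1D height-field solver: known per-frame character positions inject splashes exactly where the geometry meets the water. This is a deliberately simplified stand-in for the Houdini simulations, with illustrative names, not the actual setup.

```python
# Toy 1D wave-equation water surface driven by known character positions
# (in production these would come from the Alembic caches, frame by frame).

def step_heightfield(heights, velocities, impacts, damping=0.99):
    """Advance the height field one frame.

    impacts: list of (cell_index, strength) splashes from character positions.
    """
    n = len(heights)
    for idx, strength in impacts:       # inject energy where characters land
        velocities[idx] -= strength
    new_heights = heights[:]
    for i in range(n):
        left = heights[i - 1] if i > 0 else heights[i]
        right = heights[i + 1] if i < n - 1 else heights[i]
        # Pull each column toward the average of its neighbours (wave spread)
        velocities[i] += 0.5 * ((left + right) * 0.5 - heights[i])
        velocities[i] *= damping
        new_heights[i] = heights[i] + velocities[i]
    return new_heights, velocities

heights = [0.0] * 16
velocities = [0.0] * 16
# Frame 1: a fairy hits the water over cell 8; later frames just propagate.
heights, velocities = step_heightfield(heights, velocities, [(8, 1.0)])
for _ in range(5):
    heights, velocities = step_heightfield(heights, velocities, [])
```

The key point matches the article: because the sim knows exactly where the characters are, the splash appears precisely at the point of entry, which is what makes the final composite line up.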


Over the Clouds

Clouds have been a specialty at Digital Domain for several years. The 2013 film ‘Ender’s Game’ was a chance to develop their skill in cloud systems further, and made the cloud work in ‘Maleficent’ much more straightforward. “It takes time to set up these systems,” said Kelly. “They are fully volumetric and also involve elements flying in and around them – in this case, Maleficent. Therefore we needed a proxy or low-res version of the clouds to pass to the animators, so they could be both precise and creative with their performances on the digital double, moving in and around the cloud, along the cloud canyon walls and so on. What was tricky was determining exactly where a cloud really starts and where it ends. They are also massive, so that when you are close to them it becomes hard to define edges.”
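One way to produce the kind of low-res cloud proxy Kelly mentions is simply to average blocks of the full-resolution density grid down to a much coarser grid that is cheap to display in the animation package. A toy sketch, using nested lists in place of a real volume format; this is an assumed approach, not Digital Domain’s pipeline:

```python
# Build a low-res proxy of a volumetric cloud by averaging factor^3 blocks of
# the dense density grid (nested lists stand in for a real volume format).

def downsample_volume(density, factor):
    """Average factor^3 blocks of a dense 3D grid into a coarse proxy grid."""
    nx, ny, nz = len(density), len(density[0]), len(density[0][0])
    out = []
    for x in range(0, nx, factor):
        plane = []
        for y in range(0, ny, factor):
            row = []
            for z in range(0, nz, factor):
                total, count = 0.0, 0
                for dx in range(factor):
                    for dy in range(factor):
                        for dz in range(factor):
                            if x + dx < nx and y + dy < ny and z + dz < nz:
                                total += density[x + dx][y + dy][z + dz]
                                count += 1
                row.append(total / count)
            plane.append(row)
        out.append(plane)
    return out

# A 4x4x4 cloud with uniform density 1.0 collapses to a 2x2x2 proxy of 1.0.
full = [[[1.0] * 4 for _ in range(4)] for _ in range(4)]
proxy = downsample_volume(full, factor=2)
```

The averaging also illustrates the edge problem Kelly raises: at coarse resolution a cloud's wispy boundary blurs into intermediate densities, so "where the cloud starts" becomes a threshold choice rather than a hard surface.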

Robert Stromberg had specific ideas about the clouds. They had to be both realistic and art-directable. As an art director and visual effects artist himself, he could take frames of the shots they were working on and paint directly on top of them to show the exact type and shape of clouds he wanted. This had benefits for the VFX artists but was demanding.


“He has such a good eye,” Darren remarked. “As well as up close, the clouds needed to be rendered with several different, dynamic lighting conditions from sunrise through to sunset, producing a range of looks. We had to bring a view or version of the clouds from Houdini, where they were created and rendered out of Mantra, into Maya, the animation software, not only so the animators could see what she is flying through but also to make the clouds interact with her – leaving trails around them or punching through them.

“It took a lot of back and forth, and was complicated by the fact that Maleficent was flying much faster than any real creature could ever fly. If our tools were aligned to physically correct speeds, simulations involving a character flying at 1,000 mph wouldn’t have the desired look and feel. We still had to convince the audience.”

http://digitaldomain.com/