Engine House in England created an animated short film in less than three months with a very small team while working from home. The project was adapted from a children’s book called Wylder by Maia Walczak, and rendered entirely as final pixels in Unreal Engine.
Earlier this year, Engine House was awarded an Epic MegaGrant after Epic noticed the animation work the studio was producing using real-time rendering. The studio had been an early adopter of GPU rendering with Octane when it was still a beta release. Epic wanted to help the team keep working on their own IP, something they intend to continue in the future.
Engine House comprises two animators, Mike Richter and Jason Robbins, and a producer, Natasha Price. The team produces 2D and 3D animation, VFX, immersive experiences and marketing content with a network of freelancers. The diverse types of projects they have worked on include animated cutscenes for ‘Assassin's Creed Chronicles: China’ and ‘India’, an immersive 360° video created with the help of astrophysicists from Exeter University, and their own short animated film The Ship, directed by Natasha as part of the Channel 4 Random Acts Project.
Outside the Comfort Zone
The team had followed Unreal Engine’s development for a while and, after watching Rebirth, a short film showcasing the Quixel Megascans library that is now included with the engine, they were impressed with the quality of the real-time rendering. They immediately watched a tutorial series that broke down how the shots were achieved, and started figuring out how to apply the same techniques to their own work going forward.
Once the pandemic started and the Engine House team were under lockdown, they took time to learn Unreal Engine by creating a trailer for an internal project. At first, working outside their familiar 3D software and relying on the tools in Unreal instead caused a lot of confusion. “But we persevered and it became really fun to use,” said Jason.
Soon, the team was familiar enough with the software to use it to create a self-financed short film. Mike had been looking at the book ‘Wylder’ with his children. The story, told entirely in pictures without words, is about a boy and his father discovering the natural world together, and their relationship to it. He, Natasha and Jason were drawn to the illustrative style and the concept, and decided that the book would be the focus of the project.
From Animatic to Look Development
After obtaining permission from the author to transform the book into a short film, they gave themselves a deadline of only three months. The team was fortunate enough to get hold of the original illustrations as well, which they scanned, made into an animatic in Premiere Pro, and used to plan the transitions between scenes and the pacing, and understand how the story would play out.
For look-development tests in Unreal Engine, Mike modelled some of the main elements in 3ds Max, Mudbox and ZBrush, and also experimented with using the VR paint tool Tilt Brush to give a more illustrative feel to natural elements like trees and grass. Throughout the project Natasha, working as production coordinator and producer as well as on the story, kept a public production diary to document the process and experience.
Having the original illustrations from the book also meant they could use them for the textures. “We loved Maia’s use of very pronounced textures, especially the wood patterns in and on the cabin. It’s a key theme in the illustrations of the book, so it made sense to carry it into the design of the animation.” The assets were textured in the engine itself, and then the scene was lit. “Each light adds to the depth and texture and brings out elements from the foreground and background,” said Mike. “We were able to add gradients, variation, areas of brighter volumetric light and colour, which is more like painting a picture with lights, in order to get a storybook feel.”
Sequence Structure
For the 3D blockout, Jason began by exporting the initial Premiere animatic to Unreal Engine to create the sequence structure. He was able to keep a semi-live connection between Premiere and Unreal Engine, so that if a shot was cut out in Premiere, it would automatically update in Unreal Engine. If edits were made in Unreal Engine, the EDL could be exported back to Premiere to update the edit there.
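The semi-live connection described above can be pictured as a simple reconciliation between the edit and the 3D sequence. The Python sketch below is purely illustrative, not the actual Premiere/Unreal bridge: it treats the edit as the source of truth and computes which shots to drop from, or create in, the engine-side sequence. All shot names are hypothetical.

```python
# Illustrative sketch (not the real Premiere/Unreal bridge): the edit is the
# source of truth, and the 3D sequence is reconciled against it.

def sync_shots(edit_shots, sequence_shots):
    """Return (to_remove, to_add) so the 3D sequence matches the edit."""
    edit = set(edit_shots)
    seq = set(sequence_shots)
    to_remove = sorted(seq - edit)   # cut from the edit -> drop from the sequence
    to_add = sorted(edit - seq)      # new in the edit -> create in the sequence
    return to_remove, to_add

removed, added = sync_shots(
    edit_shots=["sh010", "sh030", "sh040"],
    sequence_shots=["sh010", "sh020", "sh030"],
)
print(removed)  # ['sh020'] - cut in Premiere, removed from the sequence
print(added)    # ['sh040'] - added in the edit, created in the sequence
```

Going the other way, exporting an EDL from the engine back to Premiere, is the same reconciliation in reverse.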
Next, he blocked out the main action against the animatic using 3D primitives to represent the characters, props and environments. Unreal Engine has a set of tools called Sequencer that is used to manipulate Actors, cameras, meshes, properties, lights and other assets within a scene and design their motion over time. Sequencer supports 3D transformations such as translation, rotation and scaling within a non-linear editing environment in which the tracks and keyframes are set out along a timeline.
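As a rough illustration of what a keyframed track does, the sketch below evaluates a single float channel (say, an Actor's X translation) at an arbitrary frame using linear interpolation. Sequencer's real tracks support curve-based interpolation and many more property types, so treat this only as the underlying idea.

```python
# Minimal sketch of evaluating a keyframed track on a timeline: linear
# interpolation between keys, constant extrapolation outside them.

def evaluate_track(keys, frame):
    """keys: sorted list of (frame, value); returns the value at `frame`."""
    if frame <= keys[0][0]:
        return keys[0][1]
    if frame >= keys[-1][0]:
        return keys[-1][1]
    for (f0, v0), (f1, v1) in zip(keys, keys[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)  # 0..1 position between the two keys
            return v0 + t * (v1 - v0)

# A translation track: the Actor moves 100 units over 24 frames.
track = [(0, 0.0), (24, 100.0)]
print(evaluate_track(track, 12))  # → 50.0
```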
Through that process he could set up the camera angles, which would go through to the final action, and figure out the scale for the world. “Because everything here is a virtual asset, created by an artist working in isolation – characters built by one person, sets by another, for example – we needed a reference to base the size of everything on, so that all assets were at an appropriate size relative to anything else.
“When we began the 3D blockout and camera setups, we created a reference object of the same height as the father, and scaled everything else relative to that. We chose a character because they are the most complex – it’s best to avoid having to scale them,” said Jason.
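The reference-object approach boils down to simple arithmetic: express every asset's intended size as a multiple of the father's height and derive a scale factor from how the asset was actually modelled. A minimal sketch, with an assumed reference height since the article does not give one:

```python
# Hedged sketch of scaling assets against a reference character.
FATHER_HEIGHT_CM = 180.0  # assumed value for the reference object

def uniform_scale(modelled_height, target_relative_to_father):
    """Scale factor that puts an asset at its intended in-world size."""
    target_cm = target_relative_to_father * FATHER_HEIGHT_CM
    return target_cm / modelled_height

# A cabin modelled 1.0 units tall that should stand 1.5x the father's height:
print(uniform_scale(1.0, 1.5))  # → 270.0
```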
More Real-time Looks
As the actual assets were completed, mainly by Mike, they replaced the previs elements piece by piece. For the environments, he created a main 3D scene, along with flat background elements to lend an illusion of depth. Meanwhile, the team continued working on look development, one of their favourite aspects of the engine. Because creating in real time is more efficient, it encourages them to take time to be more creative as they work.
For instance, they experimented with a free 2D post-process material that simulates pencil-hatching. One challenge they encountered with this effect was that, when the characters moved, the surfaces appeared to slide beneath the hatching, which was distracting. Randomizing the position of the hatching on every frame to counter this was also distracting, so in the end they moved the hatching position on the characters on every second frame.
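The "every second frame" fix amounts to holding a deterministic random offset across pairs of frames, effectively animating the hatch pattern on twos. The Python sketch below only illustrates the timing logic; in the film the effect lives in a post-process material, not in script.

```python
import random

# Illustrative sketch of the compromise described above: re-randomizing the
# hatch offset every frame is too jittery, never moving it makes surfaces
# slide beneath the pattern, so the offset is held for two frames at a time.

def hatch_offset(frame, seed=0):
    """Deterministic 2D UV offset for the hatch pattern, held for 2 frames."""
    rng = random.Random(seed * 100003 + frame // 2)  # same seed for frame pairs
    return rng.random(), rng.random()

# Frames 0 and 1 share an offset; frame 2 jumps to a new position.
assert hatch_offset(0) == hatch_offset(1)
assert hatch_offset(0) != hatch_offset(2)
```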
Mike said, “The stylised look of ‘Wylder’ was an ongoing process throughout the creation. It was of course really helpful to have the book to refer to, and the deeper we got into the project, the more we would share ideas or notice new things that we might want to include. The hatching post-process was a later addition; about halfway through, we only had a little hatching in some of the textures that we wanted to push. It also helped to tie it all together nicely.”
Taking Control
Jason also remarked that, like different painting techniques or materials, every piece of 3D software brings its own inherent look and feel, because the software does a lot for the artist. “If you place a light in the scene, the computer will calculate how that light falls on a face and where the shadows should go. If you don’t consciously make efforts to control that in some way then the end result will always look familiar to other work in the same software.
“In Unreal, however, you have the flexibility not only to control the 3D elements in the scene, but also to apply effects to the rendered image as a whole. By applying screen-space effects such as the pencilled cross-hatching to the shadows, you’re taking the visuals somewhere new.”
Character Effects
Once the main character modelling was completed – the boy and his father, a deer and a family of boars – they could think about character effects like hair, fur and clothing. Hair grooms were created in the Ornatrix plugin for Maya by freelancer Andrew Krivulya, author of a series of tutorials, including many on creating hair for Unreal Engine, and then fed into Unreal Engine’s strand-based hair and fur system and tested. To enhance the illustrative quality for the deer, the strands were made unusually thick to look like pencil strokes.
Until recently, artists have used card-based techniques to create hair for real-time engines, approximating the shape and motion of large numbers of individual hairs. However, using the groom system and tools in Unreal Engine allows rendering the individual strands to improve the visual fidelity of simulated hair in real time. Although no standard format exists for hair grooms, the groom system can use Alembic files to ingest data exported from a modelling application.
“The hair is critical for these characters because their actual features are so small that much of their performance would need to come from the way the eyebrows and the father’s beard move,” said Natasha. “Both of the main characters need to be extremely sympathetic, warm and emotive. We are including subtle details – the texture on their jumpers, the handful of silver hairs in Dad’s beard – as a means of expression. Without dialogue or complex facial features, how otherwise do we know how nice the Dad is if we don’t see him in a woolly jumper?”
Animation Import
The next stage of the project was rigging and animating the characters in Maya. As Jason worked, he modelled 3D versions of the characters’ tiny, 2D eyes found in the book to use as blend shapes. The mouths were rigged with bones. “The eyes were actually a very simple setup,” Mike said. “In the book, the characters’ eyes are depicted as either dots, arcs, lines or a kind of comma shape. So we just made these shapes using the same topology so that we could blend between them. While the actual blending frames aren’t seen, there is a little motion blur. Had we just swapped out the shapes, that would have been missing.”
Then they brought all of the animation into the environments, adding final cameras, colour grading and post effects – all in Unreal Engine – and pulling the whole project together. Natasha described this as the fine-tuning stage, a last chance to polish the look and add the last 10 percent of quality.
Unreal uses an FBX pipeline to import animations, which must first be exported individually with one animation per skeletal mesh in each single file, although animations can be imported with or without skeletal meshes.
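That one-animation-per-file constraint implies some export bookkeeping: each (skeletal mesh, animation clip) pair becomes its own FBX file, so a consistent naming scheme keeps the files unambiguous. A small sketch of that bookkeeping, with purely illustrative character and clip names:

```python
# Illustrative sketch of FBX export bookkeeping for Unreal's animation
# import: one animation per skeletal mesh per file.

def fbx_export_list(characters):
    """characters: dict of skeletal-mesh name -> list of animation clip names.
    Returns one FBX filename per (mesh, clip) pair."""
    return [
        f"{mesh}_{clip}.fbx"
        for mesh, clips in sorted(characters.items())
        for clip in clips
    ]

files = fbx_export_list({
    "Father": ["walk", "chop_wood"],
    "Boy": ["walk", "point"],
})
print(files)
# ['Boy_walk.fbx', 'Boy_point.fbx', 'Father_walk.fbx', 'Father_chop_wood.fbx']
```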
Outputting Final Pixels
Most of the cameras in this project are fairly static. The team could create a static camera angle used as the viewer’s POV during a shot, and then set up a trigger volume; once the viewer passes out of that camera’s field of view, the POV transitions to a new static camera. Further transitions are created by adding more overlap volumes and Camera Actors. For the views with slight camera moves, those moves can be set up and animated in Sequencer.
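The camera-switch logic can be sketched in a few lines: each static camera owns a trigger volume (modelled here as a 2D axis-aligned box for simplicity), and the active camera becomes whichever volume currently contains the viewer's position. Camera names and box extents are hypothetical.

```python
# Minimal sketch of trigger-volume camera switching, using 2D AABBs to stand
# in for Unreal's overlap volumes.

def point_in_box(p, box):
    (px, py), ((x0, y0), (x1, y1)) = p, box
    return x0 <= px <= x1 and y0 <= py <= y1

def active_camera(viewer_pos, volumes, current):
    """volumes: dict of camera name -> ((min_x, min_y), (max_x, max_y)).
    Returns the camera whose volume contains the viewer, else keeps `current`."""
    for cam, box in volumes.items():
        if point_in_box(viewer_pos, box):
            return cam
    return current

volumes = {"CamA": ((0, 0), (10, 10)), "CamB": ((10, 0), (20, 10))}
print(active_camera((5, 5), volumes, "CamA"))   # viewer inside CamA's volume
print(active_camera((15, 5), volumes, "CamA"))  # crossed into CamB's volume
```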
The team carefully thought over the frame rate, and chose to render at 12 fps rather than 24 fps to give the piece a more traditional feel. In the early days of animation, a keyframe animator wouldn’t necessarily draw in-betweens for all 24 frames needed for one second of film – only very fast movements require such a high frame rate.
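In practice, playing 12 fps footage in a 24 fps container means each rendered frame is held for two frames, the classic "on twos" look of hand-drawn animation. A trivial sketch of that conform step:

```python
# Sketch of conforming a 12 fps render to 24 fps playback: hold each frame
# for two frames ("animating on twos").

def conform_to_24fps(frames_12fps):
    """Duplicate each 12 fps frame so the sequence plays back at 24 fps."""
    return [f for f in frames_12fps for _ in range(2)]

held = conform_to_24fps(["f0", "f1", "f2"])
print(held)  # ['f0', 'f0', 'f1', 'f1', 'f2', 'f2']
```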
At that point they were ready to output the final pixels, with no other software required except for credits and fades. “The image sequence was rendered out of UE4 and turned straight into an mp4 from there,” Mike said.
Cloud Server and Version Control
‘Wylder’ was essentially created by a team of freelancers, everyone working remotely from one another with groups of people working on it simultaneously. Therefore, the team set up their own cloud server via Amazon’s EC2 service. The artists use that together with Perforce’s Helix Core version control system, which works alongside Unreal Engine’s Level Sequences and Subscenes system.
“For instance, one person can edit the lighting sequence in a shot while another person is editing the animation or cameras; when you both check in your updated file, everyone then has the latest complete version with all changes in place,” he said. “I can be working on a shot and spot an issue, and Skype Mike about it. He enters the system on his end and makes a change, while I carry on working. When he’s done, he Skypes me back to get the latest files and the fix happens in my project – without ever leaving Unreal.”
The ability to see near final-quality images very early on was one of the capabilities that had interested the team most about working with Unreal Engine. “Ever since learning to work with GPU rendering in Octane, part of our pipeline has always been to aim for a render that looks polished very early in a project,” said Mike. “We do this almost as a proof of concept, forcing us to ask questions early on about modelling style, materials, lighting and so on, long before we have all of the assets created.
“It really helps us to iterate quickly, which has a huge impact on the creative result and allows for more fluidity and even happy accidents as we go,” he said. “Unreal Engine takes this a step further for us by letting us see everything at once in Sequencer, when usually this would be a separate process—by which point it’s often too late to change anything.”
The Edge of What’s Possible
Jason said, “Once you have your shots set up, you have a lot of flexibility to realize your creative vision, without having to keep opening up other programs and rendering or importing external assets.” He also appreciates that you can keep all of your shots in one place, without having to open up a separate scene file for each. “It has the dynamism of a nonlinear editing package but with the power to actually manipulate the scenes in the timeline,” he said.
When it comes to the types of looks that are possible for creating nonlinear content, the team has found Unreal Engine to be extremely flexible. “The cinematography toolset is clearly made with real-world filming in mind,” said Jason. “The motion blur, grain, grading and exposure controls are really powerful. What’s more, it’s adaptable to any visual style.”
The Engine House team
Mike feels that the tools for games have advanced and accelerated enough to handle very complex scenes and lighting, and allow animators, filmmakers and other content creators to work directly in the end result rather than waiting for frames to render. “Furthermore, the quality is good enough for major studios to use in feature film or TV work,” he said. “You have immediate feedback on your work, and all your scenes can sit together without switching between files to work on different shots. The percentage of your workday that goes into directly improving final screen content has massively increased.
“There’s always been that struggle in the early stages of projects when a client is signing off on character images out of context, and a previs or blockout that doesn’t resemble how the final animation will actually look, and that’s hard for people who don’t know the process. Clients are having to sign off on things they don’t fully understand. Now we can get to the important conversations much earlier and actually all be talking about the end result.” www.unrealengine.com