Visual effects studios MPC and Mikros have formed an alliance of their episodic and film divisions. Mikros, a French company with over 35 years in the visual effects industry, has been a Technicolor brand since 2015.
The combined studios will operate under the MPC Episodic brand and will continue to offer pre-production and VFX services to the French features and episodic market, and beyond.
The strategic move to integrate Mikros’ episodic and film division with MPC Episodic bolsters MPC Episodic’s market presence as a full-service global VFX studio, adding to its established studios in Berlin, London, Los Angeles and Bangalore. Meanwhile, Mikros Animation and Mikros Advertising will remain as they are.
The Paris- and Liège-based Mikros studios are headed by Béatrice Bauwens, MD for France and Belgium and previously MD at Mikros VFX. Recent film and TV projects by the studios include ‘Into the Night’, ‘Lupin’, ‘Aline’, ‘The Forgotten Battle’ and ‘Annette’, which opened the Cannes Film Festival on 6 July.
“Bringing our creative and technical expertise to projects and getting involved with filmmakers to bring solutions has always been our driving force,” Béatrice Bauwens said. “We share with the MPC teams the same passion to find solutions that best serve the stories. The association of Mikros VFX and MPC Episodic is a great opportunity for the teams to have access to the world's highest technology and an exceptional community of talent for the benefit of our clients in Paris and Liège. We are proud to bring our French and Belgian touch to the MPC brand.”
MPC Episodic’s Global MD Tom Williams said, “I’m delighted to welcome the incredible talent of the Mikros VFX team. Joining forces affirms our commitment to provide the best VFX talent and technology to our clients anywhere in the world. France has a rich history of filmmaking and we are honoured to have the opportunity to work within the French market and offer our international clients access to the wealth of talent available in Paris and Liège, as well as more VFX production tax rebate options.”
Since its launch in January 2020, MPC Episodic has seen exponential growth, attracting some of the industry’s best creative talent including Executive VFX Supervisor Pete Jopling (The Third Day, Battleship), Executive VFX Producer Christopher Gray (The Witcher, The Boys), VFX Supervisor David Sewell (Chernobyl, Les Misérables) and VFX Supervisor Sheila Wickens (Doctor Who, Big Little Lies). MPC Episodic has since completed work for Sky / HBO’s series ‘The Third Day’, Warner Bros. Television’s ‘Pennyworth’ and ‘Lisey’s Story’ and is working with clients including AppleTV+, Bad Robot, BBC Studios and Amazon Studios. www.mpcepisodic.com
Visual effects studio FuseFX, which provides services for television, film and commercials from eight studios across North and South America and Australia, has promoted key staff members to leadership positions at its New York and Atlanta studios.
At FuseFX’s Atlanta location, launched in early 2020, Brian Kubovcik will now serve as head of studio and senior VFX supervisor. An Emmy-nominated VFX supervisor, Brian joined FuseFX's New York office in 2015 and is currently collaborating with filmmaker Ava DuVernay as VFX supervisor on her projects ‘DMZ’ and ‘Naomi’. He has previously worked on critically recognised series including ‘Pose’, ‘Invasion’, ‘The Blacklist’, ‘The Tick’, ‘Survive’ and ‘Mr. Robot’.
Brian is joined at the Atlanta studio by long-time FuseFX leader Lindsay Seguin. Lindsay has been a staple in the New York office since its inception in 2014. She makes the transition to Atlanta as head of production and executive producer. Along with Brian, she was part of the Emmy-nominated team for ‘Mr. Robot’, and served in prominent roles on other projects such as ‘When They See Us’, ‘Invasion’, ‘The Blacklist’, ‘Luke Cage’ and ‘American Made’.
Head of studio and senior VFX supervisor Brian Kubovcik
"I'm excited about leading this team of talented VFX professionals in Atlanta. For me, it's a special moment to be able to partner with my long-time colleague, Lindsay Seguin, to grow our Atlanta office together. Our goal is to harness and foster the talents of our valued artists to continue to deliver high quality work that our clients require," Brian said.
Lindsay also said, "I am super excited to partner with Brian in Atlanta. There is an amazing team here, and we are really looking forward to continuing to grow with them."
"Brian and Lindsay have been with FuseFX for a long time, providing leadership at our studio in New York," said FuseFX founder and CEO, David Altenau. "Having a couple of established veterans who have grown up within the New York market take the reins in Atlanta gives me great confidence about the team they are going to build there, and the visual effects they will produce. This is a well deserved opportunity for both of them to take new leadership roles within FuseFX and lead a new studio location."
Head of production and executive producer Lindsay Seguin
FuseFX in New York City
John Miller has been promoted to head of studio-NY and senior VFX supervisor. Working with him in new roles at the studio are executive producer Lauren Montuori and Ariel Altman, senior VFX supervisor and head of creative.
John has over 20 years of experience in television production, feature film production and post. He has created and managed visual effects, graphics and editorial departments at facilities in New York including Princzco Productions, Flicker VFX @ SMA, Moving Images, Point 360 New York and Perception NYC. His credits with FuseFX include ‘Mr. Robot’, ‘When They See Us’, ‘Iron Fist’, ‘Castle Rock’, ‘Tell Me A Story’ and ‘Sneaky Pete’. John has also worked on Marvel films, including ‘Avengers: Age of Ultron’ and ‘Captain America: The Winter Soldier’.
"I am excited to begin my new role as FuseFX New York's head of studio and senior VFX supervisor. I am looking forward to leading and growing the talented creative/production team as we continue to take on a larger number of complex projects, and strengthen our relationships with our global partners in the FuseFX community. Lauren, Ariel and I have been in the trenches on some of FuseFX NY's biggest projects, managing client creative and production expectations. Through our close working relationship, we have created a shorthand that has become invaluable as we move into our new roles," John said.
Lauren Montuori, the new executive producer, has more than 10 years of experience in visual effects and animation. Her recent work at FuseFX includes contributions to ‘Prodigal Son’, ‘Black Lightning’, ‘Iron Fist’, ‘The Dangerous Book for Boys’, ‘The Get Down’, ‘The Blacklist’ and ‘Mr. Robot’, which earned her VES and Emmy nominations. Lauren's previous experience includes ‘Horton Hears a Who’, ‘Brave’, ‘The Girl on the Train’, ‘Winter's Tale’, ‘The Secret Life of Walter Mitty’, ‘The Adjustment Bureau’ and ‘Salt’.
"I'm looking forward to partnering with John and Ariel while moving into this new role. We have an incredible team, and I look forward to continuing to help support the company's growth and talent with them in New York," she said.
Ariel Altman, now senior VFX supervisor and head of creative, has been part of the New York team since its launch in 2014. Ariel’s 10 years of experience as a compositor and supervisor include credits spanning television, commercials, broadcast media and music videos. Recent work includes HBO's ‘I Know This Much Is True’, ‘The Blacklist’, ‘Kevin Can Wait’, ‘Bull’, ‘American Made’, ‘Luke Cage’, and ‘Mr. Robot’, for which he also received an Emmy nomination in 2018.
"Our focus on creative approaches in partnership with our clients has been the core of our growth since opening the NY office seven years ago. I'm pleased to continue this work in my new role and continue our mission of telling great stories. I couldn't ask for better partners in John and Lauren," said Ariel.
Dave Altenau said, "John, Lauren and Ariel have been key members of our team and led projects for FuseFX in New York for many years. All three are consummate professionals with proven track records delivering award-level visual effects, and now have an opportunity to take our New York studio to the next level." fusefx.com
DNEG is increasing its Canada-based VFX and animation operations and opening a new studio in the Greater Toronto Area. The company’s fourth North American facility, joining studios in Montreal, Vancouver and LA, will initially employ up to 200 people, primarily in new technology positions, as well as VFX and animation roles. The new employees will initially work remotely, with the search for a physical studio location in the Greater Toronto Area now underway.
The opening of the Toronto studio is part of a wider expansion across Canada. The company also plans to expand its existing Vancouver and Montreal offices with up to 300 new roles, including up to 100 new positions for the DNEG Animation team as it opens its doors in Vancouver.
People interested in joining the teams in Canada can find the new opportunities and job postings on DNEG's careers page.
Commenting on the expansion, CEO Namit Malhotra said, “We are further investing in Canada by creating up to 200 new jobs in the Greater Toronto Area with the opening of our fourth North American studio. We are also continuing to build out our studios in Vancouver and Montreal to support our upcoming slate of VFX projects, and extending our successful DNEG Animation team to Vancouver as they move into production on five new feature animation projects. Growth in our Canadian talent and capabilities will help us strategically align with the demands of the entertainment industry and seize our new growth initiatives and content creation opportunities.”
DNEG’s Global CTO Paul Salvini said, “I’m excited that DNEG is bringing these new opportunities to the visual effects and technology communities of the Greater Toronto Area. As a resident of Kitchener / Waterloo myself, I’m well aware of the strength of the technology talent in this area. This is a great chance for technologists working in AI, machine learning, UX and across a broad spectrum of technology areas to refocus their talents on helping to create incredible imagery for some of the world’s biggest feature films and episodic series. We’re not fixed on candidates having previous film or media industry experience – we’re looking for passionate and curious technologists who are excited at the prospect of a new challenge.” www.dneg.com
Blender is becoming a standard 3D software application and, according to AMD, needs to be able to work within a larger workflow alongside other applications. Pixar’s Universal Scene Description (USD) has made exchanging data between 3D applications easier for artists, offering a robust, open way to exchange and assemble data from multiple applications. AMD says that Blender users should be able to enjoy the same ease of use and dedicated experience with USD.
Blender USD Hydra
AMD has launched a project enabling USD data assembly and rendering inside of Blender. Brian Savery, Professional Graphics Software Development Manager for AMD, said, “Blender includes a basic USD exporter, and soon will include import tools. However, there is no method of rendering existing USD data within Blender or referencing a USD file into your Blender scene. Other tools that support USD, such as SideFX Houdini or Autodesk Maya, also allow assembly and manipulation of USD data.
“Furthermore, while Blender users create intricate shader networks for its Cycles path-tracing render engine, they need a way to share shading networks with other applications. USD includes a rendering system called Hydra that allows multiple renderers with one common interface. AMD adapted this system to work directly inside Blender. By adapting Hydra as a render add-on to Blender, any renderer that supports Hydra can be connected to Blender by plugging into the Hydra add-on.”
Also, the Hydra system sends scene updates to the renderer very quickly, which leads to better rendering performance than typical Blender render add-ons achieve. Currently the add-on includes the Hydra OpenGL renderer and the AMD Radeon ProRender plug-in for Hydra, though other Hydra render delegates should work equally well.
USD Scene Composition
Another important aspect of USD support is enabling USD scene composition in Blender. AMD achieves this with a custom node graph, allowing users to pull in external data to mix with Blender data and filter, manipulate and export USD data. This allows tools for pruning data, referencing data without loading it into Blender’s memory, interaction between multiple artists, and exporting composited scenes for offline rendering.
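The kind of referencing such a node graph exposes can be sketched in USD’s own .usda text format – the asset paths and prim names below are purely illustrative:

```usda
#usda 1.0
(
    defaultPrim = "Shot"
)

def Xform "Shot"
{
    # A reference composes the city asset into this scene
    # without duplicating its data on disk.
    def Xform "City" (
        prepend references = @assets/city.usd@</City>
    )
    {
    }

    # A payload behaves like a reference but can be left unloaded,
    # keeping heavy geometry out of memory until it is needed.
    def Xform "Crowd" (
        prepend payload = @assets/crowd.usd@</Crowd>
    )
    {
    }
}
```

The payload arc is what lets heavy data stay on disk until it is explicitly loaded – the mechanism behind referencing data without pulling it into Blender’s memory.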
Similar to USD geometry, AMD handles materials using the open source MaterialX standard. Created by Industrial Light & Magic for sharing material graphs across renderers, it is quickly gaining acceptance as a standard material format. This makes it possible to import material node graphs from Adobe Substance 3D Painter and various Autodesk applications, as well as export them.
MaterialX is a growing standard with increasing adoption across applications. To help encourage adoption, AMD plans to launch a free Material Library for sharing MaterialX materials on AMD’s GPUOpen.com. Users will be able to download materials from it and import them directly into the Blender Hydra plug-in.
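A MaterialX material is a small XML document describing a node graph; a minimal example – the node names and values here are illustrative, not taken from AMD’s library – might look like:

```xml
<?xml version="1.0"?>
<materialx version="1.38">
  <!-- A standard_surface shader node with a few authored inputs -->
  <standard_surface name="SR_brass" type="surfaceshader">
    <input name="base_color" type="color3" value="0.89, 0.70, 0.35"/>
    <input name="metalness" type="float" value="1.0"/>
    <input name="specular_roughness" type="float" value="0.25"/>
  </standard_surface>
  <!-- The material node binds the shader for renderers to pick up -->
  <surfacematerial name="M_brass" type="material">
    <input name="surfaceshader" type="surfaceshader" nodename="SR_brass"/>
  </surfacematerial>
</materialx>
```

Because the graph is stored in this renderer-neutral form, any application that reads MaterialX can rebuild the same shading network.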
A video presented by the author of the AMD USD plug-in for Blender gives an overview of the main features. www.amd.com
Unreal Engine 4.27 is now available with updates supporting filmmakers, broadcasters, game developers, architectural visualisation artists, and automotive and product designers.
In-camera VFX
The use of in-camera VFX is now more efficient, with results of a quality suitable for wider applications such as broadcast and live events.
Designing set-ups in nDisplay, Unreal’s tool for LED volumes and rendering to multiple displays, is simpler to manage due to a new 3D Config Editor. All nDisplay-related features and settings are placed in a single nDisplay Root Actor to make them easier to access. Setting up projects with multiple cameras is also easier.
nDisplay now supports OpenColorIO, improving the accuracy of colour calibration so that content created in Unreal Engine matches what the physical camera captures from the LED volume.
For efficient scaling in nDisplay, multiple GPUs are supported. This also makes it possible to make the most of resolution on wide shots by dedicating a GPU for in-camera pixels, and to shoot with multiple cameras, each with its own uniquely tracked field-of-view.
A new drag-and-drop remote control web UI builder is now available to help build complex web widgets without writing code. This makes it possible for users without Unreal Engine experience to control their results from the engine on a tablet or laptop.
Camera Control
Also, the Virtual Camera system built for Unreal Engine 4.26 now includes Multi-User Editing, a redesigned user experience and an extensible core architecture – that is, it can be extended with new functionality without modifying the original codebase. A new iOS app, Live Link Vcam, is available for virtual camera control – users can drive a Cine Camera inside Unreal Engine using a tablet or other device.
A new Level Snapshots function saves the state of a given scene and can later restore any or all of its elements, for pickup shots or as part of an iteration phase. Users also gain more flexibility when producing correct motion blur for travelling shots, accounting for the look a physical camera would capture with a moving background.
Recently, Epic Games and filmmakers’ collective Bullitt assembled a team to test all of these in-camera VFX tools by making a short test piece following a production workflow.
USD, Alembic and Workflow Connections
With this release, it’s now possible to export a bigger variety of elements to USD, including Levels, Sublevels, Landscape, Foliage and animation sequences, and to import materials as MDL nodes. You can now also edit USD attributes from the USD Stage Editor, including through Multi-User Editing, and bind hair and fur Grooms to GeometryCache data imported from Alembic.
Datasmith is Unreal’s set of tools for importing data from various sources. In 4.27, Datasmith Runtime allows more control over how the data is imported, including access to the scene hierarchy and the ability to import .udatasmith data into a packaged application built on Unreal Engine such as the Twinmotion real-time architectural visualisation tool, or a custom real-time design review tool.
A new Archicad Exporter plugin with Direct Link functionality is available, and Direct Link has been added to the existing Rhino and SketchUp Pro plugins. Datasmith Direct Link maintains a live connection between a source DCC tool and an Unreal Engine-based application for simpler iteration. You can also aggregate data from several sources, such as Revit and Rhino, while maintaining links with each DCC tool simultaneously.
GPU Light Baking
Unreal Engine’s GPU Lightmass uses the GPU instead of the CPU to progressively render pre-computed lightmaps, drawing on the new ray tracing capabilities of DirectX 12 (DX12) and Microsoft's DXR framework. It was developed to reduce the time needed to generate lighting data for scenes that require global illumination, soft shadows and other complex lighting effects that are expensive to render in real time.
Also, since the results can be seen progressively, the workflow becomes interactive. Users can stop, make changes and start over without waiting for the final bake. For in-camera VFX and other work, GPU Lightmass means that virtual set lighting can be modified much faster than before, for efficiency.
VR, AR and Mixed Reality
Production-ready support for the OpenXR framework has been added, making it easier to create extended reality content – VR, AR and mixed reality – in Unreal Engine. OpenXR simplifies and unifies AR/VR software development, so that applications can be used on a wider variety of hardware platforms without having to port or rewrite code, and compliant devices can access more applications.
The Unreal OpenXR plugin allows users to target multiple XR devices with the same API. It now supports Stereo Layers, Splash Screens and querying Playspace bounds to determine which coordinate space camera animation should play relative to. Extension plugins from the Marketplace add functionality to OpenXR without waiting for new engine releases. The VR and AR templates have a new design with more built-in features and faster project set-up.
Containers in the Cloud
Epic Games has continued to develop Pixel Streaming, which is now production-ready and runs an upgraded version of WebRTC. It enables Unreal Engine, and applications built on it, to run on a cloud virtual machine while end users anywhere operate them as normal from a standard web browser on any device. 4.27 also adds Linux support and the ability to run Pixel Streaming from a container environment.
This new support for containers on Windows and Linux means that Unreal Engine can act as a self-contained, foundational technical layer. Containers are packages of software that encompass all of the necessary elements to run in any environment, including the cloud.
Container support includes new cloud-based development workflows and deployment strategies, such as AI/ML engine training, batch processing and rendering, and microservices. Continuous integration/continuous delivery (CI/CD) can be used to build, test, deploy and run applications in a continuous process. Unreal Engine containers can support production pipelines, develop cloud applications, deploy enterprise systems at scale and other development work. www.unrealengine.com
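As a rough sketch of how a packaged Unreal application might be containerised for Pixel Streaming – the base image name, file paths and user below are assumptions rather than Epic’s documented setup:

```dockerfile
# Assumed base image: Epic distributes official Unreal Engine container
# images via GitHub Container Registry, which requires an Epic-linked
# GitHub account; the exact tag here is illustrative.
FROM ghcr.io/epicgames/unreal-engine:runtime

# Copy a packaged Linux build of the project into the image.
# "MyProject" and the ue4 user are placeholders for your own build.
COPY --chown=ue4:ue4 ./LinuxNoEditor /home/ue4/project

# Launch headlessly with Pixel Streaming pointed at a signalling
# server reachable on the container network.
ENTRYPOINT ["/home/ue4/project/MyProject.sh", \
    "-RenderOffscreen", \
    "-PixelStreamingIP=signalling", \
    "-PixelStreamingPort=8888"]
```

A deployment like this would typically run alongside a signalling web server container, with an orchestrator such as Kubernetes scaling instances per user session.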