V-Ray 7 for 3ds Max Supports 3D Gaussian Splats, Virtual Tours and Luminaires
Chaos V-Ray 7 for 3ds Max brings Gaussian Splat support for fast photoreal environments, new ways to create interactive virtual tours and more realistic, controllable lighting to 3D rendering.
VFX Supervisor Morgan McDermott at Impossible Objects talks about making their immersive film for UFC’s debut at the Sphere, combining heroes and symbols of Mexican independence with UFC legends.
Chaos puts Project Arena’s Virtual Production tools to the test in a new short film, achieving accurate ICVFX with real-time raytracing and compositing. Christopher Nichols shares insights.
Moving Picture Company (MPC) has appointed Lucinda Keeler as Head of Production for its London studio, bringing over 20 years of experience and leadership in the VFX industry.
REALTIME studio has launched a Virtual Production division, following its grant from Media City Immersive Technologies Innovation Hub to develop a proprietary Virtual Production tool.
ZibraVDB plugin for Virtual Production and CG studios delivers high compression rates and fast render times, making it possible to work with very large volumetric effects in real-time.
Maxon One 2025 updates Cinema 4D, Redshift, Red Giant and Cineware, and releases ZBrush for iPad, putting ZBrush sculpting tools into a mobile device with a new UI and touch controls.
Das Element asset library software version 2.1 has new video playback controls, hierarchy tree customisation for libraries, faster set-up processes and simpler element migration.
Autodesk returned to SIGGRAPH 2024 to show software updates that include generative AI and cloud workflows for 3D animation in Maya, production scheduling and clip retiming in Flame.
Shutterstock launched a cloud-based generative 3D API, built on NVIDIA Edify AI architecture, trained on licensed Shutterstock content, as a fast way to produce realistic 3D models with AI.
Freefolk has promoted Rob Sheridan to VFX Supervisor in their Film and Episodic division, and Paul Wight is now the company’s first Chief Operating Officer.
Golaem 9 includes a new animation engine, helps create new shots without simulations, and has better crowd control, helping artists improve the quality of their animated hero characters.
Blender is becoming a standard 3D software application and, according to AMD, needs to fit into larger workflows alongside other applications. Pixar’s Universal Scene Description (USD) has made exchanging data between 3D applications easier for artists, providing a robust, open way to exchange and assemble data from multiple applications. AMD says that Blender users should be able to enjoy the same ease of use and dedicated experience with USD.
Blender USD Hydra
AMD has launched a project enabling USD data assembly and rendering inside of Blender. Brian Savery, Professional Graphics Software Development Manager for AMD, said, “Blender includes a basic USD exporter, and soon will include import tools. However, there is no method of rendering existing USD data within Blender or referencing a USD file into your Blender scene. Other tools that support USD, such as SideFX Houdini or Autodesk Maya, also allow assembly and manipulation of USD data.
“Furthermore, while Blender users create intricate shader networks for its Cycles path-tracing render engine, they need a way to share shading networks with other applications. USD includes a rendering system called Hydra that allows multiple renderers with one common interface. AMD adapted this system to work directly inside Blender. By adapting Hydra as a render add-on to Blender, any renderer that supports Hydra can be connected to Blender by plugging into the Hydra add-on.”
Also, the Hydra system delivers scene updates and render results very quickly, which leads to better renderer performance than typical Blender rendering add-ons achieve. Currently this add-on includes the Hydra OpenGL renderer and the AMD Radeon ProRender plug-in for Hydra, though other Hydra render delegates should work equally well.
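The plug-in model described above can be illustrated with a toy sketch in plain Python. This is not AMD's add-on code or the real Hydra C++ API; the class and registry names are invented for illustration. The point is that every renderer implements one common interface, so the host application never needs renderer-specific code:

```python
# Toy sketch of a Hydra-style render delegate pattern (illustrative only --
# not the actual Hydra API, which is a C++ plug-in system).

class RenderDelegate:
    """Common interface every renderer must implement."""
    def render(self, scene: dict) -> str:
        raise NotImplementedError

class OpenGLDelegate(RenderDelegate):
    def render(self, scene):
        return f"OpenGL raster of {len(scene['prims'])} prims"

class ProRenderDelegate(RenderDelegate):
    def render(self, scene):
        return f"ProRender path trace of {len(scene['prims'])} prims"

# The host (a stand-in for the Blender add-on) holds a registry of
# delegates and only ever talks to the common interface.
REGISTRY = {"GL": OpenGLDelegate(), "ProRender": ProRenderDelegate()}

def render_with(delegate_name: str, scene: dict) -> str:
    return REGISTRY[delegate_name].render(scene)

scene = {"prims": ["cube", "light", "camera"]}
print(render_with("GL", scene))         # OpenGL raster of 3 prims
print(render_with("ProRender", scene))  # ProRender path trace of 3 prims
```

Swapping renderers is then a one-line change for the user, which is exactly the interchangeability the Hydra add-on gives Blender.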
USD Scene Composition
Another important aspect of USD support is enabling USD scene composition in Blender. AMD achieves this with a custom node graph, allowing users to pull in external data to mix with Blender data and to filter, manipulate and export USD data. This enables tools for pruning data, referencing data without loading it into Blender’s memory, collaboration between multiple artists, and exporting composited scenes for offline rendering.
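The core idea behind this kind of composition can be sketched in plain Python (a real pipeline would use the pxr.Usd API; the layer contents here are invented). Each layer holds sparse "opinions", and stronger layers win over weaker ones where they overlap, which is how a local override can sit on top of referenced data without copying it:

```python
# Toy sketch of USD-style layer composition (illustrative only).
# A referenced asset layer supplies defaults; a shot layer overrides
# only the attributes it cares about.

asset_layer = {"chair.color": "oak", "chair.height": 0.9}   # referenced file
shot_layer  = {"chair.color": "walnut"}                     # local override

def compose(*layers_weak_to_strong):
    """Apply layers weakest-first so stronger opinions override weaker ones."""
    composed = {}
    for layer in layers_weak_to_strong:
        composed.update(layer)
    return composed

scene = compose(asset_layer, shot_layer)
print(scene)  # {'chair.color': 'walnut', 'chair.height': 0.9}
```

The override layer stays tiny and the referenced asset is never modified, which is what makes this model practical for multiple artists working on the same scene.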
Similar to USD geometry, AMD handles materials using the open source MaterialX standard. Created by Industrial Light and Magic for sharing material graphs across renderers, MaterialX is quickly gaining acceptance as a standard material format. This makes it possible to import material node graphs from Adobe Substance 3D Painter and various Autodesk applications, as well as export them.
MaterialX is a growing standard with increasing adoption across applications. To help encourage adoption, AMD plans to launch a free Material Library for sharing MaterialX materials on AMD’s GPUOpen.com. Users will be able to download materials from it and import them directly into the Blender Hydra plug-in.
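For reference, a MaterialX document is plain XML: shader nodes plus a material that binds them. The fragment below is a hand-written illustration of the format, not a file from AMD's library; the node names and values are assumptions:

```xml
<?xml version="1.0"?>
<materialx version="1.38">
  <!-- A surface shader with a couple of overridden inputs -->
  <standard_surface name="SR_brass" type="surfaceshader">
    <input name="base_color" type="color3" value="0.9, 0.8, 0.5"/>
    <input name="metalness" type="float" value="1.0"/>
  </standard_surface>
  <!-- The material that binds the shader -->
  <surfacematerial name="M_brass" type="material">
    <input name="surfaceshader" type="surfaceshader" nodename="SR_brass"/>
  </surfacematerial>
</materialx>
```

Because the format is renderer-agnostic XML like this, the same material description can travel between Substance 3D Painter, the Autodesk applications and the Blender Hydra plug-in.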
The video linked here is presented by the author of the AMD USD plug-in for Blender as an overview of the main features. www.amd.com
Unreal Engine 4.27 is now available with updates supporting filmmakers, broadcasters, game developers, architectural visualisation artists, and automotive and product designers.
In-camera VFX
The use of in-camera VFX is now more efficient, with results of a quality suitable for wider applications such as broadcast and live events.
Designing set-ups in nDisplay, Unreal’s tool for LED volumes and rendering to multiple displays, is simpler to manage due to a new 3D Config Editor. All nDisplay-related features and settings are placed in a single nDisplay Root Actor to make them easier to access. Setting up projects with multiple cameras is also easier.
nDisplay now supports OpenColorIO, improving the accuracy of colour calibration by matching content created in Unreal Engine to what the physical camera captures from the LED volume.
For efficient scaling in nDisplay, multiple GPUs are supported. This makes it possible to maximise resolution on wide shots by dedicating a GPU to the in-camera pixels, and to shoot with multiple cameras, each with its own uniquely tracked field of view.
A new drag-and-drop remote control web UI builder is now available to help build complex web widgets without writing code. This makes it possible for users without Unreal Engine experience to control their results from the engine on a tablet or laptop.
Camera Control
Also, the Virtual Camera system built for Unreal Engine 4.26 now includes Multi-User Editing, a redesigned user experience and an extensible core architecture – that is, it can be extended with new functionality without modifying the original codebase. A new iOS app, Live Link Vcam, is available for virtual camera control – users can drive a Cine Camera inside Unreal Engine using a tablet or other device.
A new Level Snapshots function saves the state of a given scene and can later restore any or all of its elements, for pickup shots or as part of an iteration phase. Users also have more flexibility when producing correct motion blur for travelling shots, accounting for the look a physical camera would produce with a moving background.
Recently, Epic Games and filmmakers’ collective Bullitt assembled a team to test all of these in-camera VFX tools by making a short test piece following a production workflow.
USD, Alembic and Workflow Connections
With this release, it’s now possible to export a wider variety of elements to USD, including Levels, Sublevels, Landscape, Foliage and animation sequences, and to import materials as MDL nodes. You can now also edit USD attributes from the USD Stage Editor, including through Multi-User Editing, and bind hair and fur Grooms to GeometryCache data imported from Alembic.
Datasmith is Unreal’s set of tools for importing data from various sources. In 4.27, Datasmith Runtime allows more control over how the data is imported, including access to the scene hierarchy and the ability to import .udatasmith data into a packaged application built on Unreal Engine such as the Twinmotion real-time architectural visualisation tool, or a custom real-time design review tool.
A new Archicad Exporter plugin with Direct Link functionality is available, and Direct Link has been added to the existing Rhino and SketchUp Pro plugins. Datasmith Direct Link maintains a live connection between a source DCC tool and an Unreal Engine-based application for simpler iteration. You can also aggregate data from several sources, such as Revit and Rhino, while maintaining links with each DCC tool simultaneously.
GPU Light Baking
Unreal Engine’s GPU Lightmass uses the GPU instead of the CPU to progressively render pre-computed lightmaps, using the new ray tracing capabilities of DirectX 12 (DX12) and Microsoft's DXR framework. It was developed to reduce the time needed to generate lighting data for scenes that require global illumination, soft shadows and other complex lighting effects that are expensive to render in real time.
Also, since the results can be seen progressively, the workflow becomes interactive. Users can stop, make changes and start over without waiting for the final bake. For in-camera VFX and other work, GPU Lightmass means that virtual set lighting can be modified much faster than before, for efficiency.
VR, AR and Mixed Reality
Production-ready support for the OpenXR framework has been added to make creating extended reality content – VR, AR and mixed reality – in Unreal Engine easier. OpenXR simplifies and unifies AR/VR software development, so that applications can run on a wider variety of hardware platforms without being ported or rewritten, and compliant devices can access more applications.
The Unreal OpenXR plugin allows users to target multiple XR devices with the same API. It now supports Stereo Layers, Splash Screens, and querying Playspace bounds to determine which coordinate space camera animation should play relative to. Extension plugins from the Marketplace add functionality to OpenXR without waiting for new game engine releases. The VR and AR templates have a new design with more features built in and faster project set-up.
Containers in the Cloud
Epic Games has continued to develop Pixel Streaming, which is now production-ready and has an upgraded version of WebRTC. It enables Unreal Engine, and applications built on it, to run on a cloud virtual machine while end users interact with them as normal from a regular web browser on any device, anywhere. 4.27 also adds Linux support and the ability to run Pixel Streaming from a container environment.
This new support for containers on Windows and Linux means that Unreal Engine can act as a self-contained, foundational technical layer. Containers are packages of software that encompass all of the necessary elements to run in any environment, including the cloud.
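As a generic illustration of the idea, a container image for an Unreal-based application is described by a short build file. The sketch below is hypothetical: the base image name and application paths are placeholders, not Epic's published images, though `-RenderOffscreen` and the Pixel Streaming launch arguments are standard Unreal Engine command-line options:

```dockerfile
# Hypothetical sketch of packaging a headless Unreal-based app in a container.
# Base image and paths are placeholders for illustration only.
FROM ghcr.io/example/unreal-runtime:latest

# Copy a packaged Linux build of the project into the image
COPY ./Packaged/LinuxNoEditor /app

# Run headless, e.g. for batch rendering or a Pixel Streaming instance
ENTRYPOINT ["/app/MyProject.sh", "-RenderOffscreen", "-PixelStreamingIP=0.0.0.0", "-PixelStreamingPort=8888"]
```

Because everything the application needs ships inside the image, the same build can be deployed to a local machine, a render farm node or a cloud service without modification.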
Container support includes new cloud-based development workflows and deployment strategies, such as AI/ML engine training, batch processing and rendering, and microservices. Continuous integration/continuous delivery (CI/CD) can be used to build, test, deploy and run applications in a continuous process. Unreal Engine containers can support production pipelines, develop cloud applications, deploy enterprise systems at scale and other development work. www.unrealengine.com
Autodesk’s Bifrost updates include virtual sliders for feedback port changes, unknown nodes for fixing broken graphs, expressive simulation graphs, and terminals for renderable geometry.
Animal Logic's USD ALab is a fully realised USD scene, intended to encourage further collaboration and exploration among the wider community into Pixar’s Universal Scene Description (USD). Animal Logic Group has now released USD ALab as open source software.
As an early adopter of USD, Animal Logic began transitioning its Sydney and Vancouver studios to an end-to-end USD-based pipeline, starting during production on ‘Peter Rabbit’ in 2017 and completing with ‘Peter Rabbit 2’ in March 2020.
Following its 2017 open source project AL_USDMaya, Animal Logic continues to promote broader USD adoption through the release of USD ALab, intending it to serve as a reference for many USD concepts. “We believe USD to be a foundational tool for our industry and broader communities, and we encourage the release of open source assets to educate and inspire others to conduct their own exploration,” said Group Chief Technology Officer, Darin Grant.
While open source data sets already exist, USD ALab is one of the first real-world implementations of a complete USD production scene. It is a full scene description from global assets through to shot outputs, including referencing, point instancing, assemblies, technical variants, global assets and shot-based overrides.
“There are two downloads available, including guiding documents and two sets of textures,” said Supervising Assets TD, Jens Jebens. “The first download contains the ALab scene assets themselves, derived from our production assets and conformed for compatibility to allow them to load in any tool that supports USD. The second download is an optional extra, a production rendering Texture Pack that delivers 4K OpenEXR textures with udims for production style rendering.”
“Beyond the USD assets, we’ve included documentation showing some new ideas and concepts from our experience using USD, including the idea of render procedural definitions, an extremely useful concept that we have not seen in USD to date,” Grant said. “We hope that this combination forms the starting point for future contributors to present their own ideas for discussion, promotion and, hopefully, adoption.”
“The ALab concept was born from Animal Logic’s Art Department,” said Head of Production Technology, Aidan Sarsfield. “Handed a brief for something ‘uniquely Animal’, the team came up with a great story that revolves around a secret backyard shed inhabited by a mad scientist of sorts. The resulting asset suite draws on the unique aesthetic that you’ll find in our studios, and there’s also some fun Easter eggs in there that link back to 30 years of the Animal Logic brand.”
USD ALab is also among the first sets of assets to adopt the Academy Software Foundation’s asset license. Animal Logic wanted to allow the broadest use of these assets to promote education, training and demonstration by students, studios and vendors. “Initially motivated by a desire to create unencumbered assets for our own demonstration and presentation purposes, we realised that the industry at large could use something similar and pushed to release them,” Aidan said. “I’m excited to see how ALab develops in the community, particularly as we will be extending the data set over time.”
The USD ALab data set is now available and hosted here on Animal Logic’s website through Amazon Web Services. animallogic.com
Christy Anzelmo at Foundry has moved up to become Chief Product Officer, leading the delivery of Foundry’s product portfolio for Media & Entertainment and Digital Design. Christy first joined Foundry in 2015 and has been working as Senior Director of Product, heading product management and product design on Foundry's VFX compositing and review tools Nuke, Hiero, Cara VR and Flix.
In this new role of CPO, she will work with the company’s product management and product design teams on the vision and roadmap for Foundry’s products, taking responsibility for introducing new tools and software.
Christy studied at Cornell University and the University of Colorado Boulder. Her background in product design and business management extends over 15 years, and includes product leadership experience in fashion, electronics and software. Over that time, she has thought of her role in terms of developing creative ideas into successful products, and has focussed on that process throughout her career.
She said, “I have also been fortunate enough to work with passionate and dedicated VFX and animation artists and the teams that support them. They have been a continuous source of inspiration and motivation as innovation went ahead for the Nuke family.”
Striking a Balance
Christy Anzelmo, Chief Product Officer at Foundry
Christy and the teams she works with need to carefully balance innovation with core improvements and maintenance. “That’s the ongoing challenge for established products like much of Foundry's portfolio. Because our products sit at the centre of pipelines, we tend to start from a baseline of technical and hardware requirements, following the VFX Reference Platform, for example.
“Then we build in core improvement projects that benefit the majority of artist workflows, and leave some space for the 'delighter' features – the ones artists aren’t expecting – and for niche workflows and experimentation. Since my background is in fashion, I tend to think of building a product plan like merchandising a collection. You need the right amount of black t-shirts and jeans, as well as trendy new ideas to excite customers and push the product line forward.”
Cultural Connections
She mentioned two aspects of Foundry’s culture that help manage this balance. “The first is the close connection between our Research team and the product development teams. Research tends to focus on longer-term technology trends, and product teams tackle mainly core engineering and shorter-term projects, but we are all focused on the same mission of accelerating artists and teams and we work collaboratively to deliver new technology to users.
“For example, the Machine Learning tools released in Nuke 13 came from a collaboration between the Research and Nuke teams. We could see potential in ML to accelerate image processing, often at the expense of artist control. To regain some of that control, the Research team started several projects, experimenting to determine whether implementing artist-driven ML training in Nuke would be feasible.
“Once the team established that this would be possible, the Nuke product designers, developers and product managers worked hand-in-hand with the Research engineers to develop this experiment into a production-usable toolset we could deliver in Nuke.”
The second area that Foundry’s product teams focus on is connecting and working with customers to stay on the pulse of what they need, in order to adapt quickly. Last year when the pandemic took hold, they quickly re-prioritised some longer-term projects related to remote review from Nuke Studio and Hiero, and roaming licensing features. As the industry shifted abruptly to remote working, those new tools and features were pushed out to clients straightaway.
Changes in the Workplace
Because the last 18 months have brought so much change for users, Christy sees her product management and product design teams’ roles changing as well. Nevertheless, she believes strong product management is crucial to keeping development efforts focused on what artists need. “That’s especially true during times when artist and studio requirements are evolving quickly. I expect we will push harder to collaborate and understand the artist experience, now that on-site studio visits are less likely to happen for a while.
“With the explosion of remote work in VFX, in the next product cycles I also see a greater emphasis on improving the user experience – that is, making Foundry tools easier to learn and use for artists working from home now and in the long term, as well as thinking about how we can enable more collaboration for remote teams. Luckily, we are seeing exciting technological developments that will enable new workflows of this kind, from USD to the expansion of cloud-based workflows.” www.foundry.com