Unreal Engine 5.2 has been released, adding three Experimental features, updates to the virtual production tools and the VCam system, a new ML Deformer sample, and native Apple Silicon support.
Unreal’s 5.2 update includes an early version of a Procedural Content Generation framework (PCG) that can be used directly inside Unreal Engine without relying on external software. The framework includes in-editor tools as well as a runtime component.
Users can employ the PCG editing tools to define rules and parameters for populating large scenes with selected Unreal Engine assets, making the creation of large worlds faster and more efficient. The runtime component allows the system to run inside a game or other real-time application, making the world responsive to gameplay or changes to geometry.
Procedural Content Generation framework
The PCG tools can also be used for linear content requiring substantial numbers of assets, such as large architectural projects or film scenes. This framework is an Experimental feature that will be further developed over future releases.
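As an Experimental feature, the PCG tools ship as an engine plugin that must be enabled before the editor exposes the PCG graph and volume types. A minimal sketch of the corresponding .uproject entry, assuming the plugin's internal name is simply "PCG" as in Epic's 5.2 source tree (enabling it via Edit > Plugins in the editor has the same effect):

```json
{
  "Plugins": [
    { "Name": "PCG", "Enabled": true }
  ]
}
```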
Also new in this release is Substrate, a new way of authoring materials that gives artists more control over the look and feel of objects, both in real-time applications such as games and in linear content creation.
When enabled, it replaces the fixed suite of shading models with a more expressive, modular framework that gives a greater range of surface appearances and diverse parameters to work from. It is useful for defining layered looks, for example, liquid on metal or dust on clear coat.
As another Experimental feature, Substrate is not recommended for production work, but the developers are looking for feedback to help them continue to refine its functionality. To test the feature, users can enable it in their project settings.
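In practice, ticking the Substrate checkbox under Project Settings > Rendering writes a renderer setting to the project's config file. A minimal sketch, assuming the console variable name used in the 5.2 release (check your installed version's documentation if it differs):

```ini
; Config/DefaultEngine.ini
; Enables the Experimental Substrate material framework.
; Requires an editor restart and a full shader recompile.
[/Script/Engine.RendererSettings]
r.Substrate=1
```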
iOS app for ICVFX stage operations
For filmmakers using virtual production, a new iOS app for ICVFX stage operations, complementing the desktop ICVFX Editor, is coming soon for iPad via the Apple App Store. It will have a touch-based interface for stage operations such as colour grading, placing virtual light cards to light the subject, and managing nDisplay's real-time synchronised rendering across multiple outputs, all from anywhere within the LED volume. This lets a filmmaker take responsibility for achieving the desired look while filming is still underway, without having to involve the Unreal Engine operators.
Updates to Unreal Engine's VCam system increase the scope for creative decision-making during pre-production. For instance, filmmakers can now operate multiple simultaneous Virtual Cameras through a single editor instance, making it possible to create more complex, layered camera moves.
A third Experimental feature extends nDisplay support for SMPTE 2110, building on the initial work in Unreal Engine 5.1 towards support for upcoming ICVFX hardware. The feature is suitable for testing in Unreal Engine 5.2 as more hardware becomes available, and is expected to be viable for production in Unreal Engine 5.3.
Native Apple Silicon support
Native support for Apple Silicon has been added to the Unreal Editor, improving the user experience, performance and stability. The Universal Binary build of Unreal Engine, which natively supports both Apple Silicon and Intel CPUs, is now available to download from the Epic Games launcher.
A new ML Deformer sample is available for users who want to explore how machine learning can be used in Unreal Engine to create a high-fidelity real-time character for PC and consoles. It features deformations driven by full muscle, flesh and cloth simulation. The sample download includes an interactive demo sequence that shows muscles bulging and sliding under the skin, and folds forming on clothing.
Users can also compare the results with ML Deformer on and off, and animate the model with the included Control Rig asset. The package includes source assets to repurpose and modify in their own projects. www.unrealengine.com