Unreal Engine 4.27 is now available! This release offers something for everyone, from filmmakers and broadcasters, through visualisation specialists in architecture, automotive, and product design, to game developers and beyond. Here are just some of the major highlights.
Next-level in-camera VFX
With this release, in-camera VFX is no longer just something to aspire to. A slew of improvements to the efficiency, quality, and ease of use of the toolset means that this game-changing virtual production technique is now far more attainable. Many of these enhancements also have applications in other industries, especially broadcast and live events.
It’s now much easier to design your nDisplay setup for LED volumes or other multi-display rendering applications, thanks to a new 3D Config Editor. Plus, we’ve consolidated all nDisplay-related features and settings into a single nDisplay Root Actor for easy access. It’s also much easier to set up multiple cameras.
We've also added support for OpenColorIO in nDisplay, ensuring accurate colour calibration that connects content creation in Unreal Engine to what the real-world camera sees on the LED volume.
To enable nDisplay to scale efficiently, we've added multi-GPU support. This makes it possible to maximise resolution on wide shots by dedicating a GPU to the in-camera pixels, and to shoot with multiple cameras, each with its own uniquely tracked frustum.
A new drag-and-drop remote control web UI builder enables you to quickly build complex web widgets without any code, so that users without Unreal Engine experience can control the creative results generated by the engine from the convenience of a tablet or laptop.
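The web UI builder sits on top of Unreal Engine's Remote Control HTTP API, which accepts JSON requests against the editor's web server (port 30010 by default). As a rough sketch of the request shape a widget ultimately sends, the snippet below builds a function-call body; the object path and function name here are hypothetical placeholders, not values from any real project.

```python
import json

def build_remote_call(object_path: str, function_name: str, parameters: dict) -> str:
    """Serialise a Remote Control function-call request body
    (sent as a PUT to /remote/object/call on the editor's web server)."""
    return json.dumps({
        "objectPath": object_path,
        "functionName": function_name,
        "parameters": parameters,
    })

# Hypothetical actor path and call -- substitute objects from your own level.
body = build_remote_call(
    "/Game/Maps/Stage.Stage:PersistentLevel.LightCard_1",
    "SetActorHiddenInGame",
    {"bNewHidden": True},
)
print(body)
```

In practice the generated web widgets issue requests like this for you; the point is that anything exposed through Remote Control presets is reachable from any device that can speak HTTP.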
The Virtual Camera system introduced in Unreal Engine 4.26 has been significantly enhanced, with support for many more features including Multi-User Editing, a redesigned user experience, and an extensible core architecture. There’s also a new iOS app, Live Link Vcam, dedicated to virtual camera functionality. The system enables you to drive a Cine Camera inside Unreal Engine using a device such as an iPad.
Rounding out the major enhancements in this arena, new Level Snapshots enable you to easily save the state of a given scene and later restore any or all of its elements, making it easy to return to a previous setup for pickup shots, or during creative iterations. There's also new flexibility for producing correct motion blur for travelling shots, accounting for the physical camera's movement relative to the background.
Recently, Epic Games and filmmakers’ collective Bullitt assembled a team to test out all of these tools by making a short test piece to mimic a production workflow. That project is now available as a free sample for you to download and experiment with.
Significantly faster light baking
In-camera VFX is just one of many workflows across multiple industries that will benefit from the many enhancements to GPU Lightmass in this release, which include support for many more features, together with increased stability and reliability. The system uses the GPU, as opposed to the CPU, to progressively render pre-computed lightmaps, leveraging the latest ray tracing capabilities with DirectX 12 (DX12) and Microsoft’s DXR framework.
GPU Lightmass significantly reduces the time it takes to generate lighting data for scenes that require global illumination, soft shadows, and other complex lighting effects that are expensive to render in real time; in addition, since you can see the results progressively, it’s easy to make changes and start over without waiting for the final bake, making for a more interactive workflow.
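Because GPU Lightmass is built on DX12 and DXR, a project needs a few renderer settings in place before the plugin (enabled separately via the Plugins browser) can bake. A minimal sketch of the relevant DefaultEngine.ini entries is below; exact requirements may vary by project and hardware.

```ini
; DefaultEngine.ini -- illustrative settings GPU Lightmass relies on.
; Requires a ray-tracing-capable GPU.

[/Script/WindowsTargetPlatform.WindowsTargetSettings]
DefaultGraphicsRHI=DefaultGraphicsRHI_DX12

[/Script/Engine.RendererSettings]
r.RayTracing=True
r.SkinCache.CompileShaders=True
```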
For in-camera VFX, GPU Lightmass means that virtual set lighting can be modified far more quickly than previously, making productions more efficient and ensuring the creative flow is not interrupted. Film crews can now take a coffee break instead of breaking for the day.
Stunning final-pixel imagery made easy
If you’re looking to create gorgeous still images or movies, be it for architectural, automotive, or product design marketing deliverables, or for any other purpose, we think you’ll love the latest version of the Path Tracer, a DXR-accelerated, physically accurate progressive rendering mode in Unreal Engine that can be enabled without requiring any additional setup.
While it was previously suitable for ground-truth comparisons with real-time ray tracing, in this release a sweeping range of enhancements makes it viable for creating final-pixel imagery comparable to offline renderings, with features such as physically correct and compromise-free global illumination, physically correct refractions, feature-complete materials within reflections and refractions, and super-sampled anti-aliasing.
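Path Tracer quality is tuned through console variables, which can also be pinned in a project's config. The values below are illustrative starting points, not recommendations; like the rest of the ray tracing feature set, the Path Tracer needs a ray-tracing-capable GPU with r.RayTracing enabled.

```ini
; Illustrative Path Tracer quality settings (DefaultEngine.ini).
[SystemSettings]
; Samples accumulated per pixel before the image is considered converged.
r.PathTracing.SamplesPerPixel=2048
; Maximum number of light bounces traced per path.
r.PathTracing.MaxBounces=12
```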
While on the subject of creating imagery, whether with the Path Tracer or otherwise, you can now use Movie Render Queue to render from multiple cameras as a batch process, without having to go through complicated Sequencer setups. This makes it easy to repeatedly create a series of large stills from different viewpoints, as you work through variations or iterations.
Smaller, faster, better games
With RAD Game Tools becoming part of the Epic Games family, the Oodle Compression Suite and Bink Video codec are now built into Unreal Engine, putting some of the fastest and most popular compression and encoding tools in the industry into the hands of Unreal Engine developers for free.
Oodle Data Compression provides the fastest compression for game data – significantly faster than other codecs. It’s also the highest-ratio compressor, resulting in smaller file sizes and faster loading of packaged products.
Oodle Texture offers the fastest and highest-quality encoders for block-compressed BC1-BC7 textures, producing textures up to two to three times smaller than those from other encoders while retaining a high level of visual quality, saving memory and bandwidth.
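To see why block compression matters at all, it helps to run the arithmetic on the formats themselves: BC1 stores each 4x4 texel block in 8 bytes (0.5 bytes per texel) and BC7 in 16 bytes (1 byte per texel), versus 4 bytes per texel for uncompressed RGBA8. These figures are properties of the BC formats, not of Oodle; Oodle Texture's contribution is squeezing better visual quality out of the same fixed block budget.

```python
def texture_bytes(width: int, height: int, bytes_per_block: int) -> int:
    """Size of a block-compressed texture, 4x4 texels per block."""
    blocks_x = (width + 3) // 4
    blocks_y = (height + 3) // 4
    return blocks_x * blocks_y * bytes_per_block

# A 4096x4096 texture in three formats:
rgba8 = 4096 * 4096 * 4                # uncompressed RGBA8: 64 MiB
bc7 = texture_bytes(4096, 4096, 16)    # BC7: 16 MiB
bc1 = texture_bytes(4096, 4096, 8)     # BC1: 8 MiB
print(rgba8 // 2**20, bc7 // 2**20, bc1 // 2**20)  # 64 16 8
```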
Already used by Fortnite, Oodle Network Compression is a unique solution for real-time compression of network traffic, significantly reducing the minimum bandwidth required for multiplayer games.
Last but not least, Bink Video is the most popular video codec for video games. Designed for performance, it decodes up to 10 times faster than other codecs while using 8 to 16 times less memory, enabling you to drop the required data rate while maintaining visual quality. The tools run on all Unreal Engine-supported platforms.
Leveraging the cloud
Pixel Streaming is now production-ready, with a host of quality improvements and an upgraded version of WebRTC. This powerful technology enables Unreal Engine and applications built on it to run on a high-powered cloud virtual machine, delivering the full-quality experience to end users anywhere in a standard web browser on any device. In this release, we've also added support for Linux and the ability to run Pixel Streaming from a container environment.
This is just one example of the new support for containers on Windows and Linux, enabling Unreal Engine to act as a powerful self-contained foundational technology layer. Containers are packages of software that encompass all of the necessary elements to run in any environment, including the cloud.
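As a rough sketch of what a containerised build step might look like, the Dockerfile below cooks and packages a project on top of a development image. The image tag, engine path, user, and project name are all illustrative assumptions; the published images live on GitHub Container Registry and require linking your GitHub account to Epic's organisation.

```dockerfile
# Illustrative CI build step on an Unreal Engine development image.
# Tag, paths, and project name are placeholders -- adjust for your setup.
FROM ghcr.io/epicgames/unreal-engine:dev-4.27

# Copy the project in and run a headless cook-and-package pass.
COPY --chown=ue4:ue4 . /project
RUN /home/ue4/UnrealEngine/Engine/Build/BatchFiles/RunUAT.sh \
    BuildCookRun -project=/project/MyGame.uproject \
    -noP4 -platform=Linux -clientconfig=Shipping \
    -cook -build -stage -pak -archive -archivedirectory=/project/dist
```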
This support paves the way for new cloud-based development workflows and deployment strategies, such as CI/CD, AI/ML training, batch processing and rendering, and microservices. Unreal Engine containers can be used to enhance production pipelines, develop next-generation cloud applications, deploy enterprise solutions at unprecedented scale, and much more.