
Unreal Engine 4.27

On August 19, 2021, Epic Games released Unreal Engine 4.27. As the last major version before the full release of Unreal Engine 5, which is already available in Early Access, this release brings a number of notable features. The focus is clearly on expanding and improving existing functionality and tools across the most diverse areas of use, but some very interesting innovations come along with this release as well.

Quality of Life

To start with, there are several quality-of-life improvements, some quite visible, others not directly noticeable. The most prominent is a better overview of the software’s processes and resource usage.
Until now, when opening a project you received a small percentage display and very little information about what was actually happening behind the scenes. With larger and more complex projects, you could therefore sit for several minutes in front of a loading screen that told you very little about the actual loading process.
In version 4.27, we finally get more detailed and clearer information about what is happening.

A nice little indicator at the top right of the screen also provides information about the status of the Derived Data Cache of the current project.

Virtual Production

Of course, there are also some innovations in the area of virtual production. Since this area is increasingly becoming a showcase discipline for the Unreal Engine, the increased focus on advanced functionality here is not surprising.
For multi-screen applications, the visual nDisplay configurator is now available. There, you can arrange displays in 3D space in a “what-you-see-is-what-you-get” manner and match their relative position, perspective, and tilt to the actual physical setup.

The functionality of “mGPU” has been expanded: support for multiple cameras and more efficient hardware utilization have been added. With “nDisplay Overscan”, post-processing effects like bloom, ambient occlusion, and motion blur can now be used across multiple render nodes. Where there was previously a visible seam, the different image sources can now be stitched together much more cleanly.

A first experimental implementation of nDisplay support on Linux devices and support for “OpenColorIO” for more efficient color calibration have also been added in 4.27.
Another major update has been given to the so-called “Remote Control Workflows”. Using a web browser on any device in the same network, simple controls in the form of buttons, sliders, color wheels, etc. can be used to make live changes to the current Unreal Engine scene. A “UI Builder” allows you to configure the control surface on the web page generated from the scene: you add and arrange the controls exposed for it, building an effective user interface tailored to the current work situation. Furthermore, live changes to the scene can also be made via established protocols such as DMX, OSC, and MIDI, and a C++ API has been prepared so that controls can also be driven from external desktop applications.
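As a sketch of what this looks like from the outside: with the Remote Control API plugin enabled and its web server running on the default port 30010, an external script can set a property on a live object via a simple HTTP request. The object path and property below are hypothetical placeholders for elements of your own scene.

```python
# Minimal sketch of driving the Remote Control HTTP API from an external script.
# Assumes the Remote Control API plugin is enabled and listening on port 30010.
import requests

payload = {
    # Path of the component to modify (hypothetical example for your own scene).
    "objectPath": "/Game/Maps/Stage.Stage:PersistentLevel.PointLight_1.LightComponent",
    "propertyName": "Intensity",
    "propertyValue": {"Intensity": 5000.0},
    "access": "WRITE_ACCESS",
}

# PUT /remote/object/property sets a property on a live object in the scene.
response = requests.put("http://localhost:30010/remote/object/property", json=payload)
response.raise_for_status()
print("Property updated:", response.status_code)
```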
The new “Level Snapshots” function (currently still in beta) makes it possible to save the state of a scene with all the objects in it and their settings and restore it at will. This allows for a freer, more efficient, and faster iterative work process when setting up a scene for lighting, object placement, and more.
The virtual camera system has also received further enhancements and is now ready for production. Simultaneous support for multiple users, connectivity to the internal compositing system “Composure”, and the use of custom UMG controls are just some of the additional features users can look forward to.
The “Virtual Camera iOS App” has been completely redesigned and is now more focused on virtual camera work than the standard “Unreal Remote App”.

The use of the popular “Universal Scene Description” (USD) file format has been expanded, with a strong focus on additional export options. Since USD is one of the most popular and widely used file formats in the media and entertainment industry, several new functionalities now make working with it considerably easier. Exporting levels and sublevels, as well as the objects within them, as separate, mutually referencing USD files, and “baking” textures and materials prior to export are just two of several improvements.
Working with Alembic (file extension .abc) files has also been improved, and better support for the Alembic Cache has been added. The format has become increasingly common, especially for hair and fur simulations.

Datasmith

Besides some improvements for already existing export plugins for programs like Archicad, Sketchup, Rhino, Revit, 3DS Max and Navisworks, an experimental export plugin for Solidworks is also available for the first time.
The “Visual Dataprep Workflow” for procedural preparation of Datasmith imports in a module-based environment has also been improved to include new filter options, operators and component support.
The biggest and most far-reaching change in the Datasmith area is the ability to import Datasmith files at runtime. Thus, the updated collaboration template already offers the option to import Datasmith files during a multi-user session and view, discuss and edit them together. This enables completely new usage scenarios for Unreal Engine applications that do not require a project to be recustomized and recompiled for each new Datasmith object.
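The runtime import itself is exposed through the new DatasmithRuntime plugin (via Blueprint or C++). For comparison, the established editor-side scripted import looks roughly like the minimal Python sketch below; the file and content paths are hypothetical examples.

```python
# Minimal sketch of importing a Datasmith file via the editor's Python API.
import unreal

# Parse the .udatasmith file into a temporary scene element.
scene = unreal.DatasmithSceneElement.construct_datasmith_scene_from_file(
    "C:/Exports/Building.udatasmith")  # hypothetical file path
if scene is None:
    raise RuntimeError("Could not open the Datasmith file.")

# Import the scene's contents into the project under a content folder.
scene.import_scene("/Game/Imported/Building")  # hypothetical destination

# Release the temporary scene data once the import is done.
scene.destroy_scene()
```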

A new camera calibration tool has been introduced to better transfer lens distortions from real cameras to the Unreal Engine’s Cinematic Cameras.

In the area of Live Link, the Unreal Engine’s real-time motion linking protocol, there have also been some changes. The “FreeD” data protocol is now supported in beta status; it is mainly used for tracking pan-tilt-zoom (PTZ) cameras.
In addition, the new VRPN plugin, also in beta status, enables better Live Link connectivity for virtual reality and virtual production.
The Live Link Face app for iOS (for iOS devices with Face ID) has enabled an intuitive face-tracking workflow since its release. Epic Games’ “MetaHumans”, for example, are prepared in such a way that you simply connect your iOS device and can then control the facial expressions of the 3D character with your own facial movements.
The new version of the app adds a calibration mode that adapts the transmitted data to the individual person and their facial features. The app has also been adapted for better usability on iPads, another welcome addition.

Sequencer

The Sequencer tool now supports frame-accurate synchronization of videos. Previously, this was only possible with image sequences; now video files are also kept in sync with the Sequencer’s timeline.
The “Movie Render Queue” is now directly accessible via a button from the Sequencer and now also supports overscan and 32-bit color depth for all renders. This is an incredibly important new feature for film and media creators who want to reuse image and video material exported from the Unreal Engine in common video editing or image composition programs.
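For studios that want to automate such exports, the Movie Render Queue also exposes a Python scripting API in the editor. The following is a minimal sketch that queues a sequence and renders it to an EXR image sequence; the sequence, map, and output paths are hypothetical placeholders.

```python
# Minimal sketch of queuing and rendering a sequence with the Movie Render Queue.
import unreal

# Get the editor-side queue subsystem and its job queue.
subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
queue = subsystem.get_queue()

# Create a render job pointing at a level sequence and a map (hypothetical paths).
job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
job.sequence = unreal.SoftObjectPath("/Game/Cinematics/MyShot.MyShot")
job.map = unreal.SoftObjectPath("/Game/Maps/MyMap.MyMap")

# Configure a deferred render pass, EXR sequence output, and an output folder.
config = job.get_configuration()
config.find_or_add_setting_by_class(unreal.MoviePipelineDeferredPassBase)
config.find_or_add_setting_by_class(unreal.MoviePipelineImageSequenceOutput_EXR)
output = config.find_or_add_setting_by_class(unreal.MoviePipelineOutputSetting)
output.output_directory = unreal.DirectoryPath("C:/Renders")  # hypothetical path

# Render the queue in a Play-In-Editor session.
subsystem.render_queue_with_executor(unreal.MoviePipelinePIEExecutor)
```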
The Sequencer’s user interface has been further revised and improved with user-friendliness in mind.
A new type of track has also been added to the Sequencer, allowing events to be created in conjunction with the Unreal Engine’s Gameplay Ability System.

Pixel Streaming

The “Pixel Streaming” feature released in version 4.21 of the Unreal Engine is now ready for production with the current 4.27. “Pixel Streaming” allows interactive content of an Unreal Engine application, which is computed on a powerful desktop PC for example, to be called up via web browser on any end device (smartphones, tablets, weaker notebooks and PCs). The advantage is that the end device does not have to meet the high performance requirements of demanding applications and can still interact with the content as if it were running locally. A kind of interactive video stream, so to speak. With “Pixel Streaming” ready for production, a variety of novel use cases for Unreal Engine projects can be expected. In addition, support for Linux server instances and the AMD encoder (AMD Advanced Media Framework), as well as basic stability and quality improvements through the use of newer WebRTC technology, have been implemented. This also added the ability to tap into users’ microphones in the browser and use them in the Unreal Engine application.
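To give a rough idea of the setup: in 4.27, a packaged application is pointed at a running signalling server via launch flags. The sketch below starts such a build from Python; the executable path is a hypothetical example, and the flags follow the 4.27 Pixel Streaming documentation (the signalling server listens on port 8888 by default).

```python
# Minimal sketch of launching a packaged 4.27 build for Pixel Streaming,
# assuming the Pixel Streaming plugin is enabled and a signalling server
# is already running locally.
import subprocess

subprocess.run([
    "C:/Builds/MyProject/WindowsNoEditor/MyProject.exe",  # hypothetical path
    "-PixelStreamingIP=localhost",   # host of the signalling server
    "-PixelStreamingPort=8888",      # streamer port of the signalling server
    "-RenderOffscreen",              # no local window; stream only
])
```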

Rendering

There have also been some changes in the area of rendering. “GPU Lightmass”, which precomputes light and shadow information on the graphics card instead of the processor, has received some additional functions and now calculates far more accurately and realistically and, depending on the application, even faster. Compared to the “old” approach of computing everything on the processor, which usually took quite a long time due to the limited number of cores and threads, “GPU Lightmass” allows a much more comfortable, iterative, and faster workflow.
The “Path Tracer” (currently still in beta) takes a completely different approach from the other functions of the Unreal Engine, which are mainly designed for real-time rendering, and works similarly to a classic offline render pipeline. With ray-tracing-capable NVIDIA graphics cards and DirectX 12, it can generate images with fully comprehensive, physically correct light and shadow calculation, correct reflections, and refractions. Sample counts, the number of light bounces, and a built-in denoiser can be controlled via the post-process volume to achieve the highest degree of realism. At the expense of real-time performance, of course.
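For those who prefer to script these settings rather than tweak a post-process volume by hand, the same parameters can also be driven through console variables from editor Python. A minimal sketch, assuming the r.PathTracing.* console variable names below (verify them in your build’s console):

```python
# Minimal sketch: setting Path Tracer quality via console variables.
# The cvar names are assumptions; confirm them in your engine version.
import unreal

for cmd in (
    "r.PathTracing.SamplesPerPixel 2048",  # samples per pixel (assumed cvar)
    "r.PathTracing.MaxBounces 8",          # light bounce count (assumed cvar)
):
    # Executes the console command in the current editor world context.
    unreal.SystemLibrary.execute_console_command(None, cmd)
```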
Following the introduction of the “MetaHuman Creator” alongside version 4.26, version 4.27 brings many improvements to hair and fur rendering to make them even more realistic.

VR / AR / XR

OpenXR is finally ready for production in version 4.27 and, as a comprehensive standard, makes it simpler to develop VR, AR, and XR applications for a wide range of platforms. In addition, there is a completely revised VR template project that is built on OpenXR and optimized for it.
The augmented reality template has also been reworked and now brings more features to build on. So you can no longer just place objects, but also move, rotate and scale them. A basic user interface and the possibility to save snapshots are also included. The template automatically works with both popular AR platforms ARCore (Android) and ARKit (iOS).

Niagara

“Niagara”, the Unreal Engine’s procedural, module- and node-based tool for particle simulations, has also been given a slew of new functions.
Modules, functions and dynamic inputs can now be versioned. This allows you to save settings and changes to modules, for example, and then try out several versions in an iterative process and use the best one.

When creating a new “Niagara” system, additional examples of particle behavior are now available. These prefabricated particle systems can be used both to explore how the tool works and as a starting point for building your own complex systems.
The new “Niagara Debugger” gives a better and clearer overview of performance and simulation status of particle systems.
Particle systems that use 3D objects as particles can now draw on arrays of meshes. A single particle system can thus use several different 3D objects as particles and display them either randomly or in order.

Conclusion

The Unreal Engine is constantly evolving in so many different directions that you can hardly call it just a “game engine” anymore. Many other application scenarios are possible and are obviously supported and promoted by Epic Games. Developer-friendly features in particular, like the clearer display of which processes are running in the background when loading a project, show that for all this functionality, the focus on user-friendliness is not being lost.
For a comprehensive overview of all new features, we can only recommend the release notes of the Unreal Engine:

https://docs.unrealengine.com/4.27/en-US/WhatsNew/Builds/ReleaseNotes/4_27/
