Industry Insights: Extended reality, virtual production and the rise of Unreal Engine

By NewscastStudio

Augmented reality, immersive mixed reality and virtual sets continue to see increased adoption, moving beyond special events like election night and the Olympics. 

In part two of our Industry Insights roundtable, we speak with leaders from the motion graphics production solutions industry along with key design firms about new advancements in technology and the rise of immersive mixed reality.

We also look at the Unreal Engine and its dominance in rendering most of the industry’s virtual worlds. 

What advancements are we seeing in the world of augmented reality, virtual sets and immersive mixed reality?

“When we begin to mix chromakeying, augmented graphics and LED screens to create depth (extended reality), and couple that with interactivity, both for the audience via digital platforms and for the in-studio hosts of a broadcast who control the content, we are witnessing the gamification of reality and of the way we consume our media,” said Nathan Marsh, creative and managing director of Girraphic.

“While game engines such as Unreal Engine brought a significant leap forward in terms of output quality, broadcast graphics workflows also have specific requirements, like database connections, statistics, tickers, social media or lower thirds, a variety of elements that are alien to the game engine framework but essential not only for broadcast operation but also for creating attractive AR content,” said Miguel Churruca, marketing and communications director for Brainstorm.

“We are seeing tons of advancements in extended reality. At Ross Video, we define this as LED production where the content on the LED is rendered in a perspective view with camera tracking data, while the surrounding non-LED display area is expanded seamlessly with augmented reality. In other words, you end up with a hybrid environment where you get all of the benefits of LED environments (e.g., having the talent ‘see’ the environment) while retaining the flexibility that comes with virtual objects and keeping the physical footprint reasonable,” answered Boromy Ung, senior director of product management for Ross Video.


“The goal is to create the same quality as a high-end, traditional VFX pipeline but in a true real-time setting. In the past year, we’ve generated real-time particle simulation, real-time crowd simulation, and real-time VFX in fluid simulations. With AR, we’re seeing in-camera results in real-time that would normally be very tedious and time-intensive. This saves an incredible amount of time and expense while maintaining the same level of production value,” said Dan Pack, managing director at Silver Spoon.

“The next frontier will be to expand outside of the studio confines regularly for remote on-site production. Setup, hardware and technical limitations have in many cases been a big barrier, but as hardware gets cheaper and more portable, and setup becomes quicker and more reliable, we’ll see much more usage in these spaces,” said Paul Jamieson, a VP of creative for AE Live.

“Traditionally, virtual production has often been too difficult and risky for most productions, but modern virtual production systems provide greatly improved reliability and workflows which lowers the threshold of entry. We will continue to see great advancements in ease of use and accessibility that will continue to allow more content creators to use these tools,” said Marcus Blom Brodersen, the CEO of Pixotope.

“What we are seeing with our customers is that they are attempting to blur the lines between real and computer-generated content and footage. Photorealism at a robust, high quality is expected and the amount of work going into designs has increased tenfold. Therefore, the role designers are playing is gaining in importance with the realization that while photorealism is achievable, to make a convincing AR show you need to have the right talent,” Gerhard Lang, CTO for Vizrt, pointed out.

What about immersive mixed reality, what advancements are we seeing on that front specifically?

“There have been some incredible advancements in computing render power—what we’re able to do now with NVIDIA’s latest graphics cards is so far beyond even the year before, achieving very realistic shading and lighting. There’s been a mass convergence of tech with multiple applications being used as one. For example, it’s possible to combine LED with projection mapping, camera tracking, and motion capture in real-time, allowing us to achieve visuals that were not possible a year ago,” Pack said.

“The previous usage of chroma keying for inserting characters into the scene is now complemented with XR technology and videowalls, which have obvious advantages such as not needing chroma keying but also have their own drawbacks, and some issues that need to be taken care of. However, in essence, using a projected background or a keyed character is essentially the same, just using a different approach to achieve a given result, and it is up to the user to decide which can serve better for their purposes or requirements,” Churruca said.

“Mixed reality is one of the applications that falls under XR. So, like I said, we see XR gaining a lot of traction in the media and entertainment space for the reasons we mentioned earlier. There have been a lot of advances in synchronizing multiple rendering engines and managing delays through the system. As an example, in the XR world, because of the inherent delays introduced by LED wall processors, the ‘virtual’ cameras may take longer to switch than the real ones, which may cause some synchronization issues. These are key technical challenges that we have managed to overcome and that will help XR become more widely adopted,” said Ung.

“We as a company are exploring this space more on the event side and less on the broadcast side. It’s hard to immerse an audience physically when they are staring at a screen. We have undertaken several new event-based projects that are quite fresh in their perspective. We are applying our experiences in broadcast to the live event space, projection mapping, LED Screens, interactivity, and audience engagement to control surrounding content,” said Marsh.

“In the short to mid term, we will see new tools emerge that leverage computer vision and machine learning technology for making the integration of video and graphics easier and less reliant upon on-site hardware equipment. Camera and object/person tracking, foreground/background segmentation and object recognition all will see advancements that will benefit all types of virtual production,” said Blom Brodersen.

How many of those advancements are driven by Unreal Engine?

“There’s no questioning the importance of the Unreal Engine, but it’s not the only driving force behind these advancements—it’s mostly down to hunger for innovation. The conditions brought about by the pandemic certainly played a huge role, but it’s ultimately driven by people’s never-ending search to create something unique and innovative. Along with advancements in real-time engines, many other technologies have all caught up at a similar time. For instance, motion capture and VR are really coming into their own now, despite being available for a long period of time prior,” said Pack.

“Unreal Engine has set such a high standard for real-time rendering quality that it’s made other technologies obsolete. It’s become a prerequisite for broadcasters creating virtual studio and AR productions that require photoreal visuals,” said Onur Gulenc, a territory manager for Zero Density.

“It’s Unreal that helped raise awareness that photorealism is possible. For many TV stations, this was a revelation, and it motivated them to push for higher quality content. But as I said before, in the end, it is the creative team that makes or breaks a show. Unreal also brings physics to the table for free. The ability to manipulate physics in a virtual setting is particularly useful in making realistic organic worlds as the interaction between objects, wind and autonomous movement are essential for scenes like this,” Lang replied.

“From our unique perspective, all of them. We have founded a new company, Girraphic X Lab, that is solely dedicated to Unreal project development. It is headed up by Ray Kristiano, who has over 15 years of experience in delivering high-end immersive and interactive event experiences. Take a look at the Dubai Expo Dome this year and you will see the depth of his creative prowess on display. The expansive nature of the Unreal platform and its capabilities are truly overwhelming. Experienced users are in short supply across the industry as employers scramble for talent capable of utilizing the software efficiently. We have partnered with Epic to try to further their cause and also develop internal talent to keep up with the demand,” Marsh disclosed.

“Unreal Engine has provided a quantum leap in hyper-realism for real-time broadcast and film content. Being able to provide such rendering quality has been a game-changer for broadcasters and other content providers, but we should not forget it is a game engine that needs to be complemented with other technologies to comply with content creation requirements. Elements such as broadcast hardware connectivity, camera tracking, motion graphics and text creation, and many others need to be added on top of the Unreal Engine to provide the content broadcast requires. All these elements are alien to the game engine, meaning it is not nearly as efficient in creating them,” Churruca said.


“Unreal Engine is critical for these enhancements, as it drives the high-end graphics behind them, which enables the platform-agnostic AR creation,” said Yaron Zakai-Or of Arti.

“Epic Games have technologies to help the industry embrace XR. One of them is called nDisplay which eases the process of deploying and launching multiple instances of a project across multiple rendering engines, with each of them rendering to one or multiple LED displays,” Ung noted.

“In Unreal Engine 5, Epic Games provides an unrivaled graphical quality and performance, combined with an expanding set of tools that empowers graphic artists. We have every reason to believe they will continue to push the boundaries of what is possible to do with real time graphics in the years to come,” Blom Brodersen answered.

Is the broadcast industry too reliant on Unreal Engine?

“Actually, we feel the broadcast industry isn’t relying on Unreal Engine enough! No other technology is as open or has as large a community as the Unreal Engine user base. Broadcasters should take the opportunity to tap into this vast talent pool to create more stunning AR visuals,” said Gulenc. 

“Unreal is rapidly breaking ground in the broadcast industry, but it is not yet positioned to unseat the major players like Vizrt, Ross, Chyron, Brainstorm, etc. However, you can see industry-wide adoption as all of these established companies race to partner with Epic on integrations; the render engine and creative platform of Unreal is a force to be reckoned with. The company has expanded so rapidly, hiring so many of my colleagues in broadcast, and is working to implement processes and solutions that cater to broadcast clients, that it is only a matter of time before it begins to offer more comprehensive broadcast solutions and products,” said Marsh.

“I would not say the ‘broadcast industry’ in general, although it is true that some players that use only UE for their rendering are, of course, fully dependent on it for their operation. We have seen a growing number of companies providing solutions fully based on Unreal Engine, which provide great results but are also restricted to whatever Unreal can provide, and we have seen that, despite the great advantages this render engine offers, it is not a complete solution for all the requirements of broadcast content creation. However, other manufacturers support a combined approach, using Unreal alongside other rendering technologies such as PBR or even alternative render engines,” elaborated Churruca.

“That’s a bit of a loaded question. Unreal Engine has become ubiquitous because of its amazing rendering capabilities, and what’s more, because Epic Games has proven itself a great partner to and supporter of the creator community. Unless a challenger emerges that can exponentially leapfrog their technical capabilities, I don’t see the industry moving away from Unreal,” said Zakai-Or.

“We don’t think the industry is too reliant on Unreal. While we would not be there in terms of rendering capabilities without Unreal, there are other engines in the market that would have taken this space if it wasn’t Unreal. Plus most of the advancements that Unreal brings to the table do help the industry create more compelling content so while there is some form of ‘reliance’ indeed, it is certainly not a bad thing,” Ung said.

“I think it’s really important that people realize that the use of real-time is intentional. It’s not about gimmicks and is therefore not reliant entirely on Unreal alone. The greatest VFX examples are those that are not overtly seen and help enhance the story – AR and XR are meant to augment all of that, but like good VFX, it should still be invisible. So it’s less a case of being too reliant, and more an example of effective storytelling,” asserted Pack.

“As in any industry, it’s always good to have a selection of suppliers and products available, but that doesn’t really exist in this area right now, and Unreal does tick all the boxes; there is the adage, if it ain’t broke, don’t fix it. I will say, this relatively new technology is still in its infancy, and we’ve not reached mass usage, nor do I think we have a wide, diverse pool of Unreal creative design resources. As an industry of broadcasters and suppliers, it is up to us to work together to build this resource and work closely to maximize its full potential,” Jamieson said.

“We are firm believers in using the right tools for the job, and currently Unreal Engine provides a great platform for building high end real time 3D graphics,” agreed Blom Brodersen.

“Unreal is an integral part of our ecosystem and we recognize its importance for the broadcast industry. However, at least with our customers, we don’t see a strict dependency on Unreal but rather an addition to our offerings without creating a long-lasting reliance,” said Lang. “Unreal is a render engine. Modeling usually takes place in programs like Maya, Cinema 4D or Blender. The models can be imported into whichever real-time photorealistic renderer the customer chooses. For example, our control applications can talk to either Unreal or Viz Engine and are aware of the capabilities and limitations of either. When it comes to camera tracking, we are not only tracking-system agnostic but also render-engine agnostic.”

Participants

Nathan Marsh, Girraphic
Miguel Churruca, Brainstorm
Yaron Zakai-Or, Arti
Boromy Ung, Ross Video
Dan Pack, Silver Spoon
Paul Jamieson, AE Live
Onur Gulenc, Zero Density
Marcus Blom Brodersen, Pixotope
Gerhard Lang, Vizrt