Industry Insights: Where is augmented reality in broadcast headed?

By NewscastStudio


Augmented reality, immersive mixed reality and virtual sets continue to see increased adoption, moving beyond special events like election night and the Olympics. 

These tools continue to gain realism and ease of use with new workflows and integrations, allowing broadcasters of any size to leverage them. 

In part one of our Industry Insights roundtable, we speak with leaders from the motion graphics production solutions industry along with key design firms about augmented reality’s future, workflows and remote production implications. 

Where are we headed with augmented reality (AR) workflows in broadcast?

“As an industry, the groundwork has been laid for AR productions but now is the time to make this transformative technology accessible to all sizes of productions and budgets. Therefore, we are likely to see AR workflows move to the cloud, alongside many other elements of broadcast,” said Gerhard Lang, CTO for Vizrt.

“In the short term we will continue to see increased adoption in live event and broadcast production as the tools become more accessible to a large user base. In the mid to long term we will see these tools enable new types of media experiences that leverage the online and interactive nature of AR workflows,” said Marcus Blom Brodersen, the CEO of Pixotope.

“There are two ways we see this go. For the general industry, AR will be used as often as any 2D graphics the studios employ today. This will be the standard. For those who are already in the AR and virtual studio world, we’re headed towards a future where AR and mixed reality are becoming increasingly sophisticated and used in more creative ways than ever before,” answered Onur Gulenc, a territory manager for Zero Density.


“From a workflow perspective, the long ideation and creation stages are getting shorter and shorter whilst the hardware to deliver is getting less expensive, therefore it is becoming far more commonplace than it was even six years ago,” said Paul Jamieson, a VP of creative for AE Live.

“For a long time, companies, including Silver Spoon, were trying to figure out what kind of content could use AR purposefully. With applications like Maya, Cinema 4D, and Houdini significantly improving their integration in-engine, broadcasters are now using their favorite tools as part of their graphics workflow. Instead of finishing them in a traditional render pipeline, they can start to use Unreal Engine and do the same job in real-time,” said Dan Pack, managing director at Silver Spoon.

“AR is now a given but expectations in terms of quality are also higher. Having engines like Unreal helps us produce stunning photorealistic effects but the key is to adapt Unreal to broadcast and ‘democratize’ the use of AR for broadcasters and media producers. As such, our focus has always been and will continue to be simplicity,” replied Boromy Ung, senior director of product management for Ross Video.

“The process for bringing AR on air is accelerating and becoming less cumbersome with the advent of cloud-based AR platforms. When segment producers and other non-designers are empowered to create AR graphics and presenters can control the graphics live, using a cell phone, it completely transforms the workflows,” said Yaron Zakai-Or, co-founder and CEO at Arti.

“Virtual and AR technology is now mature enough to represent a significant part of the content creation portfolio of most production companies, including broadcasters, due to the changes the digital age has driven in how content is produced and consumed. The integration of virtual production in broadcast automation is increasingly important, so that the production equipment that creates such content is no longer an island but becomes part of a larger, integrated and collaborative workflow,” said Miguel Churruca, marketing and communications director for Brainstorm.

“Augmented reality has begun to morph into many different types of broadcast integration. As the capacity of software-based render engines has increased, along with the hardware in the form of graphics cards and CPUs, we have seen broadcasters gain the ability to create set dressing without the need for physical builds to achieve realistic integrations. This is now commonplace on shows like SportsCenter when they simulate set monitoring with AR elements. This has led the way for new terms like mixed reality and extended reality to take hold in our industry,” said Nathan Marsh, creative and managing director of Girraphic.

What has changed in the past few years?

“In a word…Everything. Epic and the Unreal Engine have taken our industry by storm. As the talent pool of designers and technical artists increases in this software platform, we are seeing a shift in the photorealistic quality achievable in broadcasts. The team at Epic has been busy integrating and partnering with broadcasters and creative companies and continues to lead the industry in research and development of new products for film, television, broadcast, interactive and experiential displays,” said Marsh.

“Over the past few years, real-time engines have become an integral part of studios’ pipelines, and are no longer just an add-on or afterthought. We started as a VFX studio, relying on traditional pipelines, but now, using Unreal is our first option. It allows us to iterate quicker, deliver content quicker, and it allows us to be more collaborative because we can see things all at once – real-time pipelines are now a necessary component in the creative industries,” said Pack.

“AR has become mainstream and is present in nearly all major broadcast events. The use of AR in the 2016 US elections was a milestone with all key networks utilizing AR elements as a storytelling tool. With the growing demand for AR, the solutions became cheaper and easier to set up. Game engines like Unreal became a more mainstream part of major productions as well. Tracking systems became smaller and easier to set up too,” Lang noted.

“From a technology perspective, the quality and availability of game engine technology, GPUs, cloud infrastructure, LED technologies, video over IP and modern software development methodology have greatly increased the development and adoption of virtual production. Combined with the pandemic and heavy pressure on traditional broadcasters to reinvent their content, this has created a perfect storm for virtual production growth,” Brodersen said.

“Hardware advances like Nvidia’s DLSS and ray-tracing have made CG elements more realistic than ever. The hardware is catching up with the software: With the right setup, you can now have hyperreal graphics on live broadcasts in UHD, all without any dropped frames,” said Gulenc.

“I think we’re now at the stage where it has become far more commonplace, and not just from a budgetary or space perspective. Producers have embraced this tech and its almost limitless possibilities, and there’s a genuine desire now from production and creative teams to push the boundaries, whereas previously it may have been more the technology teams pushing this innovation,” Jamieson told us.

“The big thing that has happened in the past couple of years is the rise of virtual LED environments. Obviously, the big studios have been making extensive use of LED environments especially in the context of coronavirus as this allows them to reproduce a particular scenery without having to be physically on-site. But we see this now gaining a lot of traction with broadcasters as well, as this allows the talent to visualize the virtual environment they are supposed to be interacting with while still providing the same benefits as virtual sets particularly around production flexibility,” Ung said.

“The technology has been democratized—it’s easier to use, more affordable, and more flexible than ever. At the same time, audiences have raised their expectations for on-air graphics, since they’re used to seeing AR in their social apps, video games, etc. It’s really a perfect storm, or a convergence,” stated Zakai-Or.


“The pandemic has consolidated and accelerated the virtual technology revolution. Virtual and remote solutions are here to stay, so taking advantage of these will only help in satisfying an ever-growing demand for content. The traditional way of doing business in television has been seriously impacted by the pandemic, with production dropping all around the world, travel restrictions applied, and many other related issues. However, the situation also gave rise to opportunities for any kind of virtual production, from remote shooting to virtual events,” Churruca said.

“The industries of live events, trade shows, concerts, sport, and election broadcasts have all begun to blend together as the creative minds in our industry continue to push the envelope and shatter barriers. Couple that with the digital and NDI revolutions and we are looking at multiple content delivery platforms working in unison, and technology, software and hardware that are interchangeable. Onward to the metaverse we charge. Look no further than Facebook changing its company name to Meta to lend credence to that fast-approaching reality. Don’t get me started on the potential use cases for NFTs,” added Marsh.

How has remote production (largely driven by the pandemic) changed your thinking on AR or virtual?

“Remote access through the different solutions has become so efficient that the work experience doesn’t differ much whether you are on-site or logged in remotely. Assisting customers during setup remotely is also easier. For monitoring purposes, the on-prem signal can be converted to an IP format which can be transmitted with low latency and exceptional quality. Using NDI 5 is great for all types of monitoring, with the quality allowing productions to even do quality control assessments on the keying. We’ve also been able to utilize AR technology in remote productions to teleport interview partners into studios,” said Lang.

“We’re seeing a clear trend to centralize production and the need to rationalize the operations of the production centers, leading to adoption of video over IP and virtualized infrastructure. Virtual production, being virtualized by nature, lends itself especially well to being deployed in this type of environment, and offers the possibility to be fully automated and connected to remote camera operations,” explained Brodersen.

“Broadcasters were already starting to use AR and virtual graphics before the pandemic: the coronavirus just sped everything up. The pandemic has democratized AR and virtual graphics, making them more accessible throughout the industry. Broadcasters can now even control all their virtual studio graphics directly from an iPhone,” said Gulenc.

“We were fortunate in that we had been working in a remote production environment in other regions for some time ahead of the pandemic. This served us well as our applications were optimized to work across a remote workflow. For example, we regularly reproduce multiple in-game AR feeds downstream by taking time-accurate camera data back to the remote hub,” Jamieson said.

“REMI workflows, referring to a remote integration model, allow our machines to stay on-site while our operators are sitting back in New York. AR teams are highly specialized, so instead of sending four groups made up of an operator, technician, and Unreal Engine artist to multiple locations in the country, it’s possible to have everybody in a central location. There’s next to no latency because the on-site machines are doing the rendering – we can actually be more efficient because we don’t need to travel, completing projects back-to-back,” Pack noted.

“AR is just one element of the entire production chain so to a certain extent, working in a remote production context is no different than doing the same with the rest of the production chain. Remote control and monitoring capabilities are supported as they are with the rest of the Ross ecosystem,” said Ung.

“The impressive quick pivot to remote production really confirmed what we had been seeing before the pandemic—namely that broadcasters want and need the flexibility to produce high-quality AR both inside and outside the studio. They no longer want to be tethered to a server room or solely reliant on designers with specific AR expertise. They’re used to more nimble workflows now with cloud production, and they aren’t looking back,” Zakai-Or said.

“I wouldn’t say that the pandemic itself has changed the perspective on virtual use cases for us as a company as much as it has for production in general across a multitude of industries,” said Marsh. “Virtual production tools and volumetric LED stages have gone absolutely crazy, and everyone is racing to get ahead of the curve, from Lux Machina to MSG and its construction of The Sphere, among many other companies… No longer do producers and directors have to deal with outside elements to create beautiful cinematic shots, and the immediate gratification of virtual production gives way to a what-you-see-is-what-you-get realm for high-budget shoots. When you can look at the composited vision immediately on set and in-camera, and manipulate the environment, lighting, actors and set elements in a controlled manner without having to restage a physical shoot, it becomes clear that this is the only way forward.”

“Our users have been creating advanced AR and virtual content for decades, and the global increase in the quantity and quality of AR and virtual content produced by content creators of any kind confirms that our vision was correct and encourages our company to continue delivering the most advanced technologies to comply with any client’s requirements, no matter how complex they may be,” Churruca responded.

“As far as remote production and the pandemic in relation to our company, what it did for us was give us a locked-down sandbox for a season of sport in the ISL, where we were allowed to hone our implementation without the logistics of setup and teardown after each event. At the ISL event in Budapest, we were quarantined between a hotel and the venue for nine weeks, with days off in between events and a fully rigged event space. When we coupled that with the boredom of being locked down, what we saw was an acceleration of ideas and technical integrations surrounding the use of augmented reality as a bridge between races and in the broadcast in general. This was far and away the most rapidly advancing broadcast integration we have ever had the chance of working on as a company,” added Marsh.

Participants

Gerhard Lang, Vizrt
Marcus Blom Brodersen, Pixotope
Onur Gulenc, Zero Density
Paul Jamieson, AE Live
Dan Pack, Silver Spoon
Boromy Ung, Ross Video
Yaron Zakai-Or, Arti
Miguel Churruca, Brainstorm
Nathan Marsh, Girraphic
