Industry Insights: Virtual production professionals talk challenges, advancements
Subscribe to NewscastStudio for the latest news, project case studies and product announcements in broadcast technology, creative design and engineering delivered to your inbox.
As broadcasters increasingly turn to augmented reality (AR) and virtual production, integrating these technologies presents significant challenges.
This roundtable discussion delves into the technical intricacies broadcast professionals face, examining the hurdles and solutions in implementing AR and virtual production.
From latency and camera tracking to the integration of new workflows and skillsets, our experts provide a comprehensive look at the evolving landscape of broadcast technology. Join us for part two of this Industry Insights series, where we explore the practical realities and future potential of AR and virtual production in broadcasting.
Key takeaways from the Industry Insights roundtable
- Latency and Synchronization: Effective implementation of AR and virtual production requires precise synchronization of all elements to avoid breaking the illusion for the audience.
- Camera Tracking: Accurate camera tracking is crucial, especially in multi-camera environments, requiring extensive knowledge and alignment of tracking systems.
- Technological Integration: AR and virtual production necessitate the integration of advanced technologies like camera tracking and render engines, along with a shift in workflow and mindset.
- Skillset Evolution: New roles and technical skills are essential, including virtual studio operators and AR graphics operators, necessitating ongoing training and adaptation.
- Accessibility for Smaller Broadcasters: Solutions like cloud services and generative AI can help smaller broadcasters leverage these technologies cost-effectively, promoting wider adoption and innovation.
What are the technical challenges in implementing augmented reality and virtual production in broadcasting?
Adam Callaway, global lead of virtual production and broadcast, Brompton Technology: Latency and genlock are both key challenges when implementing virtual production and AR elements within a broadcast environment. All elements of the production must tie together cleanly. If they don’t, then the illusion will be broken — the audience will be focused on perceived technical issues, rather than the production itself.
Chris Izatt, director of innovation and virtual, AE Live: Implementing highly accurate camera tracking is imperative to any production that utilizes virtual graphics, and each use-case scenario presents its own unique challenges. Real-time render engines are becoming ever more complex and therefore require extensive knowledge to ensure they are configured according to the specifications that their function requires. In a multi-camera environment, combining various camera tracking systems and ensuring they are accurately aligned can prove challenging.
Robert DeFranco, CEO, Sequin AR: AR and virtual production require the integration of new technologies such as camera tracking and render engines and new workflows into existing environments. This also requires a mindset change, but these challenges are not insurmountable and many companies can help ease that transition. I suggest having a specific reason or project that requires VP. This will motivate the team to solve those problems and have early success.
How do these technologies impact the roles and skillsets required within broadcast production teams?
Adam Callaway: Virtual production and AR both have highly technical workflows. New skills and new teams need to be formed within production to oversee the deployment and operation of the Unreal scenes, LED, tracking, and much more. These skills are in short supply, so training in all of these areas is becoming increasingly important to allow the industry to keep up with demand.
Martin Klampferer, product owner and R&D manager, Vizrt: Broadcast production teams need to have the technical skills to set up and calibrate the tracking and cameras used; they are confronted with traditional SDI installations or modern streaming infrastructures. Technical designers are often required to create the graphics used in real time, while technical directors are required to combine the different solutions into a compelling story. Programming or scripting skills can be useful for interactivity or specific data-handling applications.
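Klampferer's point about scripting skills for data handling can be illustrated with a minimal sketch: a small script that reshapes a live data feed into the flat fields a real-time graphics template typically expects. The feed structure and field names here are hypothetical, not any vendor's API.

```python
# Minimal sketch: flatten a (hypothetical) live sports feed into the
# key/value fields a real-time graphics template would consume.
# Feed structure and field names are illustrative only.

def to_template_fields(feed: dict) -> dict:
    """Flatten a nested score feed into template-ready strings."""
    home, away = feed["teams"]["home"], feed["teams"]["away"]
    return {
        "home_name": home["name"].upper(),
        "away_name": away["name"].upper(),
        "score": f'{home["score"]}-{away["score"]}',
        "clock": f'{feed["clock"]["minute"]:02d}:{feed["clock"]["second"]:02d}',
    }

feed = {
    "teams": {
        "home": {"name": "Lions", "score": 2},
        "away": {"name": "Bears", "score": 1},
    },
    "clock": {"minute": 7, "second": 5},
}

print(to_template_fields(feed))
```

In practice this kind of glue code sits between a data provider and the render engine's exposed template parameters, which is exactly the interactivity and data-handling work Klampferer describes.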
Chris Izatt: New roles are required, such as the virtual studio operator, AR graphics operator, and virtual technician. Camera tracking systems must be maintained and troubleshot as required. Changes to lighting and camera settings will affect the virtual compositing pipeline and therefore must be well understood to avoid degrading the final output.
Xuan Seifert, VP of digital content, Sequin AR: These technologies have brought about a new convergence between traditional broadcast experience and real-time interactive development. They offer many new opportunities for technical artists from different backgrounds to cross into virtual production and broadcast territory. New and dynamic ways to visualize data can be driven by the same techniques used for video game development.
Marcus Brodersen, CEO, Pixotope: Right now there is a talent gap, with the demand for these skills being high but not enough trained professionals to meet that demand. Part of the problem is now being solved by the products and technology becoming more mature and mainstream, but it is also important to put the necessary structures in place for the next generation of professionals (like the Pixotope Education Program). Additionally, by developing tools that integrate seamlessly into existing workflows, we can encourage adoption and foster collaboration between traditional broadcasting roles and new tech-focused positions, leading to a more interdisciplinary and innovative production environment.
What are the most significant advancements in graphics technology in recent years?
Martin Klampferer: Achievable resolutions have increased, and productions in UHD HDR have become a reality. Cloud-based productions have become a popular option, which allows remote workflows and scalable productions while reducing setup time. XR solutions are used to extend video walls virtually and, in combination with AR graphics, immerse the viewer.
Ofir Benovici, CEO, Zero Density: Generative AI is getting ready to make an impact on broadcasting, as it is in almost every industry, by facilitating virtual set asset creation, one of the most costly areas of production. Generative AI will significantly decrease the investment necessary for virtual production, making it more available to creative minds.
Patrick Twomey, director of graphics product management, Ross Video: I think the explosive growth in processing power of GPUs is the most significant advancement — it has given us incredible rendering capacity. These improvements have enabled us to develop real-time AI-driven animations and intricate 3D models that respond instantly to data or user interactions. Graphics that used to take days for large computers to process can be created and rendered in near real-time, enabling designers to work more closely to their vision and stay true to the brand with a fraction of the work, time, and expertise required in the past.
How can smaller broadcasters with limited budgets leverage these technologies effectively?
Martin Klampferer: Cloud services and flexible licensing models can reduce the need for expensive hardware and limit software investments for the required time. With more powerful machines, it is also possible to run multiple components on one system. The prices of tracked PTZ cameras are decreasing, and that combination allows for cost-effective solutions like Viz Virtual Studio Go, a compact offering of AR graphics and virtual sets, arriving pre-configured and ready to go.
Robert DeFranco: Get started. Pick a very specific, smaller activation that you can’t do any other way and demonstrate a win for your company. You can limit the project to save money, such as by using a locked-off camera, one lens, and a limited number of assets. You can grow from there.
Marcus Brodersen: One of our core beliefs at Pixotope is that virtual production should be accessible to productions of all sizes and budgets, but the current inefficiencies and high costs of implementing virtual production leave smaller broadcasters at a disadvantage. They should prioritize solutions that don’t require specialized technical teams, extensive learning curves, or high costs, but instead integrate easily with existing infrastructure for seamless deployment, rapid onboarding, and new creative capabilities. Virtual production can help studios create high-quality content with smaller crews or reduce the need for expensive physical studio setups, especially as it allows for remote production workflows, further mitigating the logistical and financial challenges associated with on-location productions.
Ofir Benovici: Generative AI offers a great opportunity for broadcasters, as it makes virtual set creation, one of the costliest items, more accessible, and helps them utilize virtual studios by paving the way for different set designs in the same studio for different programs. New solutions based on Unreal Motion Design enhance rendering and compositing quality and significantly cut production costs by allowing the same production assets to be used in video wall content, on-air graphics, pre-production motion graphics and virtual production graphics. The learning paths that solution providers offer create another avenue for broadcasters to maximize the benefits they gain from virtual production solutions.
Miguel Churruca, marketing and communications director, Brainstorm: Virtual production and AR can be accessible to any content creator, as manufacturers have made great efforts to democratize the technology. In fact, solutions like InfinitySet are flexible enough to serve content creators of any kind and size, in chroma sets or on LED walls, covering everything from large blockbuster productions to simple virtual environments with fixed cameras. Adopting these technologies may not be straightforward, as they require some expertise to get the best out of them, but the benefits of this adoption are certainly worthwhile.
What are your predictions for the use of these technologies in broadcast storytelling?
Adam Callaway: As audiences become more accustomed to more complex productions, the use of virtual production will only increase, especially as these workflows become more affordable and easier to use within broadcast environments.
Martin Klampferer: The whole production chain will see several enhancements and simplifications through AI tools. Content will no longer be tailored only to certain regions; it will be personalized and based on the user’s preferences and behavior. Streaming will help with this trend, but the amount of client-side rendering will increase to ensure the best experience.
Marcus Brodersen: We anticipate virtual production evolving from an isolated component to an integral aspect of the media production workflow. The number one “future technology” on everyone’s mind at the recent NAB Show was AI — and we’ve certainly seen considerable mainstream adoption in the last year. In the context of virtual production, we can expect AI to automate some of the tedious manual processes that can slow productions down, providing more efficient workflows that enable greater creative ideas and outcomes.
Patrick Twomey: We’re seeing a lot of experimentation with how graphics and virtual elements can be used for immersive virtual experiences through technology like Meta’s and Apple’s headsets. We’ve already seen professional sports leagues like the NBA put a 3D camera down in the stadium so people can get the in-stadium experience from their couch at home. Then we have technology like HTML5 and client-side rendering allowing broadcasters to expand their offerings, open up niche sports like Ultimate Frisbee or Pickleball without the expense of a full production, and deliver more personalized advertising and graphics — such as highly localized weather data — which drives a better experience for audiences, broadcasters, and advertisers.
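The personalization idea Twomey describes can be sketched in a few lines: rather than burning one graphic into the broadcast feed, a server sends a small per-viewer data payload and each client renders its own localized overlay. All names and the weather lookup below are illustrative stand-ins, not a real service.

```python
# Sketch of client-side personalization: the server picks a localized
# payload per viewer; an HTML5 (or similar) client renders the overlay.
# LOCAL_WEATHER is a stand-in for a real weather API lookup.

LOCAL_WEATHER = {
    "10001": {"city": "New York", "temp_f": 72},
    "60601": {"city": "Chicago", "temp_f": 65},
}

def overlay_payload(viewer_zip: str) -> dict:
    """Build the data a client-side renderer would display for one viewer."""
    w = LOCAL_WEATHER.get(viewer_zip, {"city": "N/A", "temp_f": None})
    return {"widget": "weather_bug", "label": w["city"], "value": w["temp_f"]}

print(overlay_payload("60601"))
```

Because only data travels to each client, the same broadcast feed can carry different graphics for every viewer without multiplying encode costs.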
Thoughts on Unreal Engine and the launch of Project Avalanche?
Xuan Seifert: Combining three or four different graphics applications into a single application within Unreal 5 will make the process very streamlined. Being able to use all the best UE5 features, such as advanced lighting, Niagara particles, 3D environments, camera sequences and dynamic controls, is also a huge benefit. Project Avalanche will be the prime option for any studio that’s already using UE5 for virtual production.
Marcus Brodersen: Unreal Engine has been a game-changer in virtual production, and for the first time Epic is tailoring its development specifically for 2D designers with Unreal Motion Design. This tool was designed with ease of use and quick adoption at the forefront, and as a result it opens the door for a whole new market to use Unreal as the standard creative platform. This innovation positions Unreal as a single, versatile platform for broadcasters, enabling a unified creative process for virtual studios, augmented and extended reality, and now motion graphics. It also streamlines operations by allowing asset sharing and reuse across all types of content, which will have a significant impact on the broadcast and CG markets.
Patrick Twomey: The customers I’ve talked to are enamoured with it, but almost everyone agreed it’s probably a couple of years from where it needs to be. It’s interesting because it’s a similar moment to where we were with OpenGL and NVIDIA cards several years ago, essentially commoditizing 3D graphics and rendering work. Right now, there’s a relatively small group of people who work with broadcast-specific graphic design tools, but Unreal Engine and Avalanche have brought the capability to design your own video game to thousands and thousands of people everywhere, and I think that is going to bring some exciting developments in the coming years.
Miguel Churruca: Brainstorm has been partnering with Epic Games and Unreal Engine since 2017, which, coupled with our decades of experience in graphics, makes us quite confident of what we can achieve when working with new tools. We strongly believe that Brainstorm’s expertise in graphics creation and management, including advanced transition logic, CG and interaction with broadcast workflows will seamlessly combine with Avalanche to provide second-to-none results, with the ease of use and comprehensive operation Brainstorm provides to its users.