Column: What is virtual production?

By Ben Davenport, Pixotope


In the first of a series of columns from Pixotope’s Ben Davenport for NewscastStudio, Ben examines the terminology of virtual production.

Virtual production is becoming a bit of a buzzword. Fueled in part by the pandemic, both the number of virtual production facilities being built and the number of productions using virtual production technology have grown significantly. But what exactly is it?

The term virtual production encompasses the tools, methods, and services for creating immersive experiences that combine real-time virtual content with live video.

The key element here is “real-time” – we’ve been adding virtual content to video in the form of VFX for a long time in non-real-time post-production. Driven by developments in gaming tech, the engines and toolkits for combining high-quality, photo-realistic elements with live video in real time are now more accessible than ever.

Like any rapidly developing technology, the vocabulary around virtual production has been subject to many different interpretations and definitions. Indeed, even the term “virtual production” itself has often been confused with “virtualized production” and even “remote production” — both related to, but quite different from, virtual production.


Perhaps the easiest way to start understanding the terminology is to look at the main types of virtual production in use today.

Extended reality

XR is dominating column-inches when it comes to virtual production and is also one of the most confusing terms. The VP Glossary, created by the Visual Effects Society and the American Society of Cinematographers, defines XR as “an umbrella term for virtual reality (VR), augmented reality (AR), and mixed reality (MR), and all future realities such technology might bring.” This is a very broad definition, and the same glossary actually equates XR to virtual production. 

However, most commonly, the term XR is used to describe the use of large LED volumes to create virtual environments that real talent can interact with. 

The VES/ASC VP Glossary uses the term ICVFX (“in-camera visual effects”) to describe these types of productions. The term can be a little misleading, because the effects themselves aren’t actually in the camera, but it does highlight the relationship between the position and movement of the camera and what is displayed on the LED volume. If you think of a camera within a room, pointing out of a window, what falls within the camera’s field of view outside the window depends on the camera’s position within the room. The same is true for XR or ICVFX – the engine compositing the virtual environment receives data on the position of the camera, and what is shown on the LED volume changes accordingly.
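To make that relationship concrete, here is a minimal, illustrative Python sketch, not taken from Pixotope or any other production system: it treats the LED wall as the “window”, traces a ray from the tracked camera position through each wall pixel, and samples a virtual environment along that ray, so moving the camera changes what the wall displays. The wall dimensions, function names and use of an equirectangular environment image are all assumptions made for the example.

```python
# Minimal, illustrative sketch: a tracked camera position determines
# what a flat LED wall should display (the "window" analogy).
# Assumes an equirectangular environment image `env` (H x W x 3) and a
# wall lying in the z = 0 plane, centred on the origin.
import numpy as np

def wall_image(env, cam_pos, wall_w=6.0, wall_h=3.0, px_w=960, px_h=480):
    """Return the pixels to show on the LED wall for a camera at cam_pos."""
    H, W, _ = env.shape
    # World-space position of every wall pixel.
    xs = np.linspace(-wall_w / 2, wall_w / 2, px_w)
    ys = np.linspace(wall_h, 0.0, px_h)                # top-to-bottom
    wx, wy = np.meshgrid(xs, ys)
    wall_pts = np.stack([wx, wy, np.zeros_like(wx)], axis=-1)

    # Ray from the camera through each wall pixel: what the wall must show
    # is whatever lies along that ray in the virtual environment.
    d = wall_pts - np.asarray(cam_pos, dtype=float)
    d /= np.linalg.norm(d, axis=-1, keepdims=True)

    # Sample the equirectangular environment map by ray direction.
    lon = np.arctan2(d[..., 0], -d[..., 2])             # -pi .. pi
    lat = np.arcsin(np.clip(d[..., 1], -1.0, 1.0))      # -pi/2 .. pi/2
    u = ((lon / (2 * np.pi) + 0.5) * (W - 1)).astype(int)
    v = ((0.5 - lat / np.pi) * (H - 1)).astype(int)
    return env[v, u]

# Move the camera and the wall content shifts accordingly (parallax).
env = np.random.rand(512, 1024, 3)                      # stand-in environment
frame_a = wall_image(env, cam_pos=(0.0, 1.7, 4.0))
frame_b = wall_image(env, cam_pos=(1.0, 1.7, 4.0))      # camera moved right
```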

It is also possible, with more advanced setups, for what the talent sees on the LED volume on set to differ from what is captured by the camera or cameras. This can be useful in live (broadcast) productions to give the on-set talent additional information beyond what appears in the final output and/or to provide different variations of the output for different markets or audiences.

Virtual studios

Also referred to as “green screen” production, the virtual studio (VS) is one of the most common forms of virtual production. A green (or sometimes blue) background is used around the talent so that it can be “keyed” out and replaced by a virtual environment.
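As a rough illustration of that keying step, here is a deliberately simplistic Python sketch; real keyers handle spill suppression, soft edges, hair detail and motion blur far more carefully, and the threshold and function names here are assumptions for the example. Pixels where green clearly dominates the other channels are treated as background and replaced with the rendered virtual environment.

```python
# Rough illustration of a chroma key, not a production-grade keyer.
import numpy as np

def green_screen_composite(frame, background, threshold=0.15):
    """Replace green-dominant pixels in `frame` with `background`.

    Both inputs are float RGB arrays in [0, 1] with the same shape.
    """
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    # A pixel is "green screen" where green clearly dominates red and blue.
    matte = (g - np.maximum(r, b)) > threshold           # True = background
    alpha = (~matte).astype(float)[..., None]            # 1 = keep talent
    return frame * alpha + background * (1.0 - alpha)

# Example: composite a camera frame over a rendered virtual environment.
frame = np.random.rand(1080, 1920, 3)
virtual_set = np.random.rand(1080, 1920, 3)
output = green_screen_composite(frame, virtual_set)
```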

As with XR, we need to track the camera location and movements to make sure that real and virtual elements interact. In more advanced scenarios, we may also track the talent and “map” them into the virtual world such that they can interact with the virtual environment. 

Some have suggested that the development of XR technology will be the end of VS/green screen, but the cost and complexity of LED volumes, compared with the relative simplicity and accessibility of green screen technology, would indicate that virtual studios are going to be with us for some time.

Set extension

Set extension uses virtual elements to extend a physical set or environment. This is a great technique for giving productions a sense of scale, even when they are produced in a fairly limited space, and it has been used extensively on talent show formats such as “The Masked Singer.”

Augmented reality

AR productions take virtual elements and place them in a real-world, three-dimensional environment. A well-known example is the Carolina Panthers’ giant virtual panther jumping around the stadium and tearing up the competition’s flag.

For such productions, a virtual representation of the real-world environment is created, though it is not seen by the viewer. The virtual element (such as the panther) can interact with this virtual representation as it would with the real-world equivalent. Again, the camera’s movement and position are tracked to ensure that the virtual and real worlds stay “aligned”.
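As a hedged illustration of that alignment step, the sketch below uses a simple pinhole camera model to project a virtual element’s 3D position into the camera image from a tracked camera pose. The function name, pose convention and numbers are assumptions for the example; real systems also account for lens distortion, timing offsets and much more.

```python
# Illustrative only: project a virtual element's 3D position into the
# camera image using the tracked camera pose and a pinhole lens model.
import numpy as np

def project_to_image(world_point, cam_rotation, cam_position, focal_px, image_size):
    """Project a 3D world point (metres) to 2D pixel coordinates."""
    # World -> camera coordinates using the tracked pose
    # (camera looks along +z, x to the right, y up; image origin top-left).
    p_cam = cam_rotation @ (np.asarray(world_point, dtype=float) - cam_position)
    if p_cam[2] <= 0:
        return None                                      # behind the camera
    # Pinhole model: perspective divide, then scale by focal length in pixels.
    u = image_size[0] / 2 + focal_px * p_cam[0] / p_cam[2]
    v = image_size[1] / 2 - focal_px * p_cam[1] / p_cam[2]
    return u, v

# A virtual element at the world origin (e.g. midfield), seen from a camera
# 30 m back and 10 m up, with no rotation applied for simplicity.
pixel = project_to_image(
    world_point=(0.0, 0.0, 0.0),
    cam_rotation=np.eye(3),
    cam_position=np.array([0.0, 10.0, -30.0]),
    focal_px=1500.0,
    image_size=(1920, 1080),
)
print(pixel)  # lands in the lower half of the frame
```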

Broadcast graphics

The line between broadcast graphics and AR can be a little blurry. While we still see quite a lot of simple 2D graphic overlays in broadcast, 3D graphics have been commonplace for some time now, as has dynamic, data-driven content. One way to visually differentiate broadcast graphics from AR is that AR elements tend to interact with the real-world scene, whereas broadcast graphics simply sit in or over the scene.

However, the major difference between the two is the workflow around the elements. Broadcast graphics will typically be based on templates with data-driven content designed to provide the viewer with information that complements the programming, whereas AR elements are typically creative elements that are there to engage the viewer with the programming itself. Additionally, unlike the other types of virtual production mentioned above, broadcast graphics don’t always require camera tracking.
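As a purely illustrative sketch of that workflow difference, the snippet below shows the essence of a template-driven graphic: the layout is fixed and the content is populated from data, with no camera tracking involved. The class and field names are assumptions for the example.

```python
# Illustrative only: a template-driven broadcast graphic is a fixed design
# populated from data, independent of any camera tracking.
from dataclasses import dataclass

@dataclass
class LowerThird:
    """A simple lower-third template: layout is fixed, content is data-driven."""
    name: str
    title: str

    def render_text(self) -> str:
        # A real system would render styled graphics; here we only format text.
        return f"{self.name.upper()} | {self.title}"

# Populated from a data feed (e.g. an election or sports results service).
graphic = LowerThird(name="Jane Doe", title="Chief Political Correspondent")
print(graphic.render_text())
```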


Mixed reality

Mixed reality is simply a mix of any and all of the above virtual production techniques. It is quite easy to imagine how AR or broadcast graphics elements might be added to a production created in an LED volume/XR or a virtual studio. We also see productions where virtual studio and XR techniques are combined — for example, with a green screen floor and walls around an LED volume.

The vocabulary around virtual production is growing all the time, and while it may not be definitive, the work by the VES and ASC on the VP Glossary is a worthwhile starting point for anyone looking to join the discussion on virtual production. It’s also worth looking into the resources being created by the SMPTE RIS initiative around on-set virtual production.

In the next article in this series, we will look into some of the main components and features of virtual production and some of the benefits of using virtual production techniques.  

Ben Davenport, Pixotope

Ben is a B2B marketer with a keen interest in and understanding of the technologies that underpin the media and entertainment industry. During the past two decades, Ben has played a key role in some of the most complex and progressive file-based media solutions and projects in the industry while enabling leading media & entertainment technology vendors to differentiate their brands and products. Having previously headed up Portfolio & Marketing Strategy for Vidispine - an Arvato Systems brand, Ben has recently joined Pixotope as VP Global Marketing. Ben holds a Bachelor's degree in Music & Sound Recording (Tonmeister) from the University of Surrey.
