Column: Virtual production camera tracking and what you need to know

By Ben Davenport, Pixotope

Virtual production is booming, and for good reason. It’s being used more and more by filmmakers, broadcasters, and pretty much anyone who wants to create immersive experiences with real-time virtual content. By combining the flexibility of virtual worlds with the ability to visualize content earlier in the production process, it gives studios new possibilities to realize and refine their visions.

Virtual production relies heavily on real-time camera tracking, which synchronizes live-action footage with the virtual camera so that 3D data and live-action footage can be seamlessly integrated.
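
To make that synchronization concrete, here is a minimal sketch, assuming a tracked pose arrives for each video frame as a position plus pan/tilt/roll angles, of how that pose might be turned into the view matrix of a virtual camera. The function names and the rotation order are illustrative assumptions; real systems follow the conventions of their tracking protocol and render engine.

```python
import numpy as np

def rotation_from_pan_tilt_roll(pan: float, tilt: float, roll: float) -> np.ndarray:
    """3x3 rotation from pan/tilt/roll in radians (Z-Y-X order here;
    the real convention depends on the tracking system)."""
    cz, sz = np.cos(pan), np.sin(pan)
    cy, sy = np.cos(tilt), np.sin(tilt)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return Rz @ Ry @ Rx

def view_matrix(position: np.ndarray, pan: float, tilt: float, roll: float) -> np.ndarray:
    """4x4 world-to-camera matrix that makes the virtual camera match the
    tracked physical camera for this frame."""
    R = rotation_from_pan_tilt_roll(pan, tilt, roll)
    view = np.eye(4)
    view[:3, :3] = R.T                # inverse rotation
    view[:3, 3] = -R.T @ position     # inverse translation
    return view

# Example: one tracked pose drives the virtual camera for the matching frame.
print(view_matrix(np.array([2.0, 1.5, 4.0]), pan=0.3, tilt=-0.1, roll=0.0))
```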

In virtual production, there are several methods for creating virtual environments in which real-world talent can interact. For example: 

  • Augmented reality (AR): Overlaying virtual objects on real-world environments in real time.
  • Virtual studios (green screen): Letting real objects and real people (actors) interact seamlessly in real time with computer-generated environments and objects.
  • Extended reality (XR): Using large LED volumes to create virtual environments in which real-life talent can interact.

Regardless of the method you choose for your virtual production project, quality camera tracking is an essential part of your toolkit. In addition, as processing power and software capabilities have improved, camera tracking has become more affordable and easier to implement.

So, what is camera tracking and how can you use it in your next project? Well, here’s a breakdown of some options.

Mechanical vs. optical camera tracking

The first thing to note is that camera tracking systems fall into two categories: mechanical and optical. 

With their long history of use in live broadcast AR, mechanical tracking systems are still considered to be dependable and are the mainstay of many sports and live events, especially in larger arenas.

With sensors placed on the camera rig, mechanical systems deliver precise camera movement data, including its position and orientation, while additional information from the camera itself details the lens parameters. This ensures that virtual objects appear as part of the real environment, and vice versa, throughout every camera movement.
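
As a rough illustration of what such a system delivers for each frame, a tracking sample might carry fields along these lines. The names and units below are hypothetical, not any vendor’s actual protocol.

```python
from dataclasses import dataclass

@dataclass
class TrackingSample:
    """One hypothetical per-frame sample from a sensor-based tracking head."""
    timestamp: float   # locked to the video reference (seconds)
    x: float           # camera position in metres, relative to the rig's fixed origin
    y: float
    z: float
    pan: float         # orientation in degrees, from the encoded head
    tilt: float
    roll: float
    zoom: float        # normalised zoom position, from the lens encoders
    focus: float       # normalised focus position, from the lens encoders
```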

Thanks to the sensor-based nature of mechanical tracking, you can use these systems in situations where optical tracking isn’t possible. They are ideal in locations where you cannot place markers, or where there are few identifiable natural markers (see below for more details), such as a putting green in golf, which is uniformly green.

However, mechanical systems can require significant setup time and regular realignment. The camera rig also needs a fixed origin from which the movement of the mechanism can be referenced, which limits flexibility.

Optical camera tracking systems, on the other hand, give you far more freedom to move the camera around. Cameras with optical tracking can be moved freely; you can even use a Steadicam.

As the tracking sensor typically points upward or downward rather than into the set, it is unaffected by studio conditions such as lighting configurations, reflections, and plain green screens. Setup can also be relatively quick and easy. Optical systems come in a couple of different flavors.

Marker-based camera tracking

As you probably guessed, marker-based systems rely on markers placed around the scene or, at least, in view of the sensor. These markers serve as points of reference from which the system calculates the exact position and orientation of the camera.
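
As a simplified sketch of that calculation, the snippet below recovers a camera pose from a handful of markers whose real-world positions are known, using OpenCV’s solvePnP. The marker coordinates, pixel positions, and sensor intrinsics are placeholder values; a production system would use many more markers, handle lens distortion, and filter the result.

```python
import cv2
import numpy as np

# Known 3D marker positions (e.g. from a pre-calibrated coded floor), in metres.
object_points = np.array([[0.0, 0.0, 0.0],
                          [1.0, 0.0, 0.0],
                          [1.0, 1.0, 0.0],
                          [0.0, 1.0, 0.0]], dtype=np.float64)

# Where the tracking sensor sees those markers in its image, in pixels.
image_points = np.array([[612.4, 388.1],
                         [905.7, 392.6],
                         [898.2, 701.3],
                         [618.9, 695.0]], dtype=np.float64)

# Intrinsics of the tracking sensor, from calibration (placeholder values).
camera_matrix = np.array([[1200.0, 0.0, 960.0],
                          [0.0, 1200.0, 540.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)  # assume lens distortion already corrected

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
R, _ = cv2.Rodrigues(rvec)        # rotation: world -> camera
camera_position = -R.T @ tvec     # camera position in marker (floor) coordinates
print("camera position:", camera_position.ravel())
```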

Markers can take several different forms. One form is a pre-calibrated coded floor, where ‘absolute’ markers remain in the same position. As the markers are already in place on the roll-out floor, no calibration is needed, so setup is quick and easy. 

Using absolute visual markers ensures a clear and absolute reference, quick initialization, and interference-free tracking. They also demand a lot less processing power than markerless systems, or systems that use ‘random’ rather than absolute markers, making them more affordable than the alternatives. A pre-calibrated marker floor can be used both indoors and outdoors, so it’s incredibly versatile, and a budget-friendly way to get started with virtual production.

Alternatively, you can combine absolute markers with carefully placed relative markers to pinpoint the camera’s position with the same camera and sensor setup. 

To counter low or variable lighting conditions, infrared retroreflective markers reflect infrared (IR) light from an IR lamp back onto the sensor, so tracking is unaffected by low or changing light while the markers remain hidden from the human eye and from the main camera itself.

Setup is painless as well: you just need to install the markers and carry out a simple self-calibration process. However, because the markers must be placed individually, it’s best to reserve this method for studio-based productions.

Markerless camera tracking  

A markerless camera tracking system uses real features in the natural environment instead of artificial markers, which gives it greater flexibility. A sensor on the camera captures and analyzes the scene in real time, generating 3D feature points that stand in for artificial markers. Once the 3D points are generated, an automated tracking algorithm calculates the camera’s position and parameters.
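
As a minimal sketch of the feature step, assuming OpenCV’s ORB detector, the snippet below finds and matches natural features between two frames from the tracking sensor; a full markerless tracker would triangulate such matches into a persistent 3D map and track against it. The file names are placeholders.

```python
import cv2

orb = cv2.ORB_create(nfeatures=2000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

# Two consecutive frames from the tracking sensor (placeholder file names).
frame_a = cv2.imread("sensor_000.png", cv2.IMREAD_GRAYSCALE)
frame_b = cv2.imread("sensor_001.png", cv2.IMREAD_GRAYSCALE)

# Detect natural features, describe them, then match across the two frames.
kp_a, desc_a = orb.detectAndCompute(frame_a, None)
kp_b, desc_b = orb.detectAndCompute(frame_b, None)
matches = sorted(matcher.match(desc_a, desc_b), key=lambda m: m.distance)

print(f"{len(matches)} natural features matched between frames")
```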

Markerless systems give you creative freedom, as you can move the camera without restriction, and they are far more practical for use in outdoor or large indoor spaces.

Markerless camera tracking technology can, in certain scenarios, also be used Through The Lens (TTL) without the need for a sensor. In this case, the tracking uses the video stream from the camera itself to determine the camera’s position in 3D space in real time. This is incredibly powerful for applications such as drones, wire cams, and action cameras, where adding sensors and/or markers is impractical.
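
To illustrate the TTL idea, here is a rough, self-contained sketch that estimates the camera’s frame-to-frame motion from the programme feed alone: natural features are tracked with optical flow and the relative pose is recovered from the essential matrix. The file names and intrinsics are placeholders, and a real system would add scale recovery, filtering, and drift correction.

```python
import cv2
import numpy as np

prev = cv2.imread("feed_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("feed_001.png", cv2.IMREAD_GRAYSCALE)

# Track natural features from one frame to the next with KLT optical flow.
p0 = cv2.goodFeaturesToTrack(prev, maxCorners=1000, qualityLevel=0.01, minDistance=8)
p1, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, p0, None)
p0, p1 = p0[status.ravel() == 1], p1[status.ravel() == 1]

# Main-camera intrinsics from lens calibration (placeholder values).
K = np.array([[1800.0, 0.0, 960.0],
              [0.0, 1800.0, 540.0],
              [0.0, 0.0, 1.0]])

E, inliers = cv2.findEssentialMat(p0, p1, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, p0, p1, K, mask=inliers)
# R and t describe how the camera moved between the two frames (t only up to
# scale); chaining these per frame gives a real-time camera track from the lens alone.
```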

Tracking in LED volumes

One of the challenges with virtual production using large “wraparound” and multi-plane (floor/ceiling) LED volumes is that there are very few natural reference points for tracking to pick up on, and nowhere to place markers. In this scenario, we need a different solution.

With the high refresh rate of modern LED volumes, one such solution is to “hide” the tracking markers in the LED volumes themselves. By using absolute marker patterns and displaying them on the LED volume between the frames captured by the cameras, we can achieve highly accurate and reliable optical tracking. We can go one step further by also displaying the negative of the tracking markers, meaning that the patterns are invisible to the human eye and therefore not a distraction for on-set talent.
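
As a toy illustration of the interleaving idea, the snippet below builds a marker pattern and its exact negative as small offsets around a mid-grey level: averaged over successive refreshes, the two cancel out to a uniform level, which is why the pattern goes unseen by the eye, while a tracking sensor synchronized to the marker refreshes still sees it clearly. The pattern, levels, and resolution here are arbitrary, not any real tracking pattern.

```python
import numpy as np

rng = np.random.default_rng(7)
height, width = 1080, 1920

# A binary marker pattern and its negative, as small offsets around mid grey.
pattern = rng.integers(0, 2, size=(height, width))
marker_frame = 128 + 32 * pattern            # refreshes the tracking sensor captures
negative_frame = 128 + 32 * (1 - pattern)    # refreshes shown in between

# Over a pair of refreshes, every pixel averages to the same uniform level,
# so the pattern is effectively invisible to a viewer watching the volume.
assert np.all((marker_frame + negative_frame) // 2 == 144)
```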

Regardless of your chosen method, camera tracking is an integral part of virtual production workflows. With virtual production continuing to grow in popularity as a means of creating content, it’s imperative that the right tools are available to meet the needs of each production.

Ben Davenport, Pixotope

Ben is a B2B marketer with a keen interest in, and understanding of, the technologies that underpin the media and entertainment industry. During the past two decades, Ben has played a key role in some of the most complex and progressive file-based media solutions and projects in the industry, while enabling leading media & entertainment technology vendors to differentiate their brands and products. Having previously headed up Portfolio & Marketing Strategy for Vidispine, an Arvato Systems brand, Ben has recently joined Pixotope as VP Global Marketing. Ben holds a Bachelor's degree in Music & Sound Recording (Tonmeister) from the University of Surrey.
