Column: How games are making virtual studios more realistic
The virtual set has come a long way since it first appeared on our screens. Once confined by technology and processing power to small, intimate spaces, virtual sets have expanded to show pretty much whatever the producer demands of them. Along the way they have gone from painfully obvious to almost undetectable, with sometimes only the knowledge that what you're seeing on screen is not really possible giving the game away.
While already a firm mainstream choice, virtual sets are currently undergoing a further boost in quality that is making them genuinely photorealistic and a viable option for even more programming genres. This is another facet of the ongoing collision between IT and broadcast: the integration into broadcast solutions of powerful 3D game engines, along with the specialized (though still commodity and inexpensive) hardware that gets the most out of them.
To date, virtual set development has been undertaken by talented high-tech teams working for a handful of leading players in the market. By incorporating game engines into their offerings, those teams can now piggyback on the efforts of the enormous R&D departments that are driving game engine development forward at a frenetic pace.
Even if you've not played Fortnite, you will have heard of it. That game is made by Epic Games, and its real-time graphics are powered by the company's Unreal Engine. So are many of the leading computer games being released this year, as well as a growing number of photorealistic renders and immersive AR and VR experiences for architecture, automotive, film and television, training and simulation, and other industries outside of gaming.
The acceleration is similar to the one we have witnessed as broadcast as a whole has become more IT-centric. The old proprietary, black-box technologies were first overtaken and then swamped as the industry opened up to the CPUs, storage solutions, and networking technologies driven forward by the giants of the IT world.
In terms of virtual sets, typical broadcast-specific solutions can take you only so far: the last 10% of quality needed to enter the world of photorealism takes an exponential amount of time. This is where game engines such as Unreal Engine excel. They can produce some of the hardest graphics elements, such as fur and hair, and, when coupled with some of the latest GPUs from the likes of Nvidia, even perform techniques such as ray tracing in real time. Furthermore, they have been designed from the ground up for world-building and feature a wide array of tools that let artists swiftly generate excellent-looking assets and animation. There is also an increasingly wide talent pool of artists to choose from, with training courses across a growing number of industries ensuring that people graduate with the ability to use game engine interfaces.
And, possibly the final kicker: with a couple of caveats, they can be free to use too.
Given all that, it is no great surprise that progress is rapid. At NAB in 2016, we showed a demonstration of Unreal Engine working as a compositing engine with our data; now we see Tier One broadcasters using game engines to generate virtual sets on a daily basis. Some of the virtual environments we can produce as a result of the technology are astonishingly good, made even better by the fact that they are generated in real time, and can thus be iterated and evolved with feedback from in front of and behind the camera.
The possibilities are really exciting. Games and game engines are getting better all the time; just look at the difference between the original Final Fantasy VII in 1997 and the remake due for release in April. Virtual sets and graphics are now leveraging that same progress, and over the next year or two we are going to transition from virtual sets being 'almost undetectable' to genuinely undetectable. And once the viewer can no longer distinguish what is real from what is CG, that opens up a huge range of new creative possibilities for the industry, from talk shows with CG guests to digital avatars, advertising and more.
About Nic Hatch
Nic Hatch is Chief Executive Officer at Ncam Technologies. After studying for a BSc in Engineering Product Design, Nic joined Mill Film in 1999 as a computer graphics artist. After setting up the in-house previs department for Warner Bros. and Industrial Light & Magic on the second Harry Potter film, Nic joined the Moving Picture Company in London, before leaving in 2004 to start Nvizage Limited. Nic spent the following eight years helping to raise the standard of previs, supervising shows such as “The Dark Knight”, “Alice in Wonderland” and “Charlie & the Chocolate Factory”, before founding London's leading VFX boutique, Nvizible, in 2009 and, more recently, Ncam Technologies in 2012.