Q&A: Augmented reality, efficiencies and the cloud with Avid
We had a chance to speak with Avid Senior Director of Broadcast Products Ofir Benovici during the NAB Show about the move towards the cloud, augmented reality in broadcast news and how their organization is working to improve efficiencies for graphical output.
How does the cloud and virtualization factor into the broadcast world right now?
There is a fine line between what can go into the cloud and what still needs to stay on premises. That could be how you're driving your monitors, whether you have a switcher, or where you're playing your stories from.
To me, at least, and this is something that could be debated, the line falls once you have your playlist ready. Everything you're doing from planning all the way to the point that you have a playlist available, that's where the cloud's efficiency is very powerful. Then, when you go downstream into the production area, you are working on premises. For the foreseeable future, you will still need your physical devices sitting in the control room running the actual production.
If you’re a journalist, you can do your entire pre-production from the cloud. We take the benefit of that, and then when you go into your live production, then that’s where you’re coming back into your more traditional production environment.
Is that because of bandwidth, redundancies? Latency?
I don't know if you can, today, guarantee latency when you're playing your news production from the cloud. Besides, I think there's also the human element that would be missing. There is this constant dialogue between the producer, the director, and the talent. I don't see that going away in the next few years. You still need to have this connection. There is the technical side, which I think is important, namely latency, but there's also a cultural side: the way that TV is being produced.
Augmented reality has been a buzzword since 2008 at NAB, has it finally reached a critical mass in broadcast?
We are seeing great interest in AR. In 2008, as in your example, AR elements would be used for elections and special events. What we are seeing now is AR elements working their way into day-to-day production.
I think the main reason for that comes back to the competition for viewership and the ability to tell a better story. Because of the progress in the last few years, it's much easier now to produce AR elements and incorporate them into day-to-day production.
Now your talent can tell a much better and much more compelling story using AR elements where they can illustrate what they’re talking about rather than describing it.
How does the second screen impact new technology like AR?
Since it's becoming easier to produce, you're seeing it more. I won't say in every newscast, but you see it more and more in day-to-day production.
You need to have an eye-catcher. If you want a 30-second segment about a topic, it's much easier and more engaging to do that with an AR element that tells your story rather than having someone stand there for a stand-up with a microphone.
What’s the biggest challenge today for broadcasters on the graphics side?
I think the challenges are less about the design. It comes down to efficiency because in today’s production, you need to create more graphical elements.
Just look at the average TV screen today versus five years ago and see how much real estate graphics are occupying. Then you also need to distribute this content, not just to your first stream but also to social media outlets and so on. You need to take into account different aspect ratios, and what is good for TV screens is not necessarily good for those platforms.
What about workflows and efficiencies as studios incorporate both video walls and AR?
We are seeing tendencies where you want to control the content on the video wall and, at the same time, the AR elements. The request we've been receiving from a lot of customers relates to the two different interfaces: they want one interface to control both.
Our latest version of Avid TD Control actually handles that. From a single interface, you can control both the elements of the video wall and the AR elements.
On one hand, this is about efficiency, but perhaps more importantly, since this is the same system and the same engine, you can establish logical relations between the background elements and the foreground elements.