Q&A: The importance of scalability and low latency in live video streaming
As the world of broadcast continues to move toward a streaming-focused ecosystem, concepts like low-latency delivery and microservices become increasingly important.
As part of our Focus on Emerging Technology, we recently had a chance to speak with Stefan Birrer, Ph.D., the CEO of Phenix, a provider of real-time IP video solutions for broadcasters.
Phenix’s video solutions include PCast, a scalable video streaming platform, along with a platform that transcodes live streams into on-demand content as they air. In our discussion, we addressed some of the important broadcast engineering questions facing transmission, including low latency, new codecs and microservices.
How are new codecs impacting live streaming?
Codecs have different properties that affect efficiency and quality.
Lowering the bitrate and/or increasing quality across a wider range of technical use cases will contribute significantly to successful streaming deployments.
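The bandwidth impact of a more efficient codec compounds across the whole audience. A rough back-of-the-envelope sketch, using assumed, illustrative bitrates (not measurements from any specific encoder or from Phenix):

```python
# Illustrative per-viewer bandwidth and total egress for a 1080p live event
# under different codecs. Bitrates below are assumed, rough figures for
# comparable visual quality.
BITRATES_MBPS = {
    "H.264/AVC": 6.0,   # assumed baseline
    "HEVC/H.265": 3.6,  # assumed ~40% savings over AVC
    "AV1": 3.0,         # assumed ~50% savings over AVC
}

VIEWERS = 100_000     # hypothetical audience size
STREAM_HOURS = 2      # hypothetical event duration

for codec, mbps in BITRATES_MBPS.items():
    # Total egress = bitrate * duration * audience, converted Mb -> TB.
    total_tb = mbps * 3600 * STREAM_HOURS * VIEWERS / 8 / 1e6
    print(f"{codec}: {mbps} Mbps/viewer, ~{total_tb:,.0f} TB total egress")
```

Under these assumptions, halving the bitrate halves total delivery cost for the same event, which is why codec efficiency matters more as audiences grow.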
What about microservices and scalability?
These are the pillars on which any modern system should be built. Scalability allows a system to handle the viewership sizes that streaming deployments require.
Microservices enable a flexible, easy-to-maintain system that allows continuous investment in innovation and fast go-to-market execution while maintaining enterprise-grade quality.
In sports, we’ve seen a large move to at-home production. How does that impact deliverability?
At-home production is a model well suited to distributed real-time technologies; true real-time delivery works very well with these types of use cases.
What is the biggest challenge when delivering a stream in real time?
Every problem is an order of magnitude harder, as there is less time to make decisions and less room for error. The focus is on solving every problem from an end-to-end perspective to enable the lowest-latency delivery.
What can be done from a broadcast standpoint to ensure low latency?
Broadcast entails distributing one signal to many viewers. This brings additional challenges of managing a potentially large system without incurring additional latency. At its heart, the solution appears simple: deliver every packet to every viewer as fast as possible.
To do that in a large system, scale and reliability challenges have to be solved. Every element in the critical path has to be designed from an end-to-end perspective so that it does not introduce new problems or latencies.
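The "deliver every packet to every viewer as fast as possible" idea can be sketched as a one-to-many fan-out that forwards immediately rather than batching. This is a minimal, hypothetical illustration, not Phenix's actual architecture; all names here are made up for the example:

```python
import queue
import threading

class FanOut:
    """Minimal one-to-many packet fan-out: each packet is forwarded to
    every subscriber as soon as it arrives (hypothetical sketch)."""

    def __init__(self):
        self._subscribers = []
        self._lock = threading.Lock()

    def subscribe(self):
        """Register a new viewer and return its delivery queue."""
        q = queue.Queue()
        with self._lock:
            self._subscribers.append(q)
        return q

    def publish(self, packet):
        # Forward immediately rather than buffering or batching: any
        # queuing in the critical path adds end-to-end latency.
        with self._lock:
            subs = list(self._subscribers)
        for q in subs:
            q.put(packet)

hub = FanOut()
viewers = [hub.subscribe() for _ in range(3)]
hub.publish(b"frame-0001")
received = [v.get() for v in viewers]  # every viewer gets the same packet
```

In a real deployment this fan-out is distributed across a tree of relay servers, and each hop is an element of the critical path that must be designed not to add latency.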
How cost-effective can streaming be compared to traditional delivery methods, and how does that impact viewership?
Streaming is as cost-effective as any other secure delivery method at scale. It will positively impact viewership as it is a superior experience for the viewer.