Veritone’s Sean King on how metadata is the backbone of live sports broadcasting
With the Super Bowl, the Winter Olympics and the FIFA World Cup all packed into the same calendar year, broadcasters face a familiar challenge at a growing scale: how to manage, search and distribute massive volumes of live content across an expanding number of screens and platforms.
For Veritone, the answer starts with metadata — the structured data generated from audio and video that identifies who appears in a clip, what is happening, what logos are visible and when specific moments occur.
“Those are like the building blocks of everything,” said Sean King, chief revenue officer and general manager of Veritone Commercial. “It all starts in foundational data.”
That information makes content searchable, sortable and ready for distribution. Without it, the sheer volume of material produced by a major sporting event is difficult to navigate at speed.
Veritone is not alone in building its business around this idea. In a recent NCS roundtable, multiple broadcast technology vendors identified metadata tagging and content indexing as among the most common AI applications now in active use across the sector.
Veritone ingests hundreds of thousands of hours of content each day across its customer base, according to King. That content spans broadcast feeds, telephone recordings and body camera footage from law enforcement agencies. The AI analyzes the material on a frame-by-frame basis, generates metadata and feeds it into applications tailored to each customer’s needs.
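As a rough illustration of what frame-level metadata enables, the sketch below models a per-frame record and a moment search over it. The field names and structure are hypothetical, chosen for illustration; Veritone's actual schema is not described in this article.

```python
from dataclasses import dataclass, field

@dataclass
class FrameMetadata:
    # Hypothetical frame-level record; fields are illustrative,
    # not Veritone's actual data model.
    timestamp_s: float                            # offset into the feed, seconds
    people: list = field(default_factory=list)    # recognized people
    logos: list = field(default_factory=list)     # detected brand marks
    events: list = field(default_factory=list)    # e.g. "goal", "replay"

def find_moments(frames, person=None, event=None):
    """Return timestamps of frames matching the given person and/or event."""
    hits = []
    for f in frames:
        if person and person not in f.people:
            continue
        if event and event not in f.events:
            continue
        hits.append(f.timestamp_s)
    return hits
```

Once every frame carries records like this, "find the moment player X scored" becomes a filter over structured data rather than a manual scrub through footage.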
“It’s all the same underlying products,” King said. “It’s just the workflow and the application layer in which they interact with it that is different.”
From one event to a thousand clips
The practical effect of that metadata layer becomes most visible during large-scale live events.
An Olympic broadcast can produce tens of thousands of hours of content across hundreds of simultaneous competitions. Broadcasters need to identify relevant moments, create clips and distribute them — often within minutes.
King said some broadcast customers now aim to produce thousands of clips per week. Much of that demand is driven by athletes or moments that may not be central to the primary broadcast but carry significant interest in other markets.
“There may be players in these events that are not necessarily relevant to the broadcast experience here, but are very important to what’s going on outside and in other areas of it,” King said, pointing to “bespoke custom workflows that follow the events of a singular athlete and what’s taking place, and how those are getting identified, clipped and distributed to global broadcasting entities.”
That pattern — one live event feeding dozens or hundreds of derivative outputs — is increasingly common beyond sports as well.
King noted that similar workflows apply in news, where broadcasters produce hyper-local clips, and in entertainment, where short-form content has become a primary distribution format.
The multi-screen problem
The volume of content is compounded by the number of screens it needs to reach.
King described his own viewing behavior during a college football game: the television for the main broadcast, a phone for stats and a tablet for tracking another team. Each device represents a different viewing context and generates its own data.
“It’s just going to be interesting to watch how these things start to gravitate towards one kind of unified experience,” he said.
The financial incentive to solve the problem is direct. Live sports are funded by ticket sales and advertising, and broadcasters that paid for content rights need to keep viewers engaged across those screens to maintain the value of their inventory.
“There’s obviously a goal that you’re going to want to keep them entertained and keep the value of these things strong so they can continue to put on these events,” King said.
King pointed to the growing volume of unstructured data — video, audio and interaction data from each device — as a compounding factor. All of it needs to be converted into structured, searchable information before it can be useful.
“We’re making more and more unstructured data than ever before,” he said. “That has to get turned from audio and video into binary searchable data.”
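One common way that conversion is made searchable, once tags have been extracted from the audio and video upstream, is an inverted index mapping each tag to the clips that contain it. This is a generic sketch of that structure, not a description of Veritone's implementation:

```python
from collections import defaultdict

def build_index(clip_tags):
    """Fold per-clip tag lists (the structured output of upstream
    audio/video analysis) into an inverted index: tag -> clip IDs."""
    index = defaultdict(set)
    for clip_id, tags in clip_tags.items():
        for tag in tags:
            index[tag.lower()].add(clip_id)
    return index

def search(index, *tags):
    """Return clips matching ALL of the given tags (set intersection)."""
    sets = [index.get(t.lower(), set()) for t in tags]
    return set.intersection(*sets) if sets else set()
```

The payoff is that a query touches only the index entries for its tags, instead of rescanning every clip, which is what makes search workable at the volumes described above.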
Making the archive pay off
Beyond live production, King identified archival content as an area where the same metadata capabilities carry distinct value.
Broadcasters that have spent millions or billions of dollars acquiring content rights hold large libraries of footage that can be relicensed, repurposed or resurfaced — but only if the material is cataloged well enough to find what is needed quickly.
“Not only do people like live events and entertainment, but people also like to reminisce,” King said. “Being able to quickly access, ‘Hey, did this happen before? If so, when?’ And where are those accessible and finding those and being able to share them.”
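The "did this happen before, and when?" query King describes reduces to a date lookup once the archive is cataloged. A minimal sketch, assuming a hypothetical catalog shape of entity-to-sorted-air-dates (not Veritone's actual system):

```python
from bisect import bisect_left
from datetime import date

def first_occurrence(archive, entity):
    """Earliest cataloged occurrence of an entity, or None if unseen.
    `archive` maps entity -> sorted list of air dates (illustrative shape)."""
    dates = archive.get(entity)
    return dates[0] if dates else None

def occurrences_before(archive, entity, cutoff):
    """Count cataloged occurrences that predate `cutoff` (binary search)."""
    dates = archive.get(entity, [])
    return bisect_left(dates, cutoff)
```

Without the catalog, answering the same question means screening footage by hand; with it, the answer is a dictionary access and a binary search.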
Veritone’s agreement with the NCAA, announced in 2024, illustrates the point.
Under that deal, Veritone serves as the NCAA’s global archive of record and exclusive video licensing agent for championship content across all three divisions. The company catalogs and indexes the footage, then manages licensing to documentary, editorial, advertising and entertainment buyers.
“For these groups, it’s that ability to be able to continually monetize it as well,” King said.
tags
Advertising, Archiving, Automatic Metadata Extraction, Broadcast Archiving, Metadata, Personalization, Sean King, Veritone
categories
Broadcast Automation, Broadcast Engineering, Featured, Media Asset Management