Broadcast Exchange: The merging of film and broadcast storytelling tech

From virtual fans in MLB ballparks for Fox Sports to the unique particle animation open at the Super Bowl and even Fox’s digital avatars on “Alter Ego,” Silver Spoon is at the forefront of live virtual production.

Silver Spoon brings visual effects tools often found in film production, such as motion capture and real-time animation, to broadcast.

Laura Herzing, executive producer at Silver Spoon, joins the Broadcast Exchange to talk about the merging of film and TV production techniques, the rise of extended reality storytelling and the industry-wide impact of the Unreal Engine.

She also gives us an update on some recent projects and an upcoming launch at Silver Spoon. 

Listen here or on your favorite platform below.

 


Listen and Subscribe

Video: YouTube

Audio: Apple Podcasts | Spotify | TuneIn | Pocket Casts | Amazon Music

Transcript

The below transcript appears in an unedited format.

Dak: Welcome to the Broadcast Exchange from NewscastStudio. I’m your host, Dak Dillon. On the Exchange, we talk with those leading the future of broadcast design, technology and content. Today I’m joined by Laura Herzing of Silver Spoon.

Silver Spoon is at the forefront of design and technology, creating unique extended reality and augmented reality experiences for CBS Sports, Fox and other broadcasters.

Dak: Thank you for joining me today. I suspect many listeners will not be familiar with Silver Spoon, but many will have seen your work. You have worked on some entertainment programs, and you’ve done a lot recently in the sports industry. So help introduce us to Silver Spoon and what you all do.

Laura: So in a nutshell, Silver Spoon creates real-time augmented reality and extended reality (XR) content through Unreal Engine. We got our start about six or seven years ago as a motion capture studio that was primarily serving other VFX studios and the commercial and entertainment industries. Through our work in motion capture, we became a very early adopter of Unreal and really quickly realized the power this program had, not just for real-time preview but for final pixel as well.

So in the last few years, we’ve invested heavily in our real-time pipeline, building a pipeline for using Unreal and real-time graphics in the final execution. At this point we’ve really expanded our services to offer everything from creative content development to integration and operation for real-time AR and virtual production across a very broad range of industries, including broadcast.

Dak: Yeah, you brought up the Unreal Engine. It is crazy how much it has swallowed up the industry in the past two, three or four years, where it was maybe one company using it and now suddenly it’s become the standard bearer. How does it fit into your overall growth, and where do you see it going from here?

Laura: Unreal Engine has been integral to the growth of our company. It’s really something we saw so much potential in when we started using it for that limited use, which was real-time preview during motion capture.

But as the program has become so much more advanced and offered easier integration of everything from realistic-looking humans to visual effects and motion graphics, it has really expanded the capabilities it can offer, and I think we’ve expanded the capabilities of the work we’re doing right along with it. So I think the growth patterns of the Unreal Engine and Silver Spoon have been very parallel.

Dak: Are you already taking advantage of Unreal Engine’s MetaHumans?

Laura: We have, yes. I think real-time, realistic digital doubles are something there’s just so much interest in, and we’re able to utilize a tool like MetaHumans to create generic digital doubles and, more specifically, real-time humans that look like a specific person.

That’s something we’ve just seen so much interest in, and yeah, we’re actively pursuing it.

Dak: Yeah. I mean, you know, Keanu Reeves has talked about this with The Matrix, and with some of these new advances it’s crazy to think that in the future, is the anchor you’re watching on the nightly news real or are they synthetic, and will we ever even know?

You know, in terms of the pipeline, do you see anybody catching Unreal, or have they just kind of run away with the game?

Laura: For our uses, I think Unreal is really the only option.

I know that Unity is well utilized in some similar industries, but not specifically in what we’re doing. I think it has more of a foothold in virtual reality. But for real-time broadcast AR and for virtual production, we’re really seeing that Unreal is the clear leader.

Dak: So one of the interesting things I’ve thought about with Silver Spoon is that you started on the visual effects side and have now moved into these other parts of broadcast, and it’s creating this unique synergy that not a lot of other firms have. Maybe they’re doing broadcast design work, or they’re doing virtual work, but you’re really bringing a lot of different design disciplines together in one package.

How do you think the industry is changing in terms of its thinking around these different disciplines and the way they interact?

Laura: I think it’s becoming so much more widely accepted to utilize real-time production tools and virtual production tools.

Before the pandemic, honestly, these tools were really relegated to big-budget films and big-name producers and broadcasters. The pandemic kind of forced the hand of a lot of producers and broadcasters to try something different and to be open to different ways of creating their content, and that has made Unreal more accessible and more widely accepted.

So if you think about traditional motion design, that was really limited to a 2D graphic on screen. Now those graphics can be fully 3D, they can be AR, and they can really feel like they’re living in the real world rather than just being applied on top of the screen.

If you think about visual effects, traditionally you shoot on green screen and then do very heavy compositing, adding all of your visual effects and environments in post. Using a virtual production pipeline, you can get so much more of that in camera. So with these tools readily available, not just for the highest-end productions but for more broadcasters and even commercial production, we’re seeing huge interest in it.

Even things where we’re maybe just doing a one- or two-day shoot can still utilize a lot of these technologies.

Dak: Has it taken a lot to convince your clients that it’s okay to do these effects in camera and not necessarily rely as much on post-production?

Laura: Yeah, I think with any new technology there’s a little bit of a learning curve and a period of uncertainty before it’s fully adopted, whether that’s for the client or for the people on set, the DPs, who may be unsure of using this workflow or uncertain of how exactly it can benefit them. But we’ve found that the more we’re able to just show it, go through the process and have everybody see how it works, the more naturally the acceptance and adoption comes out of that.

I think it’s just getting over that initial hurdle of “this is different, we’re not sure how it’s going to work,” and finding the right clients who are willing to just go for it, be an early adopter and try something that maybe they’re not a hundred percent familiar with.

Those are the types of projects that I think help push that acceptance for other brands and other broadcasters who maybe need to see that, okay, this has been done. Let’s try it on our stuff.

Dak: Yeah, you mentioned early adopters. Last year at the Super Bowl you obviously helped CBS Sports do some motion capture, and that’s something that hasn’t really been done at that level in national sports. It’s usually reserved for movies or something very specific and very calculated, and in this case you’re merging those different disciplines.

Laura: Yeah, the Super Bowl last year was an awesome project. We got to really elevate the visual design using AR graphics, real-time particle simulation, camera fly-throughs and things that just hadn’t been done before. And CBS has really been an awesome partner in working with us to do AR graphics, not only in their live in-venue broadcast but also bringing AR into their studio shows as well. You know, virtual sets are very commonplace now in sports broadcasting.

But last year we had the opportunity to work with CBS again on NCAA March Madness and actually bring these little real-time animated AR characters into the studio.

It just added another dimension to their studio broadcast for March Madness. Those types of projects, when you find the right clients and the right broadcasters who are willing to push the envelope and try something new, can be really great and can really add to the viewing experience.

I think our goal with AR is always to enhance, and not take away from, the intent of the viewing experience or the game that you’re watching.

Dak: Now, with many of these technologies, sports productions have usually led the way and brought them to the forefront before they filter down to news or other types of broadcasts.

Laura: Yeah. I think sports are well-funded, sports are fast-paced, and their broadcasts have a relatively short lifespan. So sports broadcasts have always been tech-forward and are more inclined to be early adopters and try something new. And because of the nature of their broadcasts, I think they feel a little more empowered to try things that other broadcasters or other platforms might see as risky.

I think a great example of that is what we did during the pandemic with Fox Sports for their virtual crowds. That was a completely unprecedented use of AR technology and real-time camera tracking in live broadcast sports to meet such a specific need that nobody could have seen coming. But when it was here, and we had that opportunity and really that challenge in front of us, Fox was the network that stepped up and said, yeah, let’s try this, let’s see how it works.

That’s led to innovation in a lot of other areas of AR broadcasting and sports broadcasting. I think it really broke the ice on the possibilities for a lot of different areas.

Dak: So what does something like the virtual fans lead to? Where does that take you?

Laura: I think the possibilities are endless. Directly out of that, you know, we’ve worked in sports. I already talked about the Super Bowl and March Madness, and I think both of those projects were kind of a direct outgrowth of what we did with Fox and the AR crowds.

That also kind of launched Silver Spoon specifically into the entertainment industry and into doing studio shows like Fox’s “Alter Ego,” which just finished airing. It was the first real-time AR singing competition, so all of the characters are animated in real time. There are real people backstage driving them through live motion capture, and then they’re composited onto the stage in real time.

It’s all happening in camera, and that is a natural progression of where we started with the AR crowds, taking it to a much more artistic execution. But all the principles of how we did it are the same; it’s just taking it to the next level.
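(For readers curious about the mechanics Herzing describes, here is a minimal, hypothetical sketch of that kind of loop: a performer’s pose is captured, retargeted onto the avatar rig, rendered and keyed over the live camera feed each frame. All function bodies and names below are placeholders invented for illustration; the show itself ran on Unreal Engine with dedicated motion capture and compositing systems that this sketch does not attempt to reproduce.)

```python
# Hypothetical real-time loop: mocap frame -> retarget -> render -> composite.
import time
from typing import Dict, Tuple

Pose = Dict[str, Tuple[float, float, float]]  # joint name -> rotation in degrees


def read_mocap_frame() -> Pose:
    """Placeholder for pulling the latest skeleton pose from a mocap stream."""
    return {"spine": (0.0, 2.0, 0.0), "head": (5.0, 0.0, 0.0)}


def retarget(pose: Pose, bone_map: Dict[str, str]) -> Pose:
    """Map the performer's joint names onto the avatar rig's bone names."""
    return {bone_map[joint]: rot for joint, rot in pose.items() if joint in bone_map}


def render_avatar(avatar_pose: Pose) -> str:
    """Placeholder for rendering the posed avatar with an alpha channel."""
    return f"avatar_render({len(avatar_pose)} bones)"


def composite(camera_frame: str, avatar_frame: str) -> str:
    """Placeholder for keying the rendered avatar over the live camera feed."""
    return f"{camera_frame} + {avatar_frame}"


if __name__ == "__main__":
    bone_map = {"spine": "avatar_spine", "head": "avatar_head"}
    for i in range(3):                              # three frames of the loop
        pose = read_mocap_frame()                   # performer backstage
        avatar_pose = retarget(pose, bone_map)      # drive the avatar rig
        print(composite(f"camera_frame_{i}", render_avatar(avatar_pose)))
        time.sleep(1 / 30)                          # hold a broadcast frame rate
```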

Dak: So talking about “Alter Ego” a little more specifically, tell us what goes into that production to allow that to happen each week, because it’s not like “The Masked Singer” where they just have to wear a costume.

Laura: In a workflow like this, pre-production is so important, because like you said, it’s not just putting on a costume. You’re creating these custom avatars, and the costumes they’re wearing need to be created ahead of time.

So what you actually saw air was just a small fraction of the wardrobe we had built for these avatars, because you don’t know from week to week which one of them is going to advance. Planning and pre-production is a huge part of it. And then having time on site to really integrate systems is just so important, especially since AR and motion capture are not part of a typical broadcast pipeline, right?

So we were fully integrating into the setup for a broadcast TV show where we’re moving very quickly, the shoot days are close together and there’s not a lot of time to change things on the fly. Having the workflows worked out and prepared ahead of time allowed us to meet that demand and keep up with that rigorous reality show shooting schedule.

And yeah, it was important for everybody to be on the same page with this from the start, and also for Fox as a network to be open to adding this whole new dimension to their workflow and working through the stumbling blocks that are naturally there when you’re putting something completely new into your package.

Dak: So for a show like that, are the textures pre-baked in for these avatars, or is it picking up the lighting from the environment they’re in? How does that work?

Laura: The textures are baked in. We had a whole team of wardrobe artists creating the clothing that the avatars are wearing.

So the avatars that are in engine are fully textured, and then the lighting is happening in real time. We were working with Lulu AR, the vendor who handled the AR composite, and they had a DMX operator controlling the virtual lighting to match the real-world lighting, and all of that is happening in real time.

So for every performance and every interview, the lighting scenario is unique, and that’s being controlled by a person, an operator, who’s making the virtual world realistically integrate with the practical world that we’re seeing on stage.
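(As an illustration of how a DMX feed can drive virtual lighting in real time, here is a minimal, hypothetical Python sketch: a four-channel fixture, dimmer plus RGB, is read from a simulated DMX frame, normalized and applied to a stand-in engine light. The channel layout, the `VirtualLight` class and the `dmx_to_light` helper are assumptions made for this example, not the actual Unreal or Lulu AR implementation described above.)

```python
# Hypothetical mapping of DMX channel data onto a virtual light so the
# in-engine light tracks whatever the lighting operator does on the console.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class VirtualLight:
    """Stand-in for a real-time engine light (e.g., an Unreal light actor)."""
    name: str
    intensity: float = 0.0                                # normalized 0.0-1.0
    color: Tuple[float, float, float] = (1.0, 1.0, 1.0)   # linear RGB


def dmx_to_light(frame: List[int], start_channel: int) -> Tuple[float, Tuple[float, float, float]]:
    """Convert a 4-channel fixture (dimmer, R, G, B) from 8-bit DMX values.

    DMX channels run 0-255; channel numbering is 1-based, as on most consoles.
    """
    base = start_channel - 1
    dimmer, r, g, b = (frame[base + i] / 255.0 for i in range(4))
    return dimmer, (r, g, b)


if __name__ == "__main__":
    dmx_frame = [0] * 512                        # one simulated DMX universe
    dmx_frame[9:13] = [153, 255, 200, 150]       # fixture at channel 10: warm, ~60%

    key_light = VirtualLight("virtual_key_light")
    key_light.intensity, key_light.color = dmx_to_light(dmx_frame, start_channel=10)

    # Re-running this mapping on every incoming frame keeps the virtual light
    # in step with the practical lighting rig in real time.
    print(key_light)
```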

Dak: I had another interview recently about one of the studios for the Olympics, and they brought up that they think in the future there will be a kind of virtual lighting designer assigned to all of these types of projects, because in the past the scenic designer, whoever was modeling it, would just kind of decide what the lighting was going to look like. But now, to get that cohesion and that realism, you have to layer it in so much.

Laura: Yeah, a hundred percent. I think that is an essential role if you’re working with a mixture of practical and virtual light.

Just like you need a lighting designer to do the practical lighting on your set, you need a lighting designer to do your virtual lighting and to make sure those two systems are working in tandem and are cohesive, so that whatever AR you’re trying to put into the real world looks naturally integrated. Otherwise, you’re going to end up with a result the eye catches, and that breaks the illusion, unless it is very closely matched.

Dak: Where do you think this all takes us? It’s been a very fast progression, and now we’re seeing it in the entertainment sphere on these kinds of quick-turn shows. Where does this lead us?

How does this impact future storytelling?

Laura: I think you’ll see a lot more of it in reality television, and I think you’ll see a lot more of it in scripted television. The idea of having your real life, your in-person life, and your metaverse life or your online life is becoming so commonplace and so mainstream, and I think we’ll see a lot more of that type of storytelling in narrative films and movies. The production of that lends itself very well to what we’re doing; we’re basically creating that metaverse, that online persona. So yeah, I think that’s definitely something you’ll see.

I think too, like you were saying at the top of the call, you’ll see virtual people and avatars in places where you might expect a human, whether that’s a host or a newscaster. We’ve seen a bunch of virtual influencers already take hold and really build fan bases. So I think we’re not that far away from having either realistic or stylized avatars coming into our lives and being accepted in ways humans would have been, or in roles humans held previously.

Dak: Yeah. It’s a topic for a different day, but obviously when you start synthesizing real people it raises a lot of ethical questions. That’s for the journalism folks to figure out, though, not us here.

Laura: Let’s just say we’re talking about stylized avatars only. We’re not talking about deepfakes.

Dak: What obstacles do you see right now that are still there? Is it compute power? Is it the fact that you can’t get an Nvidia GPU?

Laura: Yeah, very tactically, supply chain issues are a challenge. It’s difficult to get some of the hardware that we need, or that anybody would need, to do this.

But those are solvable problems, and I think they’re things that will resolve themselves. The biggest stumbling block I see is just acceptance, and I think those walls are starting to come down.

The more successful and prominent the uses of real-time technology become, the more it’s going to cascade into interest, not just in broadcast but across so many different industries: stage and theater, experiential installations, museums, music videos. There are so many different ways this technology is used and can continue to be used that once that challenge of acceptance is overcome, the floodgates will open.

Dak: So where are you looking for inspiration today as you prepare these future experiences?

Laura: That’s a good question. I think inspiration can really be found everywhere, seeing what other people are doing with the technology. I love just trying to watch TV or videos or movies and figure out how each shot was made.

I think anybody who is in production or in the industry probably struggles with that, or you get a little caught up in the details. So looking around and taking inspiration from what other folks are doing is always a great place to start. And personally, sometimes I just like to walk away from it too.

I find that sometimes the best ideas come when I’m out for a walk or cooking or doing something completely unrelated to screens, real-time content and the industry.

Giving your mind and yourself the space to absorb everything you’re taking in all day means you come back to your work with a new purpose and a fresh mind, and new ideas can emerge when you step away from it. Especially now, more than ever, I think we all need to get outside a little bit more.

Dak: What’s next for Silver Spoon? Where are you all headed?

Laura: We are really looking into the future. We’ve had such tremendous growth over the last couple of years, and like I’ve been saying, there are just so many different industries where I think real-time content and these technologies are applicable. But specifically, we are just about to open our XR stage in Nyack, New York.

So we are building a volume of our own, taking all of our learnings in virtual production and creating a space where we can shoot and continue to do R&D projects there. We’re also really getting focused, like I mentioned earlier, on large-scale installations, whether that’s projection mapping or using LEDs, taking real-time content out of the screen and into real life, into venues and experiential spaces where people can interact with things in real time and get that in-person experience we’ve seen on screen.

Hopefully being able to do things in real life and in person becomes more open and available to us. We keep saying that, but I think we’ll be right there and ready for it when that time comes.

And then, yeah, just continuing to expand our footprint in broadcast, entertainment and films. We’ve had some really awesome partners in both the sports and entertainment worlds who have accepted the challenges of doing things for the first time but have also reaped some of the benefits of being early adopters, able to try things for the first time.

So I think that, again, as we get over that hurdle of acceptance, those industries are just wide open for us.

Dak: On the live event side, there’s obviously going to be a lot of pent-up demand that will eventually come cascading back to a lot of these types of venues and settings.

It’s just a question of when. You know, in the UK they said we have six more years ahead of us, so hopefully that is not the case for the pandemic world. But thank you so much for taking some time out of your day to talk about where you all are headed and what you’re up to. We’ll be on the lookout for more.

Laura: Awesome. Thank you so much.

Dak: Thanks for joining us today on the Broadcast Exchange.