Broadcast Exchange: The evolution of robotic camera systems in broadcast
This season on the Broadcast Exchange we’ve explored the merging of broadcast and film production tools, such as through digital avatars and volumetric capture. A key piece of this merging is advancements in robotic camera systems.
While robotic cameras have been in broadcast facilities for decades, the Mark Roberts Motion Control team is pushing boundaries in storytelling, bringing technology tested in Hollywood to broadcast.
Paddy Taylor, head of broadcast at Mark Roberts Motion Control, recently joined the Exchange to talk about this merging of production toolsets, how AI and machine learning are helping advance studio robotics and how the pandemic has shifted perspectives on the technology.
The transcript below has been lightly edited for clarity.
Dak: Welcome to the Broadcast Exchange from NewscastStudio. I’m Dak Dylan. On the Exchange we talk to those leading the future of broadcast design, technology and content. Today I’m joined by Paddy Taylor, head of broadcast at Mark Roberts Motion Control. While robotic cameras have been in broadcast facilities for decades, the team at Mark Roberts Motion Control is pushing boundaries in storytelling, bringing technology tested in Hollywood to broadcast. Thank you, Paddy, for joining me today. Mark Roberts Motion Control was largely unheard of for a lot of broadcasters only four or five years ago. So how did you all get to where you are today?
Paddy: It’s been an interesting journey. I’ve only been with the company for a couple of years. There was a deliberate focus put on creating specialist solutions for broadcast maybe seven or eight years ago. There’s a lot of experience within the company in terms of delivering broadcast solutions, in particular in specialist workflows around motion control. So defining broadcast is always one of the challenges that we have. We’re known for motion control, particularly for film, music videos, commercials, that type of thing. So establishing what we mean by broadcast, even just internally, is quite important. But take something like the AFC head, which is a fairly simple robotic pan-tilt head, but it’s extremely fast, extremely accurate, and quite small and lightweight. It’s actually used by broadcasters in live sports every single day, and there are hundreds of them deployed for that. Particularly things like the Australian Open, which is taking place as we talk at the moment. There’ll be a number of heads being used in those sorts of environments.
Paddy: So broadcast is not totally new to us. If we start to define broadcast as meaning studios, then that’s something that Mark Roberts has been less involved with, in particular because they’ve been focused on the motion control side and delivering solutions that are highly repeatable and very high performance, but aren’t necessarily suitable for on-air broadcasting. For example, our very popular Bolt robot is not at all quiet. It’s not something that any floor manager is going to accept in their studio. But we developed those same robotic solutions, and other robotic solutions alongside them, to be much more applicable to the broadcast market. So very smooth, very quiet, very stable. And of course, critically, in the same way as the motion control robots, extremely reliable.
Paddy: And we’re finding customers now moving towards those sorts of workflows, saying, partly because of COVID but not exclusively, we want to automate more and more of our camera movement. We want what a jib or crane gives us. We want something interesting, something dynamic. But we can’t afford, and it’s not so much cost, we can’t afford to have more staff and resources in the studio. And in that case, you have to automate what you’re trying to do. In conjunction with that, happening at the same time, there’s been this move from more traditional studios through virtual chroma key sets into the LED volume stage. And that’s something that we understand extremely well. We do a lot in the volumetric world, capturing content for display in AR and other similar types of environments.
Paddy: And often we are capturing content that we are then interacting with, because we are capturing the content in our volumetric capture stage system to then create an avatar or some form of character that’s going to be used in that live LED volume as part of the broadcast. And on the other side of it, we’re providing the movement to capture that. So we understand that extremely well. And robotic arms are perfect for it, as are other robotic solutions. But in particular, our understanding of those workflows and the encoding capability that we have within the system gives us something quite unique. And if someone’s trying to create a marquee, top-tier studio using LED volumes, then a robotic camera arm is a perfect solution to combine with that, because it gives you motion that you can’t achieve any other way.
Dak: Nikon acquired Mark Roberts Motion Control. How has that supercharged your research and development, or put you all on a different path?
Paddy: Yeah, it’s made a huge difference. Some of the things that we are becoming known for are our automated tracking solutions. We have a machine vision system that uses limb detection. People throw around terms like AI and machine learning all the time, and they have a place in the conversation, but in reality what people are looking for is smart software, software that automates workflows. Mark Roberts was actually looking at those solutions before Nikon acquired them, and was doing it in different ways. And having Nikon as a partner, as a parent company, has enabled us to look into Nikon’s portfolio of solutions. So we have a number of products that actually leverage some of the advanced sensor and imaging technology that Nikon has, but also develop it in different ways.
Paddy: One of the areas we’ve put a lot of focus on over the last couple of years has been sports automation, and that’s using a combination of different tracking analysis to automate camera movements. We have a contract with the DFL, for example, to do that. And that development, and the speed of that development, has been enabled by having Nikon as a partner. They open doors that we would struggle to open without their support. And certainly the R&D team that we have, I don’t think it’d be as large as it is without the investment that they’ve made. Recently this has really started to bear fruit, because we are now finding ourselves in things like the DFL and other broadcast applications which Mark Roberts wouldn’t even have been considered for three or four years ago.
Dak: What is your biggest market right now? Is it the entertainment side, or is broadcast slowly catching up?
Paddy: Broadcast is catching up. I mean, what we call film and commercial, the traditional motion control, is definitely still a large part of our business. And it’s still growing, actually growing quite significantly. It’s a running joke that I’m catching up, but they’re also accelerating, so I’ve got to accelerate a little bit harder to try and grow the broadcast side of the business to keep up. So both sides of the business are in extremely good health. And in addition, our volumetric business is quite a new part of what we’ve been offering. It is absolutely cutting edge. There are very few systems like it. I don’t believe there’s another 4K mobile volumetric stage available, certainly in Europe, and I’m not sure about elsewhere. We are very much at the cutting edge of developing these new systems.
Paddy: And that’s something broadcasters in particular take notice of. We are delivering solutions across the board, at all ends of the delivery chain. We could be providing the remote camera heads for a live sports broadcast. We could provide the robotics that are being used for the actual studio operations, at the same time as doing some AR capture as part of that, creating the avatars for the actual sportspeople involved, or as part of the sponsorship and branding associated with it. So all of that is moving really, really quickly, and it’s great to be part of a company where every part of the business is firing on all cylinders.
Dak: And you hit on this a little bit, but how are those technologies that were once reserved for the cinema world now filtering down to where they are being adopted in the sports world or the news world?
Paddy: Yeah. We regularly get conversations started by a customer saying, I have seen The Mandalorian. Until about a year ago, that was always the one: I’ve seen The Mandalorian, and we’re really interested in doing what they did on The Mandalorian in a new studio. And we say, okay, I think you need to unpack that a bit. So actually what you want is an LED volume. You want to have some interesting camera motion, and you want to do something that’s more forward-looking and much more interactive from the perspective of the talent and presenters in the space than your traditional news studio might be. Okay, well, yes, we can absolutely help you with that. And in that, yes, you’re definitely seeing a transition from people looking at our robots as maybe a gimmick to looking at them as a solution. As I said before, you’ve got this idea that a robot is 100% automated.
Paddy: You tell a robot to do something, it will do it again and again and again and again and again till we tell it to stop. And we try to create special effects, that’s absolutely critical. But also we find broadcasters want the same level of reliability. They want to have a particular intro and extra shot, or a particular transition shot from one section of the show to another, and they want it to be the same. And what we’re finding now is broadcasters are looking to do that cross geography. So on a simple level they might have multiple studios in a news or sports environment and want to go from one regional studio to another and a handover maybe from the national news to the local news, and they want that to be a cohesive sequence. So the sequence starts in one robot and then completes on another. Or maybe it’s an opposite, maybe one robot is moving in one direction and then another one starts in the other as you do the transition with the two locations.
Paddy: But we’re also seeing customers going a step further and saying, actually, we want to have a complete virtual world. We want two people in two different locations to appear as if they’re in the same location. And we can set the robots up so they’re aware of each other. They can be mapped in virtual space as well as in the real space. They could be linked to a disguise or Mo-Sys server, or whatever the chosen solution is for the virtual LED world that they’re going to be physically within. And you can have those two systems work perfectly in parallel. You need a low-latency connection, of course, between those locations. But essentially it’s entirely seamless. You can have two people in the exact same set but in two completely different geographies, and you’d be none the wiser.
Dak: On the more traditional broadcast side as well, you’ve seen your technology starting to be adopted, whether it’s ZDF in Germany or the new Al Jazeera Arabic studios, where you have the StudioBots powering their digital production and doing a lot of those fancy camera moves for content that maybe hasn’t traditionally seen them.
Paddy: Yeah, absolutely. ZDF is a really interesting one, because ZDF’s is actually a more traditional motion control robot. When you look at it in the context of what most people would consider to be broadcast camera movement, which might be a track system, a jib or a crane, you can’t really explain it in those terms. It doesn’t work that way. What it provides is a very stable pivoting movement on a short section of track; it could be a longer section of track. It doesn’t have any lateral movement in itself. You use the camera zoom and that track to create that type of movement. But what it gives you is something that’s incredibly smooth and stable. Nothing else that exists out there comes close to the capabilities of this sort of system. And in turn, it also gives you the ability to be incredibly accurate with that smoothness.
Paddy: So you can put a fairly significant payload, a fairly large camera and lens, on the end of it. You can create a track between different sections of the studio, so it can move between the two, or simply be used on one side of the track for one activity and the other side for another. But it’s creating motion that you wouldn’t be able to achieve any other way. Al Jazeera is a more traditional style of arm. It’s an arm that’s specific to broadcast, what we call the StudioBot XL. It has a specific modification where the final pan and tilt axis is separated away from the arm. One of the complexities of the arm is that you have all these axes, and the software that we use to control them basically allows the user just to track the subject.
Paddy: So you use the joystick to follow somebody. We could automate it, and we can use presets, of course. But if you decide to joystick and follow the subject, the software determines which axes are best to move to create the movement you’re trying to achieve to track your subject. In the case of the StudioBot XL, it has this final axis, which is just a simple pan-tilt. It’s slung from the main arm of the robot. And the logic there is that it gives you finer control of just those individual elements. It can be used in conjunction with the automated tracking software, Polymotion Chat, for example. But additionally, it makes things smoother and quieter. So if you’re doing simple moves, you’re able to do that with very little noise; robots are never perfectly quiet, but they are surprisingly quiet.
Paddy: But with that final axis, you can just move that one axis and you don’t hear anything. And in turn, it also accommodates the prompter, because one of the other challenges with an arm like this is the prompter. If you’re looking to make the robot pan down, there’s a chance that the prompter would hit the arm. By hanging the camera mount underneath the arm, you don’t have that issue. So these are things that we developed over time specifically for these broadcast scenarios, where we understand customers have problems that they need solutions to. The other really interesting thing with something like the StudioBot XL is that, though in a broadcast environment it’s typically designed to be quiet and smooth and generally quite slow, we have customers that also use them in a hybrid way. So the majority of the time they’re doing just that, creating some nice gentle movement, cutting between points in the studio, moving between a transition area or something like that, the marquee points of the broadcast. But there’s nothing to stop them being used for more traditional motion control.
Paddy: We’ve got a major broadcaster in the US that uses them for things like whip pans. So they’re doing stings and other sorts of promos, something that’s going to go through post. It’s not something you do live off the camera on the robot as you might in your more traditional news broadcast workflow. But if your robot’s only being used for three hours a day, there’s a lot of time in the day when that robot could be used for something else. And that’s the sort of flexibility these systems give you; they enable you to do some brilliant other smart production. And because it’s fully automated, once you’ve trained an operator, you can set the presets up, and then it’s just a case of launching into the sequence move. It will create those shots for you again and again and again, making it a really flexible solution for a whole range of different broadcast scenarios.
Dak: Is that one of the biggest challenges, helping them understand the return on investment, since there is obviously a bit more that has to be spent upfront to get these new uses they haven’t thought about out of it?
Paddy: Sometimes customers come already thinking that way, saying, okay, actually one of the things that interests us about this is the ability to use it in other areas. I was talking to a customer last week that wants to use it for a sports production studio, but they also run a game show in the same studio. And they see some interesting opportunities there, because they typically use a crane for the game show. For the dramatic moments, they think the idea of being able to suddenly make the robot move extremely quickly to a point might be quite exciting and add tension and drama. So there are lots of things like that, where the customer often has quite a clear idea of what they want to do, but they’re not sure how to do it. And then on the flip side, yes, you definitely get customers that come and say, well, we want this.
Paddy: Sometimes it is a cost-saving thing. It is a decision that’s being made around, we can’t afford to use the crane or the jib all day. We use it in our marquee shows: we’re using it in our breakfast morning show, we’re using it on our evening news show, we might use it in one of the current affairs pieces that we have at some point over the weekend. But we aren’t using it in all the other news bulletins, because we can’t afford to have a jib operator or a crane operator on shift. So it’s not usually the case that the system is designed to replace people; it’s designed to augment them. It’s designed to enable the broadcaster to do more. And that’s a regular conversation that we have, for sure, when talking about pricing and the like. They’re really not as expensive as people often think.
Paddy: And actually, we’ve had a few scenarios with customers where the automation elements and the enablement it gives them have made the robot cost effective within even 12 months. Usually it’s a little bit longer than that, but it depends very much on the nature of the broadcast. And certainly if it were a 24/7 broadcast, then with the cost savings and advantages of a robot, and also things like the software automation systems, they’d pay for themselves within a few months. We’re definitely finding customers now are much more aware of that, much more thoughtful about the investment they’re making. There was a sort of knee-jerk reaction at the beginning of COVID saying, we need to do remote production. I’m sure there was a huge number of PTZ cameras and pan-tilt heads sold by most manufacturers at that point. But I think we’ve moved a bit beyond that now, and we’re looking at sustainable, quality workflows for the future, and in particular really creative workflows. And yes, with true robotics, our robotic arms, they pay for themselves surprisingly quickly.
Dak: The problem you’d run into, at least here in the States, is most of these studio floors aren’t that level. So that presents some challenges for some of these systems. What other challenges do you see then for implementing in those more traditional broadcast sectors?
Paddy: Yeah, it’s interesting. That has been a pain point many times in my life, not having level studio floors. For robotic arms, actually, that’s not too much of a problem; you can build in compensation really easily. We’ve seen a bit of a shift now as well, where broadcasters are moving to having full LED volumes. They might even have an LED floor, and that isn’t something that’s hugely compatible with other forms of robotics. If you want a track robot, that’s a problem: you can’t run the track through the LED. If you’ve got a free-roaming pedestal, will it be supported? Will it damage the LED floor that it’s being run on? So in that case, having a jib or crane or robotic arm, something that can move into that space, is extremely valuable.
Paddy: But then there are lots of issues that people need to consider. Safety is an important one. The robots are great, they’re fully automated. You tell them where to go and they’re always going to go there. But you can’t always predict what somebody else is going to do. So we have a number of different safety tools built into the system, including laser scanners and the like, which will detect when people move into the space. The biggest problem generally is expectation and perception. It’s understanding what the robot can do well and what may be a limitation for it. Given the space, lots of customers think they can use maybe a StudioBot, the smaller version, in a really big space. But when they come to try to deploy it, it’s simply not what they need. We’re also finding more and more customers now wanting to mount them on the ceiling.
Paddy: And that’s fine, we’ve done that before. And then they want to put it on a seating track and it’s, well, okay, yes, we can do that. That’s not an issue. But you are putting a few hundred kilos onto your ceiling. So you then got to account for that. So there’s incomplexity there in making sure customer understanding is where it needs to be. And then the challenges is really just come from users. One of the things I think that surprises people is how easy they are to operate. Everyone thinks you’ve got to potentially nine axis, and those at the time when you joystick the robot, the software will take account of that. If you want it to follow somebody and you had track, you would move the robot along the track, because that’s the obvious way of maintaining the smoothest framing that you’re going to get. And people think, well, surely I’ve got different joysticks for all the different axis. And well, you can do that, but actually there’s much smarter way. So usually the biggest challenge is getting over the perceptions of individuals, rather than anything that’s majorly technical.
Dak: And then in terms of the product roadmap or pipeline, are you all considering having a more traditional robotic pedestal that would compete with some of the others in the US market?
Paddy: Not at the moment. One of the things that we’re really keen on is delivering different workflows from other vendors, going about it a different way. I mean, when you get into it, you could probably achieve 70, 75% of the shots that a robotic arm could achieve with a good enough pedestal. Maybe not at the same speed, but roughly the same sorts of shots. And if you then drew a Venn diagram and put in track-based robotics, you could again achieve a greater percentage still. And there will be the odd shot, for example, that they can achieve that maybe the robot arm can’t. So when you look at it, you have those choices. From our perspective, the robotic arm enables you to do more than any other robotic solution can in terms of beauty shots.
Paddy: Again, not necessarily with the same level of flexibility as having a pedestal that can go to any point of the studio, as long as your studio floor will accept it. But fundamentally, if you’re talking about something creative, then it’s about combining it with those other robotics. And we have track-based robotics. We have floor track and ceiling track solutions on which you can put a very substantial payload. We have some really simple track-based robotics designed for just a simple head like our AFC or a PTZ camera, and a lot of customers use those with a PTZ. In fact, with that system you can have the whole track ascend and descend. Initially we developed a system for a PTZ on a column that just rises up and down, or descends from the ceiling, for example, and then that was extended to actually moving the entire rail down. And all of the vertical movement, and the movement of the PTZ head along the track, is smooth enough to use on air.
Paddy: And actually, you change the conversation to: forget the product, what’s the shot that you want? Here is the through-the-lens footage that’s achievable with this system. As opposed to, I’ve got this idea of this product that’s going to do X and Y. When we start getting into those sorts of creative conversations, you get a completely different perspective from the customer. They forget about the product they came in talking about, and we start looking at the sorts of shots they’re already achieving, what’s wrong with those shots, or whether they want to get more from the studio. Maybe they want to do something different in that space, either in the particular show that they’re running, or they want to run different shows from it with the flexibility that the arm gives them. So for us at the moment, I think we are adding to the robotic arms that we have.
Paddy: So we have the StudioBot XL today, and there will be another arm added to that range shortly. We’ve also just delivered a robotic jib to a customer, a customer that initially approached us in the classic way, saying, we’re very interested in a Bolt arm for a particular scenario. And actually, when we broke it down with them, we could have sold them that. We could have sold it to them and charged them a lot more for it; that was what they requested, and sometimes you’ve got to think the customer’s always right. But actually in this case, there was a much better way of doing it. We had a robotic jib solution as part of the portfolio that had been developed, so we decided to deploy that in this case. That’s something that’s not officially productized yet, but will be very shortly.
Paddy: In particular, we also want to be open on the software side. We support pretty much all third-party PTZ cameras. We support the VISCA protocol over IP. We support the NDI control protocol. And we’re really open to trying to work with other third-party manufacturers of pan-tilt heads, pedestal robotics and track robotics to integrate with our system for control. Polymotion Chat’s tracking tools are now, we believe, far ahead of anybody else’s capability, because we use this limb detection model. And because we already support third-party PTZ cameras, we think it makes sense to try to support all robotics and have a sort of open ecosystem and architecture. We aren’t particularly protective on our side, but we do find that other manufacturers are less open to that. I think if we were to start competing in the free-roaming pedestal market, then that would change the discussions we’re trying to have with other manufacturers about the openness of software tools.
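As a concrete illustration of the VISCA-over-IP control path Paddy mentions, here is a minimal Python sketch that frames a standard VISCA Pan-tiltDrive command in Sony’s VISCA-over-IP UDP envelope. This is our own illustration under stated assumptions, not MRMC’s software; the camera address is hypothetical, and real deployments would also handle acknowledgements and sequence resets.

```python
import socket
import struct

def visca_ip_packet(seq: int, visca_cmd: bytes) -> bytes:
    # Sony VISCA-over-IP framing: 16-bit payload type (0x0100 = VISCA command),
    # 16-bit payload length, 32-bit sequence number, then the raw VISCA bytes.
    return struct.pack(">HHI", 0x0100, len(visca_cmd), seq) + visca_cmd

def pan_tilt_drive(pan_speed: int, tilt_speed: int, pan_dir: int, tilt_dir: int) -> bytes:
    # VISCA Pan-tiltDrive command: 8x 01 06 01 VV WW PP TT FF
    # pan_dir/tilt_dir: 0x01 = left/up, 0x02 = right/down, 0x03 = stop.
    return bytes([0x81, 0x01, 0x06, 0x01, pan_speed, tilt_speed, pan_dir, tilt_dir, 0xFF])

if __name__ == "__main__":
    CAMERA_IP = "192.168.0.100"  # hypothetical camera address
    VISCA_PORT = 52381           # Sony's default VISCA-over-IP UDP port
    cmd = pan_tilt_drive(0x08, 0x08, 0x02, 0x03)  # pan right at speed 8, no tilt
    pkt = visca_ip_packet(1, cmd)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(pkt, (CAMERA_IP, VISCA_PORT))  # fire-and-forget UDP send
    except OSError:
        pass  # no route to the hypothetical camera in this environment
```

The same framing carries inquiries and replies; a tracking system like the one described would simply emit a stream of these drive commands at the rate the tracker updates.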
Paddy: So essentially a vendor can work with other vendors. When a broadcaster says, we want to combine your robotic arm with a free-roaming pedestal from another manufacturer, that can be achieved. We have that at Al Jazeera to an extent. There’s Ross robotics there, and I think there’s also Vinten robotics in other studios. In the particular studio with the robotic arm, they have the Ross pedestals. And we can be aware of those Ross pedestals; we can make our software and the robot know where the pedestal is by using the positional data coming from the pedestal. But at the moment, we’re not able to actually control those. And that’s something we would like to see in the future: more interoperability between vendors, a bit more openness to enable that.
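The positional data Paddy describes, where one vendor’s system is made aware of another’s pedestal, is commonly exchanged using the FreeD camera tracking protocol, the same kind of feed a disguise or Mo-Sys server consumes. The sketch below is a hedged illustration of encoding and decoding FreeD’s 29-byte "Type D1" position message, not MRMC’s or Ross’s actual implementation; field scalings follow the published FreeD format (angles with 15 fractional bits, positions in 1/64 mm).

```python
def _s24(b: bytes) -> int:
    # Decode a 24-bit two's-complement big-endian value.
    v = int.from_bytes(b, "big")
    return v - (1 << 24) if v & 0x800000 else v

def _checksum(body: bytes) -> int:
    # FreeD checksum: subtract every byte from 0x40, modulo 256.
    chk = 0x40
    for b in body:
        chk = (chk - b) & 0xFF
    return chk

def build_freed_d1(camera_id, pan, tilt, roll, x, y, z, zoom=0, focus=0) -> bytes:
    # Build a 29-byte FreeD D1 packet: 0xD1, camera ID, three angles,
    # three positions, zoom, focus, two spare bytes, checksum.
    body = bytes([0xD1, camera_id])
    for ang in (pan, tilt, roll):
        body += (int(round(ang * 32768)) & 0xFFFFFF).to_bytes(3, "big")
    for pos in (x, y, z):
        body += (int(round(pos * 64)) & 0xFFFFFF).to_bytes(3, "big")
    body += zoom.to_bytes(3, "big") + focus.to_bytes(3, "big") + b"\x00\x00"
    return body + bytes([_checksum(body)])

def parse_freed_d1(pkt: bytes) -> dict:
    assert len(pkt) == 29 and pkt[0] == 0xD1, "not a FreeD D1 packet"
    assert _checksum(pkt[:28]) == pkt[28], "checksum mismatch"
    return {
        "camera_id": pkt[1],
        "pan_deg":  _s24(pkt[2:5]) / 32768.0,   # angles: degrees, 15 fractional bits
        "tilt_deg": _s24(pkt[5:8]) / 32768.0,
        "roll_deg": _s24(pkt[8:11]) / 32768.0,
        "x_mm": _s24(pkt[11:14]) / 64.0,        # positions: 1/64 mm units
        "y_mm": _s24(pkt[14:17]) / 64.0,
        "z_mm": _s24(pkt[17:20]) / 64.0,
        "zoom":  int.from_bytes(pkt[20:23], "big"),
        "focus": int.from_bytes(pkt[23:26], "big"),
    }
```

Consuming a stream of packets like this is what lets one vendor’s robot "know where the pedestal is" without being able to command it, which is exactly the one-way awareness described above.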
Dak: I was going to say, the big dream of the IP production world is that it will all be interoperable and everyone will be able to work together. However, with everyone moving to their siloed cloud services, we haven’t quite gotten there yet, even though as you move things to the cloud, it becomes much easier for them to work together and talk to each other. So is that the biggest challenge you see on the software side, just making these disparate pieces all come together, or is there some other problem?
Paddy: Generally you don’t find cloud architectures being used in the control of robotics, first because of the latency required. You tend to find the control is managed locally. What we do find, for example with Polymotion Chat, is that you can have a pretty good hybrid setup where you might have somebody controlling remotely. And if you’ve got extremely low latency, that’s not an issue. We have an IP pan bar, for example, where customers are looking to deploy it across different geographies, controlling a head in a completely different location. But you must have an extremely low-latency connection from point to point. Polymotion Chat can sit in between and do the heavy lifting, so it will do the auto tracking and then you can manually make the changes. So you can create workflows like that.
Paddy: The problem generally is more one of openness in terms of protocols. In the PTZ market, you’ll find that although everyone competes, they tend to use standard protocols, VISCA over IP being the most common one. It’s not an ideal protocol; it lacks some finesse in terms of what we want to be able to do as a robotics manufacturer. That’s not just Mark Roberts; I’m sure other manufacturers like Vinten and Ross and Shotoku would say the same thing. But fundamentally, what we find from those vendors is they’re not interested in providing access to their units for control. Probably for commercial reasons, maybe for technical reasons; I couldn’t actually speak to that. But that’s where you struggle, and there aren’t really any open standards in that regard. Over time, people have moved a bit away from best-of-breed, where you’re buying one software solution for one part and a hardware solution from another, and consolidated into end-to-end workflows from one vendor. And that makes a lot of sense, certainly in the idea of having one throat to choke, as they say.
Paddy: But I think there is a kind of desire and somewhat of a move to having a little bit of flexibility. Accepting that there are things that one vendor will do better than another, and therefore being able to combine systems. So I think it’s improved. And I think the more customers that want to do something different with robotics, then the more conversations that we have and the more likely we get engagement. But with IP, this should be the perfect time, just as you were saying, we should be able to make… The connectors are the same, the protocol is the same, fundamentally in terms of the way the data is moved around and obviously the actual protocol in terms of how the devices communicate is different. But we should be able to just make that small step for interoperability, which of course would benefit the end customer. But it’s really just down to willingness. I think it’s nothing technical that’s stopping that. It’s just people saying, you know what, I’m prepared to give this a go and try it.
Dak: It seems like on the Hollywood side they are quickly coalescing around some standards as more LED volumes are built, quickly establishing best practices, trade groups, all the normal procedures you would see to make sure that, whether you’re producing something in Canada or in Hollywood or wherever, it’s all the same tools, the same way of thinking about it.
Paddy: Yeah, I think that’s a really good characterization. And it is led by the customers. I think this is a big difference between Hollywood, cinema and the like, and broadcast. Broadcast tends to be really isolated; there isn’t that sense of people moving around. A DOP will be working for one studio with one director in one location for one quarter, and then they’ll be working on a completely different setup, probably with similar tools but not with the same people, and they’ll be trying to solve different problems. And other people will come into that conversation. So there’s a lot more engagement about the best way to solve a problem, or do something new, or change or revolutionize or just simply evolve what people are trying to do. And that conversation doesn’t happen as much in broadcasting.
Paddy: On one hand that makes sense, because broadcast is much more about consistent reliability and performance. But I think sometimes we lose out a little bit by not being as engaged in that. At the same time, we're quite lucky. We do have customers come to us, a bit like I was describing, with these random ideas, saying: we want to copy what The Mandalorian did, and we've got this mad idea of having two robotic arms at different locations with the same content in both, making sure the pan axis and everything is matched between the two different camera sources. And yes, we can do that, and it's great that you have the vision to think about it. But in that situation we can deliver that workflow for a customer, we can deliver it across multiple heads and multiple cameras, but we'd have to own the entirety of the ecosystem.
Paddy: We obviously talk to the system that's powering the display. But if somebody said, actually, I really prefer this robotic head from another manufacturer: you guys have got the workflow, you've got the tracking capabilities, you've got the software, the robotic arm, but there's one little piece of the puzzle where we want to use this other product because we like it and think it's better than yours for this particular application. It's got a better payload, for example. And we fully hold our hands up and accept that; it's always going to be the case that people have better products for some things. Suddenly the whole workflow falls apart. It either has to be entirely a silo in itself, or you have to say, look, you're going to have to make a compromise. And it's that compromise which Hollywood typically isn't prepared to make, but a broadcaster has to on a regular basis.
Dak: So in terms of the installations you’ve seen with your products, which ones are the most creative and get you the most excited about what’s possible?
Paddy: I mean, ZDF, which you brought up, is a really interesting one, because they're just doing something that nobody else has ever done before. For anyone who's watching this, ZDF did a really good 360 of their studio, which will give you an amazing overview of what they've managed to achieve, and not in a huge space either. That's one of the other things. So that's one that's really exciting. We've got a couple of other very interesting installations taking place at the moment in North America, which I'm unfortunately unable to talk about specifically, but with the timing of this interview coming out, I'm sure there'll be some PR on our website about them. Again, they're doing almost crazy workflows in terms of what they want to do with the space and the sort of movement they want to achieve.
Paddy: And in particular, I think what I love is customers coming and bringing us these crazy ideas. We had a conversation with a customer who wants to fly into a model with a robotic arm, have that shot become virtual inside a 3D version they've created of the model, and then come back out into the actual studio, at the point where the camera would have flown past the model, mixing those sorts of shots together in combination with AR. It's just great to have customers saying: this is our idea, what do you think? And we can say: it's crazy, but we could probably do it. That, coming back to the Hollywood example you were giving, is what excites me.
Paddy: And in particular, some of the things that are going on, like the sports automation we've been doing with the Bundesliga and some other sports we're working on, are things people just don't believe are possible. The automated camera tracking where you can select a player and the camera will just follow that player around the pitch during the match, either for analysis or for halftime conversations about the performance of that particular player. Sometimes it's wanted for extra feeds for broadcasters. So if you had a player from the US playing in the Bundesliga, ESPN might be interested in taking a direct feed of that player for the entirety of the match, for example. You're opening up workflows with these solutions that weren't possible before.
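To make the player-follow idea concrete, here is a minimal sketch of the kind of control loop such a system implies: the tracker reports where the selected player sits in the frame, and the camera is commanded to pan so the player stays centred. This is purely illustrative and not MRMC's actual control code; the function name, gain, and dead-zone values are assumptions.

```python
# Illustrative sketch of keeping a tracked player centred: convert the
# player's horizontal position in frame into a pan-speed command.

def pan_command(player_x: float, dead_zone: float = 0.05, gain: float = 40.0) -> float:
    """Proportional pan-speed command (deg/s) from the player's
    normalised horizontal position (0.0 = left edge, 1.0 = right edge).
    Inside the dead zone around centre the camera holds still, so the
    shot doesn't jitter with every small movement."""
    error = player_x - 0.5          # signed offset from frame centre
    if abs(error) < dead_zone:
        return 0.0
    return gain * error             # positive = pan right
```

A real system would layer smoothing and acceleration limits on top of this, which is what makes the robotic move repeatable in a way a human operator can't match.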
Paddy: The thing we've regularly had fed back to us is that it's not about reducing costs or reducing personnel; your camera solutions are doing something a human physically can't. You can't have that reliable a track. You can't have a human-operated jib that's always going to give you that exact shot 100% of the time, which the robotic arm essentially will. And what I find most exciting is being able to deliver those workflows for customers in general, never mind for one specific customer.
Dak: As we continue to go multi-platform, we're going to start seeing more personalization of feeds, more personalization of how people interact with content. And that's where that tracking and those kinds of technologies become very important, because that way you can say: I want to just watch this one NBA player for the game, I just want to see how he does. Where do we go from here?
Paddy: Yeah. Mostly evolution and addition, I think. I touched on the fact that we'll have another robotic arm coming into the broadcast range, probably in the next two months. And we have an automated jib solution, of which we've now delivered the first unit, with more deliveries of that product to make. So I think that's something else that will continue to grow for us as a business. We have another robotic head solution coming to market shortly as well. So there's some evolution like that, really adding to the product range that we have. On the sports automation side, you mentioned tracking a player; that's something we've seen more and more. We have a customer we're about to launch a proof of concept with for something we affectionately call coach cam. That's not its product name at the moment, but we need to establish what that will be.
Paddy: But essentially it's using our tracking, so Polymotion Chat, which in its simplest form builds a skeletal model of your subject. It doesn't care who the person is; male or female, ethnicity, none of that matters. It's basically detecting the limbs: it detects the head, the shoulders, the hips, the pelvis, and it uses those to track the subject. It uses the face to frame, because that's usually quite important for people when they're framing a shot, but fundamentally it's the limbs that matter. And the next version, which in fact has just gone into beta testing, also has facial recognition in it. So we can teach the software what a person looks like, either directly in front of the camera or by loading photos, and then you can set profiles and rules for that person.
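The skeleton-first approach described above can be sketched in a few lines: framing is derived from limb and joint positions (so identity doesn't matter), with the head keypoint used to leave sensible headroom. The keypoint names, margins, and function below are illustrative assumptions, not Polymotion Chat's real API.

```python
# Minimal sketch of skeleton-based framing: bound the subject's
# keypoints (normalised 0..1 coordinates), pad by a margin, and
# reserve extra headroom above the head keypoint.

def framing_box(keypoints: dict, margin: float = 0.1) -> tuple:
    """Return (x0, y0, x1, y1) bounding the subject's keypoints,
    padded by `margin`, with extra room above the head."""
    xs = [x for x, y in keypoints.values()]
    ys = [y for x, y in keypoints.values()]
    x0, x1 = min(xs) - margin, max(xs) + margin
    y0, y1 = min(ys) - margin, max(ys) + margin
    if "head" in keypoints:
        # double the margin above the head for headroom
        y0 = min(y0, keypoints["head"][1] - 2 * margin)
    return (x0, y0, x1, y1)
```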
Paddy: So you can basically say, when this person appears on a particular camera, so you can have up to 10 cameras currently tracking different subjects, if you want. You might have a camera sitting there doing nothing. When somebody walks in and they recognize, and you can have a rule that says, right, follow that person. So that’s popular things like AV and events, TED Talk type things, and your guest presenter comes on and it starts tracking them automatically because that knows who that is. No one needs to touch a button to make that happen. But particularly in sports automation, this proof of content we’re about to do is to have two cameras on the opposite side of a football pitch tracking the coaches. So said the effect recorded coach camera. So the idea is that if there’s a controversial incident on the pitch, then you’re able to have a split screen of the output from these cameras with a little picture-in-picture of the action.
Paddy: So let's say there's a question of whether it's a penalty or not, someone's been taken out on the edge of the box, or maybe there's an infringement in basketball, or did somebody actually put their foot out of bounds, that kind of thing. Essentially at that point you're seeing the reaction. And you were talking about personalization; that's something we are seeing more and more. So these feeds could be used by broadcasters, which is the obvious reason. But more and more, people are looking for interactive content for the app they're streaming to, or through a web platform, so essentially an individual can choose what view they see. I was given a demonstration by one of the companies we're talking to about providing this as extra camera positions.
Paddy: And they see a world where a user essentially has something like a multiviewer, the output of a switcher, where they've got one big video window of the main action, the broadcast feed, which is whatever the TD has decided should be on air at that point. But then around that they'll have their other feeds. If they've got a favorite player being tracked, they'll have that. If they want to see the output of their coach, they can see the output of their coach. If they want a wide shot of the opposition goal, they can have that. And that's the way it's going; it's that personalization of content. And that's where the camera automation systems come in, because they enable you to have 12 or 15 cameras on site where you might have only had seven or eight previously.
Paddy: Or if you're a top-tier sport like the NFL or the NBA or the Premier League or the DFL, where you might have 20 or 30 cameras, you might want 40 or 50. And with that you've got the issues of personnel: where you locate those cameras, how you staff them, how you run them. By having that automation solution, you don't need that; it's covered by the software. And that's what we're seeing in particular. So I think it'll be more of an evolution. Polymotion Chat, for example, has been around for over three years, maybe almost four years now. It's not a brand new solution that hasn't been battle tested; there are plenty of broadcasters out there using it today. But I think what we'll see is much more interesting applications of it from customers.
Paddy: Rather than coming to us with a robotic arm and saying, we want to do this, they'll come and say, we want… We had an example recently: because we can locate different parts of the body relative to the HD 1920×1080 or 4K image, a graphics manufacturer came to us and said, right, we want to know exactly where this part of the body is, because we need to put a graphic on that part of the body and have that graphic follow it constantly. Polymotion Chat absolutely gives us the ability to do that. So there's a lot of evolution coming like that, as well as some step-change improvements in things like the track-based robots and the other things I've already talked about.
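The hand-off to a graphics system implied above comes down to a coordinate conversion: the tracker reports a body part's position relative to the frame, and the graphic is pinned by mapping that to pixels in the output raster (1920×1080 HD or 3840×2160 UHD). A minimal sketch, with all names and the normalised-coordinate assumption being illustrative:

```python
# Map a tracked body-part position (normalised 0..1 frame coordinates)
# to integer pixel coordinates in the chosen output raster, clamped
# so the graphic anchor never leaves the frame.

RASTERS = {"HD": (1920, 1080), "UHD": (3840, 2160)}

def to_pixels(norm_x: float, norm_y: float, raster: str = "HD") -> tuple:
    """Return (px, py) pixel coordinates for a normalised position."""
    w, h = RASTERS[raster]
    px = min(max(round(norm_x * w), 0), w - 1)
    py = min(max(round(norm_y * h), 0), h - 1)
    return (px, py)
```

Because the tracker output is resolution-independent, the same data can drive an HD graphic and a 4K graphic without re-tracking.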
Dak: Well, Paddy, thanks for filling us in on where you all are headed. It sounds like Mark Roberts Motion Control continues to be at the center of both broadcast and cinema, and that can lead to a lot of great sharing of ideas and workflows that can make broadcast a little bit more exciting.
Paddy: Yep, hopefully. That's what we're trying to do. So yeah, thanks for your time. Really appreciate it. It's been great to talk to you.
Dak: Thanks for listening to the Broadcast Exchange. Make sure to subscribe for the latest Broadcast Exchange episodes on your favorite podcast platform or watch our video episodes on YouTube.