Arrow International Media worked with GrayMeta and Limecraft to use AI to leverage unused footage during the Covid-19 lockdown. They spoke on a recent RTS panel about implementing AI.
When Coronavirus struck, production companies found themselves stuck with half-finished productions in need of edits or additional footage, but no way to shoot.
The challenges lockdown posed to the industry are well known and well covered, but for Arrow International Media, lockdown also provided an opportunity to implement artificial intelligence tools to leverage its back catalogue.
As Arrow production executive Carrie Pennifer explained on a recent RTS panel: “As an industry we are prepared for things to go wrong. But they tend to happen in isolation, not all at once, and to a particular production company or shoot. This was so big, I don’t think anyone could have predicted it or the effect it would have on our industry.
“We couldn’t put people on planes to go and film the things they need to film, but we were all sent home. Edits are split apart, so editors and producers need to find a way of working together editorially and remotely.”
The AI in Lockdown panel, which was hosted by Muki-International managing director and IBC consultant Muki Kulhan, featured speakers from Arrow International Media.
Arrow’s footage library contained more than 70,000 hours of unused footage, meaning the process of logging it all was no small task. Yet with a number of shows still locked in the editing process, the task became increasingly vital in order to continue productions during lockdown.
Post production manager Kyran Speirs explains: “We had many shows still in edits. Then we needed to put in place a workflow so we could carry on delivering these shows. We couldn’t film but we realised we had a huge existing library of footage from shows we’d already shot.
“We had all of our editors working away to storyboard what they needed and cut as much as they can, then work out what gaps they needed to fill. But it wasn’t just a case of plugging black holes – it was important that we kept up the standard of footage.”
The initial logging – before the introduction of an AI solution from GrayMeta – used video workflow tools from Limecraft. Arrow already uploads most of its footage to the cloud via Limecraft, where it is transcoded into low-resolution proxies that carry only limited metadata.
Runner India Gross was one of the Arrow team seconded into tagging thousands of hours of footage – no small task. She and her team of four began at the end of March and continued on Limecraft until the end of May.
Gross explains: “I was brought on to work among a team tagging footage on Limecraft with keywords describing the shot, whether it was day or night, whether it was GoPro footage, the location, and naming anything relevant in the scenes.
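The kind of per-clip log Gross describes – keywords, day or night, camera type, location – can be pictured as a simple record with a search over it. This is an illustrative sketch only; the field names and `search` function are assumptions, not Limecraft’s actual schema or API.

```python
from dataclasses import dataclass, field

@dataclass
class ClipLog:
    """One logged clip, with the descriptors the taggers applied."""
    clip_id: str
    keywords: set = field(default_factory=set)
    time_of_day: str = "unknown"   # "day" or "night"
    camera: str = "unknown"        # e.g. "GoPro"
    location: str = "unknown"

def search(logs, term):
    """Return clips whose keywords, camera or location match the term."""
    term = term.lower()
    return [c for c in logs
            if term in {k.lower() for k in c.keywords}
            or term == c.camera.lower()
            or term == c.location.lower()]

# Hypothetical library entries for illustration
library = [
    ClipLog("A001", {"people walking", "street"}, "day", "GoPro", "London"),
    ClipLog("A002", {"harbour", "boats"}, "night", "main unit", "Lisbon"),
]
matches = search(library, "gopro")   # finds A001 via its camera field
```

Even this toy version shows why the work was laborious: every descriptor had to be typed in by hand before any search could find anything.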
“As you can imagine it was quite time consuming and laborious, but it was also quite rewarding. We’d deal with requests from directors and editors, and then search on Limecraft to find relevant footage, before sending it to them.
“Some requests were for generic shots, such as people walking, but some required lookalikes, which could be especially time consuming.”
At the end of May, Arrow announced that it had struck an agreement with intelligent metadata solutions company GrayMeta to use AI to “enable the company to log and access its catalog of unused content for the first time”, according to a press release.
The agreement saw Arrow leverage GrayMeta’s Curio platform to log 70,000 minutes of unused content from its archives, stored on Limecraft.
The firm had 140 people working remotely across the UK and US and 25 edit suites on the go, so it needed a quicker way to log the unused footage.
Dan Carew-Jones, a post-production consultant who has worked with Arrow for a number of years, was involved in bringing GrayMeta on board.
He explains: “When we started we were using Limecraft to tag the footage, but it became apparent very quickly that the amount of footage we were loading into Limecraft was going to outpace our ability to tag it. We’d already been having discussions with GrayMeta about using their systems, and felt like this could be ideal circumstances to actually try it out.
“We didn’t go in with massively high expectations because it is not an on/off switch which will give you everything immediately, but it can assist in the process of finding the right footage.”
He said the results were almost instantly noticeable. The initial batch harvested through Curio ran to around 2,000 hours. The system tagged it within five days – something that was taking months to do by hand.
Arrow, adds Carew-Jones, was ideally placed to exploit this system because “their footage was available on the cloud.”
“Arrow has taken that first step towards using AI and machine learning, but they were at a fairly advanced starting point in terms of their library and the organisation of that library.”
“With our cloud system, everything we’ve shot is imported into the cloud,” adds Speirs. “On import it translates it into low-res proxies, so the metadata is very limited. If you have AI integrated into the camera, for example, that metadata can be very useful for identification.”
Given that AI can often seem a daunting new technology, was there any reluctance to use the GrayMeta solution? Carew-Jones says not, because those using it had been the ones manually logging footage.
Gross agrees: “It worked well, because we’d been logging all of this footage so we knew the footage well, and then when we went to Curio, it became a lot easier to find requests, because we already knew what was in Limecraft. The two tools worked really well together.”
Footage – whether already on Limecraft or new rushes – would come straight from drives or LTO, before being transcoded for Avid Media Composer, giving editors easy access.
People and machines learning
The search function also benefitted the editors working remotely in the Avid. They could identify a scene using Curio and capture everything shot around it, then bring that into the Avid and manually identify a set of relevant rushes. Beyond the archival database itself, this built up a series library of relevant rushes that had already been identified.
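The retrieval step above – find one scene, then pull everything shot around it – can be sketched in a few lines. Grouping “around that scene” by shoot day is an assumption made for illustration; the clip records and function names are hypothetical.

```python
# Hypothetical rushes, grouped by the day they were shot
rushes = [
    {"clip_id": "B001", "shoot_day": "2019-06-02", "scene": "harbour wide"},
    {"clip_id": "B002", "shoot_day": "2019-06-02", "scene": "harbour close-up"},
    {"clip_id": "B003", "shoot_day": "2019-06-05", "scene": "market"},
]

def rushes_around(match, library):
    """Everything shot on the same day as the matched clip."""
    return [c for c in library if c["shoot_day"] == match["shoot_day"]]

# An editor's search surfaces one scene...
hit = next(c for c in rushes if "harbour wide" in c["scene"])
# ...and the surrounding material comes with it, ready for the Avid
series_library = rushes_around(hit, rushes)
```

The point of the design is that one successful AI-assisted search yields a whole cluster of usable material, not a single clip.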
The database itself is also able to learn the more it is used. It takes one button push to delete a tag, meaning tags can be updated as footage is used. If a search brings up a clearly incorrect result, the tags can be edited to make sure it doesn’t happen again, meaning the search function will be smarter for the next user.
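The feedback loop described above – delete a bad tag and the next search is cleaner – reduces to a very small mechanism. This sketch assumes a plain tag store; it is not Curio’s actual data model or API.

```python
# Hypothetical clip-to-tags store; C001 has been mistagged with "dog"
tags = {"C001": {"boat", "dog"}, "C002": {"dog"}}

def search(term):
    """All clips currently carrying the tag, in a stable order."""
    return sorted(clip for clip, kws in tags.items() if term in kws)

def delete_tag(clip_id, term):
    """The 'one button push' correction: remove a mislabelled tag."""
    tags[clip_id].discard(term)

before = search("dog")       # the wrong clip appears alongside the right one
delete_tag("C001", "dog")    # a user spots the error and deletes the tag
after = search("dog")        # only the correctly tagged clip remains
```

Each correction is permanent for every subsequent user, which is how the system gets “smarter” without any retraining: the humans prune the index as they work.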
It also offers facial recognition tools, which can be taught by excluding certain terms or features, making the AI’s searches progressively smarter.
In terms of cost, Arrow wouldn’t share exact details, but Pennifer says it was a unique situation and, with broadcasters needing shows at a time when the industry was shut down, this gave Arrow an opportunity.
“It has to be on a case-by-case basis, but we were thrust into a situation where the clock was ticking and we needed to deliver more than ever,” she explains. “We were determined to do it, and had to weigh up the cost of not delivering shows against this.
“In the long-term, the question is do we gain or lose filming days, because we may have footage in the archives. But for us, we decided it was worth the investment.”