From newsrooms to sports and TV production, AI is already being used to speed up and boost the creative process. But could it replace humans altogether?

Can AI replace humans forever? 

We like to think of TV as a creative industry but it is already highly automated. Once an idea is conceived, there are many strict, even formulaic, tasks – from robotic camera moves to compliance checks – that are better performed by machine. The clue is in the term ‘industry’.

Few corners of any industry will be left untouched by machine learning (ML) and artificial intelligence (AI), but in media the highly programmatic activities surrounding the recording, processing and transmission of information make the application of cognitive science particularly suitable.

Inevitably this throws up practical questions about the extent to which machines can or should supplant human jobs and broader philosophical ones about the nature of creativity itself.

One argument is that as these technologies are introduced, creatives will be free to spend more of their time actually being creative. The other follows the trend to its logical conclusion: that eventually the autonomous creation of content will be indistinguishable from the real thing.

Newsrooms embrace AI

No part of the industry has adopted AI more than the newsroom. A recent report from Reuters revealed that almost three-quarters of news publishers surveyed were already using some kind of AI.

Algorithms are being used to optimise marketing, to automate fact-checking, and to speed up tagging and metadata creation.

News organisations know they have to do more with less; they have to find ways of making journalists more productive without leading to burnout. Intelligent automation is one way to square this circle.

As one example, the Press Association (PA) in the UK has been working with Urbs Media to deliver hundreds of semi-automated stories for local newspaper clients. A journalist finds a story using one or more publicly available datasets (e.g. NHS and population data), then writes a generic story that a computer automatically versions into bespoke copy for myriad local publications.
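
To make that concrete, here is a minimal sketch of template-driven story versioning in Python. The dataset, field names and sentence template are hypothetical illustrations, not PA or Urbs Media code; the real workflow adds far more editorial and statistical checking on top of the same basic idea.

```python
# Hypothetical sketch: one generic story template, automatically versioned
# per locality from a public dataset. Field names and figures are invented.

TEMPLATE = (
    "{area} recorded {waits} patients waiting more than four hours in A&E "
    "last month, {direction} of {change:.0%} on the same period last year."
)

def version_story(row: dict) -> str:
    """Fill the generic template with one locality's figures."""
    change = (row["waits"] - row["waits_last_year"]) / row["waits_last_year"]
    direction = "a rise" if change > 0 else "a fall"
    return TEMPLATE.format(
        area=row["area"],
        waits=row["waits"],
        direction=direction,
        change=abs(change),
    )

# One generic story becomes a bespoke version for each local publication.
dataset = [
    {"area": "Leeds", "waits": 1200, "waits_last_year": 1000},
    {"area": "Hull", "waits": 450, "waits_last_year": 500},
]
for row in dataset:
    print(version_story(row))
```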

Another potential use of AI in news is for reporting of live events. Dreamwriter, an automated newswriting programme developed by Chinese tech giant Tencent, uses speech-to-text software to turn conference speeches into stories. It apparently churns out 2,500 stories daily.

BBC Click: Testing robo-reporters 

Source: BBC

The BBC says it does not currently publish stories generated by ‘robo-journalists’. But the corporation’s News Labs research team has worked on tools to automate “the transcription of interviews and identification of unusual trends in public data,” according to the division’s editor Robert McKenzie.

Automated workflows are similarly being applied to assemble video-led news packages. RTVE, Spain’s national broadcaster, is testing an AI programme from media asset management (MAM) vendor VSN which can assemble video clips from an archive search into a timeline within seconds.

Broadcast news production is a particular focus of MAM software companies, which assert that artificial intelligence will augment rather than replace human editorial oversight.

“Most of the value lies in combining multiple AI engines with human intelligence to achieve things that couldn’t be done before,” says Dalet director of product strategy Kevin Savina.

In the newsroom, for example, cognitive services are being used to auto-tag thousands of pieces of archive content. To create a richer data set for search, other AIs for facial and object recognition might also be applied. Yet another AI might then be applied to this material to recommend results in a form – a timeline, for example – that is of most value to the story the journalist has at hand.
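
The multi-engine pattern can be sketched in a few lines of Python. The engine functions, tags and scoring below are hypothetical stand-ins for real cognitive services; the point is only that each engine enriches a shared metadata record before a final step ranks clips against the journalist's story.

```python
# Hypothetical sketch: several AI engines each add metadata to a clip record,
# and a ranking step surfaces the clips most relevant to a story's keywords.
from dataclasses import dataclass, field

@dataclass
class Clip:
    clip_id: str
    tags: set = field(default_factory=set)

# Stand-ins for real cognitive services (speech, face and object recognition).
def speech_tagger(clip: Clip) -> set:
    return {"election", "downing street"}

def face_recogniser(clip: Clip) -> set:
    return {"prime minister"}

def object_detector(clip: Clip) -> set:
    return {"podium", "crowd"}

def enrich(clip: Clip) -> Clip:
    """Merge the output of every engine into one richer metadata record."""
    for engine in (speech_tagger, face_recogniser, object_detector):
        clip.tags |= engine(clip)
    return clip

def recommend(clips: list, story_keywords: set, top_n: int = 5) -> list:
    """Rank archive clips by overlap with the story the journalist has at hand."""
    return sorted(clips, key=lambda c: len(c.tags & story_keywords), reverse=True)[:top_n]

archive = [enrich(Clip("clip_001")), enrich(Clip("clip_002"))]
print(recommend(archive, {"election", "podium"}))
```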

“Using AI means journalists can be more focussed on the creative editing decisions” - Toni Vilalta, VSN

“Even then you need the filter of the journalist’s editorial judgement,” says Savina. “Only when you have a combination of multiple AI tools plus interaction with staff selecting and deselecting content of interest, plus a continuous feedback loop, will the AI be trained.”

“Cataloguing and search are very systematic tasks,” agrees VSN product manager Toni Vilalta. “Using AI means journalists can be more focussed on the creative editing decisions.”

However, AI is often expensive to deploy and few media organisations are set up to use it. What’s more, AI outcomes still cannot be fully trusted. And for companies thinking of adding an AI to their workforce, there is also the issue of change management to consider.

“An analogy for putting in place an AI is that of hiring a new employee,” says Savina. “You need to teach it how to work in the organisation and colleagues need to learn to adapt to it accordingly.”

Sports production automation

Stamford Bridge: Drone shot of Chelsea’s stadium

AI is also likely to have a big future in sports production. Unmanned multi-camera systems covering an entire field of play have been on the market for years. Advanced auto-production algorithms, such as those used by Pixellot, track the flow of play, identify highlights, create replays and insert ads without human intervention.

Pixellot can now also compile complete highlight reels automatically, moving the system “one step closer to end-to-end television-like production”, according to CEO Alon Werber.

Elsewhere in the industry, Tedial and TVU Networks among others are seeking to link the metadata captured live at a game with relevant content archives, using AI to automate the packaging of content into different lengths and for different platforms. Ultimately, such tools could be combined with information about the preferences and devices of individual viewers to auto-create and distribute personalised content – something which is not humanly possible today.
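
A minimal sketch of that kind of automated packaging, in Python: given highlight clips logged live with an interest score, the same event is cut down to different target durations for different platforms. The clips, scores and durations are invented, and the greedy selection is just one illustrative approach; the Tedial and TVU systems are considerably more sophisticated.

```python
# Hypothetical sketch: package the same event to different target durations.
highlights = [  # (clip_id, duration in seconds, interest score from live metadata)
    ("goal_1", 30, 0.95),
    ("red_card", 20, 0.80),
    ("near_miss", 15, 0.60),
    ("crowd_shot", 10, 0.30),
]

targets = {"social": 45, "web": 90, "broadcast": 300}  # seconds per platform

def package(clips, max_duration):
    """Greedily fill the package with the highest-scoring clips that fit."""
    selected, total = [], 0
    for clip_id, duration, score in sorted(clips, key=lambda c: c[2], reverse=True):
        if total + duration <= max_duration:
            selected.append(clip_id)
            total += duration
    return selected

for platform, limit in targets.items():
    print(platform, package(highlights, limit))
```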

iPlayer: BBC’s Match of the Day

However, development is in its early days and a long way from being applied to flagship shows like Match of the Day.

“The quality drops off dramatically if there’s any kind of judgement [required of an AI],” says video search specialist Axle AI’s CEO Sam Bogoch. “If you’re prepared to bang out a thousand different versions of the same event, then AI is probably more cost effective than a human. If you’re trying to do that at broadcast quality it would be foolhardy to think a machine will do anywhere near the job of a professional.”

“It could well be that the editors of the future are really managers of AI bots” - Sam Bogoch, Axle AI

That’s for now, but as AIs learn from more and more data, it follows that more and more of the production process will be automated.

“It could well be that the editors of the future are really managers of AI bots,” suggests Bogoch. “If broadcasting is fundamentally changing from producing one definitive edit for millions of viewers to narrowcasting where we’re slicing and dicing an infinite number of permutations, then AI has a huge role to play. In that scenario the role of the editor would be to shape narrative and let the bots do the work.”

BBC explores AI production

This is a scenario which BBC R&D is toying with. “We need to use AI to shape world-class public service, and we need to use world-class public service to shape AI,” stressed BBC Chief Technology and Product Officer Matthew Postgate recently.

In production, the BBC is exploring how good machines are at choosing the right shots within an edit, in direct comparison with a human editor.

“The goal is to automate more of the process to cover more content rather than reduce the need for human producers,” BBC R&D emphasises in a blog post.

It has developed a system that automatically frames and cuts coverage from high-resolution crops of UHD camera feeds to deliver an edited video package. The system can be manually tweaked to change the way it puts the content together – for instance by adjusting the frequency with which it cuts between different crops.
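
BBC R&D has not published the internals of the prototype, but the cutting logic it describes can be sketched as follows, assuming hypothetical per-crop “interest” scores (for example from face or action detection). The tunable minimum shot length plays the role of the cut-frequency control mentioned above.

```python
# Hypothetical sketch: cut between crops of one UHD feed based on per-frame
# scores, with a minimum shot length controlling how often the system switches.
def auto_cut(scores_per_crop, min_shot_len=3):
    """Return an edit decision list of (start_frame, crop_name) pairs."""
    edl, current, held = [], None, 0
    num_frames = len(next(iter(scores_per_crop.values())))
    for frame in range(num_frames):
        best = max(scores_per_crop, key=lambda crop: scores_per_crop[crop][frame])
        if current is None or (best != current and held >= min_shot_len):
            edl.append((frame, best))
            current, held = best, 0
        held += 1
    return edl

# Three virtual cameras (crops), scored per frame by some detection model.
scores = {
    "wide":  [0.5, 0.5, 0.4, 0.4, 0.3, 0.3, 0.6, 0.6],
    "mid":   [0.3, 0.6, 0.7, 0.7, 0.2, 0.2, 0.2, 0.2],
    "close": [0.1, 0.1, 0.2, 0.2, 0.8, 0.8, 0.5, 0.5],
}
print(auto_cut(scores, min_shot_len=3))  # raise min_shot_len for fewer cuts
```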

So positive have the tests been that the BBC thinks the prototype may already generate packages “close enough in quality to that produced by a human editor to be usable - at least for simple events”.

“AI liberates content we may not otherwise have come across in the BBC archive” - Cassian Harrison, BBC4

Beyond this, the BBC is exploring how AI can perform a draft assembly job for editing more complex recorded programming. It acknowledges craft editing as a “deeply creative role” but suggests an editor’s first task usually involves finding good shots from rushes.

“Sorting through assets to find good shots isn’t the best use of the editor’s time - or the fun, creative part of their job,” says the BBC. “AI could help automate this for them.”

A commercial application using speech-to-text to index rushes is already offered by Lumberjack. Its system has been used on the Danish semi-scripted series Klassen, where shoot ratios are typically high.

“I see Lumberjack as more of a producer’s and assistant editor’s tool to take the grunt work out of production,” says inventor Phil Hodgetts. “When production time is so limited we need all the help we can get for creative work.”

The AI commissioner

Internet-first players like Netflix have embedded data mining into their businesses, using information about audience preferences to commission original programming.

Customised recommendations: Netflix uses AI to analyse audience preferences

UK-based Epagogix helps Hollywood studios to identify “likely successes and probable turkeys” by running its algorithms over scripts. It says it helps risk assess potential investment “by delivering accurate predictive analysis of the Box Office value of individual film scripts.”

A related BBC experiment in this area explored how AI/ML could be used to mine the extensive BBC archive and to help schedule content targeted at consumers of high-brow channel BBC4.

“The AI liberates content we may not otherwise have come across in the BBC archive,” explained BBC4 editor Cassian Harrison, speaking at an RTS event in May. He also stressed [in a follow-up BBC blog post] that: “Humans have nuance, taste and judgement. These are qualities no algorithm or machine can replace. This is all about looking at whether technology can help them even further.”

“You’ll see lots of mass-produced content that’s extremely automated; higher level content will be less automated” - Yves Bergquist, Novamente 

Meanwhile, AI firm Novamente is creating a “knowledge engine” with the University of Southern California to analyse audience sentiment (a viewer’s emotional connection) across TV scripts.

“The aim is to link together scene-level attributes of narrative with character attributes and learn how they resonate or not with audiences,” explains Novamente CEO Yves Bergquist.

In the long term, he forecasts, “You’ll see lots of mass-produced content that’s extremely automated; higher level content will be less automated.”

AI with imagination

Netflix has also used AI to restore Orson Welles’ unfinished feature The Other Side of the Wind. Made in the 1970s but uncompleted and unreleased, the film was pieced together from rushes and scanned at 4K at Technicolor before being passed through an AI to improve its resolution ahead of release this year.

“Everybody is scared of AI taking over the planet and destroying the industry but I look at this from a different perspective,” says Alex Zhukov, CTO of Video Gorillas, the company whose AI tools were used on the project.

“At the time Orson Welles was making that film you had to literally cut the negative to edit it. Now that job (neg cutter) doesn’t exist. [Likewise] all the intermediate jobs in content creation will go away but other jobs will arise to enable us to create content that we’ve not been able to before.

“AI can create art,” he insists. “The real question is when AI will create art that is indistinguishable from that of a human.”

Microsoft is part way there. It has developed an AI bot that can draw near pixel-perfect renditions of objects purely from a text description. Each image contains details that are absent from the text descriptions, indicating that this AI contains an artificial ‘imagination’.

“The model learns what humans call common sense from the training data, and it pulls on this learned notion to fill in details of images that are left to the imagination,” explains Microsoft researcher Xiaodong He in a blog post.

Microsoft envisions that, boosted by enough computing power, an AI like this will be able to digitally animate a feature film from nothing other than a script.

Facebook is also teaching neural networks to automatically create images of objects, claiming that 40% of the time the results fool humans into believing they’re looking at a photograph of the real thing rather than a photoreal computer graphic.

Google is doing something similar, by teaching machines to look for familiar patterns in a photo, enhancing those patterns, and then repeating the process. The result is a kind of “machine-generated abstract art”, as Wired put it.

The possibility of AI-powered music creation is also causing heads to turn. Facebook has devised an AI that is claimed to generate entire songs from little more than a hummed tune.

The work, from Facebook Artificial Intelligence Research (FAIR), is part of a larger exploration by the AI community into unsupervised translation. Like Microsoft’s drawing bot, it’s a way of building in the sort of randomness or improvisation that inspires a lot of human art.

Facebook says its research “is a strong indicator of how AI could soon power human creativity”.

FAIR’s method still requires training to create different kinds of musical output - such as piano in the style of Beethoven. But the team intentionally distort the musical input by shifting it slightly out of tune to achieve a less supervised, more accidental, outcome.

“We are trying to build tools for helping anyone to create music,” says Ed Newton-Rex, founder and CEO of Soho-based outfit Jukedeck, which offers royalty-free, machine-written production music. “Our goal is not to create art.”

He says the Jukedeck platform, which can deliver a unique composition in as little as 20 seconds, is being increasingly used by video creators, brands and indie producers for social media, podcasts and games.

It was also used to compose the score for the short film Zone Out, an experimental production written, performed, directed and edited by artificial intelligence in just 48 hours. The AI was fed science fiction films to produce the script, with footage from public domain films pieced together with green-screen footage of professional actors.

The line between science and art has already blurred.