Pitches for the shortlisted projects on the IBC Accelerators Innovation Programme were presented in a quick-fire round in front of an international M&E audience at the Kickstart Day on 6 March.

The programme, which offers a year-round cycle of engagement, invited Champions and Participants of each of the 12 projects to present the common challenges they are addressing, what they hope to solve, and how their innovations will help the wider industry now and in the future. The opportunity allowed each pitcher to share who is already involved, who they want to collaborate with, and which specific vendors or sectors they want to reach.


The 2023 winning team

Commenting on the importance of the IBC Accelerators programme, sponsor John Canning, Director of Developer Relations - Creators at AMD, described one of its key strengths as “doing things that live beyond the tradeshow.” AMD is returning as a sponsor this year and will help support a range of new workflows as the systems evolve.

The 2023 Project of the Year Award was also announced, going to the team behind Responsive Narrative Factory.

The opening keynote, which focused on news and journalism, emphasised that the programme is an opportunity for transformative technologies and multi-platform publishing - bridging the best of traditional and digital production environments.

Judy Parnall, Head of Standards & Industry, BBC, said: “You need to be absolutely certain of (the accuracy of) what you’re putting out,” stressing the need to use the available technology to do so efficiently in serving audiences.

Claudia Milne, SVP, Standards and Practices, CBS News & Stations, said that the scale of deepfakes is “daunting”.

Parnall added: “It’s such a big challenge; there just is not one way that will solve it. Let’s look at all the techniques and work together as much as we can.”

Returning Champion John Ellerton, Head of Futures at BT Media & Broadcast and Chair of SMPTE UK, said: “Everything is in flux now to deliver the best quality content,” again emphasising the industry’s reliance on collaboration.

Keynotes were followed by a live demo from Ian Wagdin, Senior Technology Transfer Manager at the BBC, showcasing the results of last year’s project: Connect and Produce Anywhere.

IBC365 asked a handful of randomly selected Participants and Champions about what they see on the horizon for the M&E community.

The pitches:

The 12 pitches were presented by project Champions – broadcasters, studios, platforms, and content providers, i.e. the end users or buyers of technology – some joined by Participants. Champions pitched the business or technology challenges that they need to explore, better understand or solve together. Participants – vendors, manufacturers, developers, and providers of products, services and solutions – had the opportunity to network and join projects, helping to design and explore new workflows and architectures that address the challenge under the guidance of the Champions.


Mark Smith (far right) joined by (L-R) Claudia Milne (CBS News), Jon Roberts (ITN), Tony Guarino (Paramount Global/ CBS), and Judy Parnall (BBC)

Pitch 1

Connecting Live Performances of the Future

Conceived by Participant Andy Hook, Technical Solutions Director at White Light, and Champion Malcolm Brew of the University of Strathclyde, with support from Champions BBC and the University of Strathclyde, the ‘Connecting Live Performances of the Future with Sub-100ms Synchronised Low-Latency AVLM’ project aims to push the boundaries of live, connected experiences by creating natural, believable immersive experiences at two or more connected venues.

Modern live performances often include dynamic visual canvases powered by real-time engines, spatial/object-based audio, and complex lighting and tracking data. The project aims to develop a solution for delivering all data between venues - in sync and with ultra-low latency. This will enable synchronised and/or distributed performances between multiple locations across the world connected by public internet.
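Keeping venues within a sub-100ms budget implies, among other things, agreeing on a common clock and a shared playout deadline. As a rough illustration (not the project's actual method), a Cristian-style probe can estimate the offset between two venue clocks, after which both sides schedule media to the same instant; the helper names and figures here are hypothetical:

```python
def estimate_offset(request_ts, server_ts, response_ts):
    """Estimate a remote clock's offset (Cristian's algorithm).

    request_ts/response_ts: local clock when the probe was sent/received.
    server_ts: remote clock when the probe was answered.
    Assumes symmetric network delay; returns the offset in seconds.
    """
    rtt = response_ts - request_ts
    return server_ts - (request_ts + rtt / 2)

def schedule_playout(local_now, offset, lead_time=0.100):
    """Pick a playout timestamp (in the remote venue's clock) far enough
    ahead that both venues can buffer to it; lead_time is the sync budget."""
    return local_now + offset + lead_time

# Example: remote clock is ~2.5 s ahead, 40 ms round trip.
offset = estimate_offset(10.00, 12.52, 10.04)   # ≈ 2.50 s
deadline = schedule_playout(10.04, offset)
```

In practice, venue synchronisation of this kind is typically delegated to PTP or similar, but the budget arithmetic is the same: clock error plus transport jitter must fit inside the 100 ms target.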

At IBC in September, the project will culminate in an experience that immerses the audience in a live performance that is happening elsewhere in the world – with synchronised audio, video, light and media. The potential role of private 5G networks for capturing and distributing data will also be explored.

Pitch 2

Solving the IP Identity Crisis

The move to IP infrastructure within broadcast facilities created challenges around the orchestration of media flows. The NMOS suite of specifications, created by the Advanced Media Workflow Association (AMWA), defines a common approach to device control and connection, and the recently created NMOS Resource Labelling specifications such as IS-13 address the challenge of how to find the correct device, sender or receiver when there might be thousands in a facility. 

Proposed by Polly Hickling, Learning and Development Lead for Media at Atos, and Peter Brightwell, Lead R&D Engineer, BBC, with support from champions BBC, EBU, IMG, and Solent University, the ‘Solving the IP Identity Crisis’ proposal looks to gather data from across the industry to establish current obstacles within this field. 

Commenting on the project, Hickling explained: “It’s all about the issues and challenges we have with very complex IP systems and then actually finding equipment when we’re adding to the already complex system, so it’s about resource management and labelling.”

Utilising data gathered from the project, the group will also look to test and implement solutions incorporating IS-13 and dynamic routing for orchestrated infrastructure.
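For illustration, relabelling a resource under IS-13 amounts to a REST PATCH against a node's Annotation API. The sketch below only builds the request; the endpoint layout follows the published AMWA IS-13 specification, but the helper function, node URL and IDs are hypothetical and worth checking against the current spec:

```python
import json

def build_relabel_patch(node_url, receiver_id, label, description="", tags=None):
    """Build an NMOS IS-13 (Annotation API) PATCH that relabels a receiver
    so it can be found among thousands of resources in a facility."""
    url = f"{node_url}/x-nmos/annotation/v1.0/node/receivers/{receiver_id}"
    body = {
        "label": label,             # human-readable name shown in control UIs
        "description": description,
        "tags": tags or {},         # e.g. {"location": ["studio-b"]}
    }
    return url, json.dumps(body)

url, body = build_relabel_patch(
    "http://node.example:3214", "5c0e1f7a", "CAM 3 - Studio B",
    tags={"location": ["studio-b"]})
```

Sending the PATCH (with any HTTP client) updates the labels that registry browsers and broadcast controllers display, which is exactly the discoverability problem the project describes.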

Pitch 3

ECOFLOW: Energy-Conserving Optimisation for Future-Ready, Low-Impact Online Workflows

Reducing the energy consumption and environmental impact of streaming services is the goal of the project proposed by Kristan Bullett at Humans Not Robots, Francois Polarczyk at Accedo and Tim Davis at ITV. The pitchers highlighted the need for sustainability in streaming via eco-friendly workflow optimisation, whilst providing a media-centric, unified measurement approach with collaboration at its heart.

They noted the challenges of measuring environmental impact and stressed the importance of collaboration to accelerate industry-wide initiatives. Currently there is a limited understanding of energy footprint in IP distribution and an urgent need for improvement.

The project plans to equip the industry to measure and reduce environmental impact and encourage the adoption of sustainable practices, whilst providing education and feedback to stakeholders via a low-impact workflow and without compromising user experience.

Davis said: “We really want to drive this as a cross-industry initiative.”

Pitch 4

Generative AI in Action

Roberto Iacoviello of RAI pitched the next phase of the 2023 Accelerator project Synthetic Humans for Entertainment and Accessibility, exploring generative AI tools in a broadcast environment to produce animated and live-action TV series episodes.

The 2024 project aims to deliver script development, visual storyboard generation, 3D visuals, audio and music, digital humans with animation production, and even set and sound design, with increased efficiency, automation and innovation throughout various workflows.

Iacoviello explained that this new technology will change not only how we produce content but also what stories we tell and how we interact with audiences. He then shared a case study of an animated and live-action episode of Around the World in 80 Days.

The challenge involves checking content authenticity and evaluating potential interoperability, whilst taking into account cost and sustainability. According to Iacoviello, the M&E industry is currently trying to balance the pros and cons: “There is a lack of standards. It’s a very interesting period of time, and we don’t know what will happen, but for sure generative AI will be the future technology.”

Pitch 5

Connect and Produce Anywhere, Phase II

Proposed by Champion John Ellerton at BT Media & Broadcast, with support from Champions BBC, Sky Sports, DAZN, TV2 and Vodafone, this year’s project follows the build and development of an all-IP, edge-first, multi-cloud, multi-software test bed environment in the 2023 CAPA Accelerator. A year on, Ellerton addressed the question of how to move from lab PoC to real-world deployment: “What do we need to do next?”


The pitches were followed by networking amongst the project tables

The project team is now positioned to implement and road-test the solution in real-world live event production scenarios with varying degrees of available bandwidth. The project will look to push innovation through experimentation in areas including environmental monitoring and measurement; orchestration; deployment (including containerisation); observability; other transport elements; and business case and licence options.

Ellerton added that the project hopes to attract young, diverse engineers and to create one coherent workflow rather than multiple solutions and applications. It will also address the difficulty publishers face in engaging and retaining audiences.

Pitch 6

AAVA: AI Audience Validation Assistant

In an age where data is key, AAVA combines extensive audience data with advanced AI to transform content creation, involving the audience in the development process. The project will explore solutions to turn passive viewers into active contributors through AI-driven personas that reflect society’s complexity.

Presented by Gianni Lieuw-A-Soe, Co-Founder and Managing Director at Zwart, the project looks to address the importance of creating digital twins that accurately resemble real people, trained on real-life data, and calls on experts in applied artificial intelligence.

This unique approach accelerates content development, infusing it with genuine societal insights. Essentially, the POC results will showcase a vibrant, interactive platform where the public not only consumes media but also plays a role in its creation, heralding a new era of public-engaged media.

Lieuw-A-Soe called for “the right GenAI tools to give our virtual audience a voice, a fast and secure interface, virtual personas, [and the ability to] create new content fast, scalable and at a much lower cost,” adding that the project is looking to set up a collaboration between public broadcasting associations striving for more and better customised content.

He later told IBC365 that the “digitalisation of media consumption and interaction with target audiences” is the main thing on the horizon for the M&E industry.

Pitch 7

Evolution of the Control Room

Building upon the work of last year’s Gallery Agnostic Media Production IBC Accelerator, which harnessed the potential of a single unified workflow across all devices and tech stacks, the ‘Evolution of the Control Room: Leveraging XR, Voice and AI for Live Media Production’ project aims to optimise the workflow using XR technology and AI solutions, enabling the production team to realise their creative vision without advanced technical expertise.

The POC will aim to showcase democratised media production using automation, generative AI, voice commands, distributed studios and shared resources across regional and local hubs, using any device in any location to meet audience needs with easier, faster, cheaper and more sustainable solutions.

The project was proposed by champions Grace Dinan from the Horizon Europe TRANSMIXR Project and Jon Roberts, Director of Technology, Production & Innovation at ITN, with support from champions TUS, TV2, TG4, HSLU, TCD, and the University of Strathclyde.

Commenting on the project, Dinan said: “The control room had pretty much stayed the same up until 2020 when we were forced into cloud-based productions and decentralised workflows and we very quickly had to adapt the roles and the way that we produce content under pressure. Now that the dust has settled we want to take time to evaluate those solutions, to look at how we can progress even further.”

Roberts added: “Our idea is to look at the live production technology stack and see how a number of emerging technologies, notably XR, voice control, and GenAI, might allow us to reimagine the way that live production teams control their technology.”

Pitch 8

HTML-Based Graphics for Multi-Platform Production

Champions Niels Borg, Head of Graphics at TV 2 Danmark, and Ryan McKenna, Executive Product Manager for Graphics and Automation at the BBC, are looking to develop a modular end-to-end HTML-based graphics workflow that supports multi-platform delivery.

As Borg explained: “We are looking to transition from being a broadcaster in a traditional way, where you deliver linear content, into a more multi-platform environment, where you deliver to multiple platforms. Therefore brand consistency is important and you cannot do that without having HTML graphics, it’s the only way you can achieve real multi-platform brand consistency.”

The project aims to develop a modular graphics solution that supports multi-platform delivery as well as real-time end-device rendering and playback. It aims to use industry-standard graphics and programming tools for graphics development along with common off-the-shelf web components for storage and visualization.
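The appeal of HTML-based graphics is that a graphic becomes a template plus data, so the same lower-third can be rendered by any device with a web view while brand styling lives in one place. A minimal, hypothetical sketch of that separation (not the project's code), here generating the markup server-side in Python:

```python
from string import Template

# One brand-controlled template; the payload varies per on-air item.
LOWER_THIRD = Template("""
<div class="lower-third" style="font-family:$font;background:$brand_colour">
  <span class="name">$name</span>
  <span class="role">$role</span>
</div>
""")

def render_lower_third(payload, brand):
    """Merge a newsroom/automation payload with brand styling into HTML."""
    return LOWER_THIRD.substitute(
        name=payload["name"], role=payload["role"],
        font=brand["font"], brand_colour=brand["colour"])

html = render_lower_third(
    {"name": "Judy Parnall", "role": "Head of Standards & Industry, BBC"},
    {"font": "Helvetica", "colour": "#c00"})
```

Because the output is plain HTML, the same payload can feed a broadcast overlay renderer, a web player, or an app, which is the multi-platform consistency the Champions describe.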

Pitch 9

Changing the Game: Predictive Generative AI

Content providers are beginning to tailor their content to create personalized viewing or interactive experiences for their audiences.

But is this enough to retain the modern viewer? Led by champion ErinRose Widner, Global Head of Business Strategy, Emerging and Creative Technologies at Verizon, the project aims to adopt a multi-modal approach to leverage private networks (wired and wireless) and develop a predictive algorithm based on ethically sourced datasets to create a unique, authentic, personalized live viewing experience.

Leveraging LLMs, tailored datasets and computer vision housed in a private 5G network, this project will look to generate predictive video and audio that is authentic and as close as possible to what actually happens during a live-action event such as a sports or live broadcast.

Pitch 10

Design Your Weapon in the Fight Against Disinformation

The growing threat of misinformation and deepfakes in the media is the focus of this project, which aims to develop an industry-wide understanding of the challenges and abuses faced today by all media outlets.

Judy Parnall (BBC), Claudia Milne (CBS) & Tony Guarino (Paramount Global) pitched the initiative, which aims to identify disinformation and help audiences find trustworthy news and information.

As Parnall acknowledged: “We’ve seen enough of it already happening…it’s already harmful.” Milne added that “the speed at which this information can spread can be very damaging.”

The team presented a three-pronged approach to establish authenticity, verify content, and detect deepfakes, including partnering with technology and solution providers: “It’s more than technology…it’s about how do we bring them together?”

The proposal is to build support around how media companies can work together to address these issues and communicate this work to new audiences. Through the resources of the BBC, CBS/Paramount and other Co-Champions, the team will look to establish an initial premise on the most effective interventions and how news organisations can collaborate to make them work and be understood.

Pitch 11

Scalable Ultra-Low Latency Streaming for Premium Sports

Champion Alex Giladi, Fellow, Advanced Technologies at Comcast, will look to create a streaming solution for premium sports experiences that’s optimized for modern wireless networks. The project will look to achieve Twitter-equivalent latency (1-2 seconds) and near-instant playback start using the standard, existing HTTP streaming technical stack and infrastructure.

The areas explored include the use of low-latency encoding and segment-based ingest, the latest low-delay extensions to MPEG-DASH, and possible uses of the MV-HEVC video codec and the QUIC protocol. The project will deliver an end-to-end system comprising an encoder, a latency-optimized origin, and a mix of open-source and proprietary DASH players.
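As a back-of-envelope illustration of why this stack makes the 1-2 second target plausible: with chunked CMAF delivery the player can start on a partial segment, so live-edge delay scales with chunk duration rather than segment duration. All figures below are illustrative assumptions, not measurements from the project:

```python
def e2e_latency_ms(encode_ms, chunk_ms, cdn_ms, player_buffer_chunks, decode_ms):
    """Sum a glass-to-glass latency budget for low-latency DASH with
    chunked transfer; the player buffer is expressed in chunks."""
    buffer_ms = player_buffer_chunks * chunk_ms
    return encode_ms + chunk_ms + cdn_ms + buffer_ms + decode_ms

latency = e2e_latency_ms(
    encode_ms=300,           # low-latency encoder lookahead
    chunk_ms=200,            # CMAF chunk duration
    cdn_ms=100,              # origin + CDN propagation
    player_buffer_chunks=3,  # small live-edge buffer
    decode_ms=100)
# 300 + 200 + 100 + 600 + 100 = 1300 ms, inside the 1-2 s target
```

With conventional 6-second segments the buffer term alone would be tens of seconds, which is why the segment-based ingest and low-delay DASH extensions the project names matter.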

It is envisioned that the project will serve as an end-to-end technology roadmap for broadcasters, pay TV operators, and streamers who wish to provide a high-fidelity premium sports experience optimized for customer networks, whilst maintaining the current adaptive streaming stack.

Pitch 12

Digital Replica Provenance Automation for AI Talent Workflow

Presented by Will Kreth at HAND ID, this project follows on from the Synthetic Humans workflow of 2023. Digital twin technology has the potential to revolutionise the entertainment industry by automating the talent identity supply chain and providing a unique, immutable, and interoperable ID. However, there are emotional challenges to consider, such as the need for a viable workflow for greenlighting digital replicas. Consent, control, credit, and compensation are crucial in this process, and it’s important to ensure that all parties involved are properly compensated and recognised for their work.

This challenge addresses the evolving disruption within the media and entertainment industry around the authentication and management of talent in today’s digital landscape.

The POC will also aim to showcase how to minimise legal complexities and enhance transparency in talent utilization through streamlined collaboration and licensing across industry stakeholders, simplifying licensing agreements based on verified identities.

Next up: The final 8 projects will be announced by 29 March and will go on to create a POC to showcase at IBC2024 in Amsterdam.