Eight projects have been selected for the IBC Accelerator Media Innovation Programme for IBC2023.
The programme, which provides a framework for agile, fast-track and collaborative innovation in the media sector, takes on “bitesize” but complex industry challenges to develop and explore innovative, hands-on solutions that address common pain points. Proof of concept (POC) demonstrations will be presented live on the Innovation Stage at IBC2023, taking place at the RAI Amsterdam from 15-18 September.
The eight projects are:
- The Authenticated Data Standard aims to define a standardised data distribution package to ensure that, as programming is distributed and works its way through the entertainment landscape, content owners can maintain the original quality, messaging and intent. The project will allow them to verify and authenticate their content data and imagery and publish it to third-party sources. It will also build event-driven integrations that allow metadata to pass between systems.
- Synthetic Humans for the Metaverse employs a range of leading-edge technologies and archived materials to generate photorealistic avatars that can be integrated into a virtual production alongside real guests. To animate the avatars in real time, the project team will employ body motion capture and cloned or original voice audio, drawing on artificial intelligence, machine learning, and augmented and virtual reality. The project will also pursue a second aim: to build a foundation for broadcasters to create ‘virtual translators’, using avatars and sign language, for accessibility services and other functions.
- Real-Time XR Sport Edge takes 5G XR to the edge to build on innovations in live motion capture and high-speed content delivery. The project aims to broadcast extended reality (XR) sports, including Mixed Martial Arts (MMA) and augmented reality (AR) techno-sports, in an immersive environment with high-end photorealistic graphics, virtual advertising, spatial audio and social interactive audio – delivering the experience to 3D worlds and metaverse audiences in real time, at the lowest possible latency, via virtual reality (VR) headsets, computers, mobile devices and over-the-top (OTT) platforms.
- Connect & Produce Anywhere intends to build a distributed edge and cloud computing system to remotely produce a live sports event. By deploying 5G for connectivity and utilising software-based production tools, the project aims to make the most efficient use of resources in bandwidth-constrained locations. The goal is to detach software from hardware and deploy a distributed computing architecture between ground and cloud, exploring the benefits, challenges and sustainability potential of such an approach.
- Responsive Narrative Factory plans to deliver the right narrative to any consumer in real time via a metadata-powered content fast-track. It will demonstrate a new component-based approach to creating multiple versions of content from a single master quickly and cost-effectively, enabling precision targeting of programmes at different demographics, regions or groups that can be monetised for premium Free Ad-Supported Streaming TV (FAST) advertisers.
- 5G Motion Capture for Live Performance & Animation explores using 5G to build ultra-low-latency networks that support new immersive audience experiences, both for those present at a live performance and for those engaging remotely from another venue. Video ‘illusions’ will be piped from ‘anywhere’ to ‘everywhere’ over appropriate backhaul, using terrestrial or satellite links across the public internet. The project also seeks to leverage 5G technology to bring joyful, interactive animated characters to children in hospital wards, regardless of location.
- Gallery Agnostic Live Media Production aims to bring media production into the modern day via device-agnostic, gallery-agnostic and hybrid ways of working, proving control of both existing on-premises and cloud devices. Making shows should be gallery- and device-agnostic so that the industry can adapt to current budgets, technical possibilities and varying circumstances such as venue or location.
- The Real-Time Interactive Streaming Personalises Live Experiences project proposes to demonstrate that additional revenues and returns on content and rights investments can be achieved by personalising viewer engagement with live interactive sports streams, big or small. Among other objectives for its POC, it will explore and create a next-generation sports viewing experience with real-time interactivity, opening new revenue streams for content providers that have acquired expensive premium rights for sports, entertainment and other live events.