For decades, research has sought a way of enhancing the intelligibility of TV dialogue - until now! In 2021, IBC Digital presented the results of trials by a collaboration of researchers who applied their deep-neural-network-based technology across a wide range of TV content and age groups. The results show startling performance; join us to judge the benefits for yourself!

We also hear about exciting research using cloud-based AI and 5G connectivity to deliver live immersive experiences to a variety of consumer devices. Key to the experience is the viewers' ability to change their content viewpoint, with live rendering taking place in the cloud. The presentation focuses on the audio, which is object-based and AI-driven and carries with it the metadata necessary for personalised rendering of the scene. The capture of the background is also critical to the recreation of the audio scene; for this, the team chose second-order ambisonics accompanied by Serialised Audio Definition Model descriptive metadata. The presentation will explore detailed aspects of the audio processing and production. Altogether, a fascinating glimpse of the technology required to convey 360 audio for free-viewpoint XR!
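To give a flavour of what a second-order ambisonic background capture involves, the sketch below shows how a single mono source direction maps onto the nine ambisonic channels (ACN channel ordering, SN3D normalisation - a common convention, though the presentation does not specify which the team used). The function names are illustrative assumptions, not part of the research described above.

```python
import math

def soa_gains(azimuth_deg, elevation_deg):
    """Second-order ambisonic encoding gains (9 channels, ACN order, SN3D).

    Illustrative sketch only; the actual production chain described in the
    talk is not public, and its conventions may differ.
    """
    a = math.radians(azimuth_deg)
    e = math.radians(elevation_deg)
    s3 = math.sqrt(3) / 2
    return [
        1.0,                                  # ACN 0: W (omnidirectional)
        math.sin(a) * math.cos(e),            # ACN 1: Y
        math.sin(e),                          # ACN 2: Z
        math.cos(a) * math.cos(e),            # ACN 3: X
        s3 * math.sin(2 * a) * math.cos(e) ** 2,  # ACN 4: V
        s3 * math.sin(a) * math.sin(2 * e),       # ACN 5: T
        (3 * math.sin(e) ** 2 - 1) / 2,           # ACN 6: R
        s3 * math.cos(a) * math.sin(2 * e),       # ACN 7: S
        s3 * math.cos(2 * a) * math.cos(e) ** 2,  # ACN 8: U
    ]

def encode(mono_samples, azimuth_deg, elevation_deg):
    """Scale a mono signal by the 9 gains to produce 9 ambisonic channels."""
    gains = soa_gains(azimuth_deg, elevation_deg)
    return [[g * s for s in mono_samples] for g in gains]

# A source straight ahead (azimuth 0, elevation 0) excites only the
# W, X, R and U components; all lateral/vertical terms are zero.
channels = encode([1.0, 0.5, -0.25], azimuth_deg=0.0, elevation_deg=0.0)
```

In a real free-viewpoint system, these nine channels would travel alongside the object audio, with Serialised ADM metadata describing the scene so the cloud renderer can rotate and re-render it for each viewer's chosen viewpoint.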