The film and TV industry is seeking ways of producing audio-visual media that combine the real world, CGI and 3D animation at ever-increasing quality but lower cost.
This paper describes a search for natural ways to make responsive media, investigating the techniques used in oral performances and contrasting these with approaches that start from a fixed, recorded form.
In this paper Chouette Films explores the potential of virtual reality (VR) technology to be used in the preservation of cultural rituals and heritage for posterity.
In this paper, we explore the challenges that must be overcome to create colour-accurate immersive experiences at a cinematic quality level.
The capability of cameras and displays to reproduce small differences in luminance levels is constantly growing. However, we are still dealing with a limitation of the human visual system (HVS) known as the simultaneous contrast range (SCR).
In this paper, the author presents current work undertaken by the BBC to look at the representation of human skin tone in ITU-R BT.2100 High Dynamic Range (HDR) video systems.
This paper provides a technical/historical overview of the acquisition, coding and rendering technologies considered in the MPEG-I standardisation activities for immersive applications.
Virtual reality (VR) sickness seems to be one of the main limitations to the large-scale adoption of VR technologies. It therefore seems relevant to measure users’ physiological data in order to prevent and reduce VR sickness.
Volumetric video is regarded worldwide as the next important development step in media production. Especially in the context of rapidly evolving Virtual and Augmented Reality markets, volumetric video is becoming a key technology.
There are benefits to using audio as a primary method of delivering new experiences; particularly during a sports event, it is the audio that best engages consumers with the atmosphere of the stadium.
The National Hot Rod Association (NHRA) approached Dolby Laboratories in early 2017 to explore what value immersive audio might bring to the NHRA’s television broadcasts.
NHK has developed a means of automatically generating auxiliary audio descriptions from metadata for use in live TV sports programs.
Case study: The BBC has over forty VR, 360 video and immersive audio experiments that teams across the corporation have developed, covering a broad range of topics.
The advent of high-resolution head-mounted displays (HMDs) enables new applications for virtual and augmented reality.
Forensic watermarking is a process of embedding a unique identifier for each streaming video session into the video content.
The adoption of IP networks in video production environments opens up the application of architectural patterns from enterprise IT to the domain of broadcast and media production.
Moving to greater use of IP in broadcast through innovative and practical methods, rather than simply replacing existing SDI connections with IP ones.
Creating tools and workflows for object-based media production
360-degree filmmaking necessitates a new language for storytelling; this case study is based on material from two user studies of a 360 video profile of an artist.
The ORPHEUS research project aims to invent new workflows for producing, broadcasting and playing back object-oriented audio content.
Sports broadcasters benefit from more adaptive, scene-based audio capture and rendering, commonly referred to as Next Generation Audio (NGA).
Object-based media is a revolutionary approach for creating and deploying interactive, personalised, scalable and immersive content.
Since the beginning of TV production, channel-based audio mixing has been the norm.
In today’s TV production environment, audio can take two forms: embedded audio and independent audio.
Sports as a content genre can be characterized using certain keywords: live, unscripted drama, conversational, loyal audiences.
In recent years, patterns of media consumption have been changing rapidly.
Profound change to the way media is produced, distributed and consumed is upon us.
This paper gives an overview of recent experimental services proposed by a group of European broadcasters exploring the potential of a hybrid approach for audio in radio.
Virtual reality (VR), video, and video games are converging. Movies and games are getting closer, sharing techniques and content.
Next-generation video technologies include a variety of features – 4K resolution, high frame rates, wide color gamut, and high dynamic range.
Sport coverage on radio is popular, but has specific requirements: people do other things while listening to radio, so any enhancements to the audio experience must respect this.
It is only through the use of screen technologies recently developed for smartphones that VR headsets have become a potential mainstream consumer product.
For the past three decades, FM broadcasters have been engaged in what has become known as the “loudness wars”.
The MPEG-DASH ecosystem is growing quickly, and a significant portion of the content being prepared and delivered is protected content.
How AES-67, the new audio-over-IP standard, will bring about the convergence of telecommunications, studio audio, and intercom.
At present, audio technology leverages audio-over-IP technology at the basic network transport level, but is not taking advantage of all the benefits that are possible.
A lot has been written about the techniques and methods employed in creating a stereo mix for music.
The plethora of digital platforms makes information available in a great number of languages, and the expectation of audiences to be able to consume media in their own languages is growing.
In 2014 BBC R&D presented an IBC paper on object-based broadcasting: the representation of media content by a set of individual assets together with metadata describing their relationships and associations, and the ability to bring these back together to make new content experiences.
VR represents an entirely new way for consumers to experience video. No longer is the TV viewer or game player a passive participant in the action; VR video simulates the experience of entering the video content itself, with the ability to see a full 360 degrees in any direction.
The LiveIP project is a practical exploration of the possibilities and opportunities achievable with today’s IP enabled broadcast technology in a live production environment.
The move to all-IP production has generated a lot of interest from broadcasters as a means of both saving costs and enabling innovation in production methods.
With the advent of distributed systems, where every node has sufficient local resources to perform given tasks partly independently of all other units, reliable and timely data communication became a mandatory requirement, together with the need for a common notion of time or at least a method ...
360° video is a special case of virtual reality (VR) in which the audience views a sphere (or near-sphere) of video centred on a single position.
The Centro Televisivo Vaticano (CTV) is the TV broadcasting centre of the Secretariat of Communication of the Holy See and its main goal is to contribute to spreading the universal message of the Pope through the production, distribution and archiving of TV material covering the activities of the Pope and ...
The value of metadata is now completely accepted for archiving and in many areas of production, particularly news, where it is crucial to enabling people to search, find and manage content quickly and easily.