The BBC’s research arm has been investigating ways to cover live events differently, employing cheaper workflows and streamlining technologies. BBC R&D’s Ian Wagdin tells IBC365’s Alana Foster all about the Nearly Live Production project.


Nearly live: Ian Wagdin operates the production tool at Edinburgh Fringe Festival 2015

The aim of the BBC R&D’s Nearly Live Production project is to understand whether technologies used for distribution and playback of media over the internet can also be used at the authoring stage to streamline workflows and give producers an accurate sense of what the final viewing experience will be.

In 2014, BBC R&D started looking into how low-cost production workflows could increase live event coverage by accepting trade-offs in broadcast latency and quality. It recognised that professional, live multi-camera coverage is not ideal or practical for all events and venues, particularly at large festivals. A shift in working was proposed that would allow technical and craft resources to be deployed more widely and enable the broadcaster to offer greater coverage of live events.

The 2015 Edinburgh Fringe Festival was used as a case study. Using a prototype production tool, codenamed Primer, three unmanned static UHD cameras and two unmanned static HD GoPro cameras were placed around the perimeter of the BBC venue. A lightweight video capture rig delivered images to a cloud system. Software was developed to allow a director to crop and cut between these shots over the web and produce good-quality coverage ‘nearly live’ at reduced cost.

Under Primer, the traditional roles of director, editor, vision-mixer and camera operator are combined into a single role, offering cost savings and efficiencies. The operator can perform camera cuts, reframe existing cameras and edit shot decisions to deliver an ‘almost live’ broadcast.

The production tool is inherently object-based; it preserves all original camera feeds and records all edit decisions. This permits the BBC to create new types of multi-screen, multi-feed experiences for audiences.
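As a rough illustration of what ‘object-based’ means here, the sketch below models an edit as a list of decisions kept separately from the untouched camera feeds. The structures, names and fields are illustrative assumptions, not the actual Primer data model.

```python
# Illustrative sketch of an object-based edit: the original feeds are kept
# intact and the edit is just a list of decisions over them. All names and
# fields are assumptions for illustration, not the actual Primer data model.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class CropDecision:
    """One shot choice: which feed to show, when, and how it is reframed."""
    start_s: float                          # timeline position where the shot begins (seconds)
    duration_s: float                       # how long the shot is held
    source_feed: str                        # identifier of the original camera feed
    crop: Tuple[int, int, int, int]         # (x, y, width, height) within the full-resolution frame


@dataclass
class EditDecisionList:
    """Feeds are preserved untouched; only the decisions change between outputs."""
    feeds: List[str]
    decisions: List[CropDecision] = field(default_factory=list)

    def add(self, decision: CropDecision) -> None:
        self.decisions.append(decision)
        self.decisions.sort(key=lambda d: d.start_s)


# Example: a wide shot from camera 1, then a close-up cropped out of camera 2.
edl = EditDecisionList(feeds=["cam1_uhd", "cam2_uhd", "gopro1_hd"])
edl.add(CropDecision(0.0, 12.0, "cam1_uhd", (0, 0, 3840, 2160)))
edl.add(CropDecision(12.0, 8.0, "cam2_uhd", (1200, 400, 1920, 1080)))
```

Because nothing is baked into a single output, the same feeds and decisions can later be re-rendered into different multi-screen or multi-feed experiences.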

Prime mover
Ian Wagdin, senior technology transfer manager at BBC R&D, is responsible for taking BBC technologies from the lab to businesses. He was able to tell IBC365 more about the project and how it has evolved. We covered the importance of new production workflows using IP and cloud technologies, wider issues around latency and synchronisation, and what is coming next.

IBC365: What are the BBC R&D’s Nearly Live Production project goals?
Ian Wagdin: We set out on the Nearly Live project to see how we could cover events differently. A big motivation for this, which has carried on into more recent projects like AI in Media Production, was the abundance of live performances and other events that the industry doesn’t have the capacity to cover. We were very aware of this when working at festivals with numerous stages and venues, only a small number of which could be covered by traditional methods.

We noticed a move towards covering things in new ways via social media and online, where it was not always cost-effective or viable to send large crews and infrastructure. It was also recognised that many events do not need to be truly ‘live’ but could perhaps be covered within a more relaxed timescale.

In simple terms, we used several fixed high-resolution cameras to capture video feeds and then identified regions of interest to crop to delivery resolutions. These recordings were then cached and played out when needed. By doing this, we could create a non-destructive output with wide shots and close-ups generated from a single camera angle.

“We used several fixed high-resolution cameras to capture video feeds and then identified regions of interest to crop to delivery resolutions”
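As a minimal sketch of that region-of-interest idea (assumed resolutions and function names, not BBC code), the snippet below cuts an HD-sized window out of a UHD frame, so a ‘close-up’ can be derived from a fixed wide camera without touching the source recording.

```python
# Illustrative sketch of the region-of-interest crop (assumed resolutions,
# not BBC code): an HD "close-up" is cut from a fixed UHD frame, leaving the
# source recording untouched.
import numpy as np

DELIVERY_W, DELIVERY_H = 1920, 1080        # assumed HD delivery resolution


def crop_region_of_interest(uhd_frame: np.ndarray, x: int, y: int) -> np.ndarray:
    """Return a 1920x1080 window from a larger frame, clamped to its bounds."""
    h, w = uhd_frame.shape[:2]
    x = max(0, min(x, w - DELIVERY_W))     # keep the window inside the frame
    y = max(0, min(y, h - DELIVERY_H))
    return uhd_frame[y:y + DELIVERY_H, x:x + DELIVERY_W].copy()


# A wide shot and a close-up can both be generated from the same fixed camera:
uhd_frame = np.zeros((2160, 3840, 3), dtype=np.uint8)   # stand-in for a captured 4K frame
close_up = crop_region_of_interest(uhd_frame, x=1200, y=500)
```

Because the crop already matches the delivery resolution, no upscaling is needed and the full-resolution capture remains available for other framings.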

IBC365: What technologies were core to the success of this project?
Ian Wagdin: On Primer we used a lot of different technologies for various aspects of the trials. At the core were three or four 4K-capable cameras whose feeds were captured using our IP Studio technologies. These then enabled us to use software-based tools to create content. This also led to us working with SMEs to develop web-based tools.

IBC365: How does the edit process work?
Ian Wagdin: The edit is a mix between simple vision mixing (switching) and then a correction of this mix if time allows. One feature of the system is the ability to reframe shots or, for example, alter their duration if you are ahead of the play line.

It should be remembered that the aim of the project was not to build a fully capable system, but to enable the basics in such a way that anyone with a laptop could connect to the media feed and cut between sources. In turn, this lowers the barrier to entry for multi-camera live video production.
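The ‘correction if time allows’ behaviour Wagdin describes could, in hypothetical form, look something like the sketch below: a cut can still be retimed as long as its start has not yet reached the play line. This is an illustrative assumption about the workflow, not the Primer implementation.

```python
# Hypothetical sketch of correcting the mix ahead of the play line: a cut can
# still be retimed as long as it has not yet been played out. Names and
# behaviour are illustrative assumptions, not the Primer implementation.
from dataclasses import dataclass
from typing import List


@dataclass
class Cut:
    start_s: float          # timeline position of the cut (seconds)
    duration_s: float
    source_feed: str


def retime_cut(cuts: List[Cut], index: int, new_duration_s: float,
               play_line_s: float) -> bool:
    """Change a cut's duration only if it is still ahead of the play line."""
    cut = cuts[index]
    if cut.start_s <= play_line_s:
        return False                        # already played out; too late to change
    shift = new_duration_s - cut.duration_s
    cut.duration_s = new_duration_s
    for later in cuts[index + 1:]:          # ripple the change through later cuts
        later.start_s += shift
    return True


cuts = [Cut(0.0, 10.0, "cam1"), Cut(10.0, 5.0, "cam2"), Cut(15.0, 20.0, "cam3")]
retime_cut(cuts, index=1, new_duration_s=8.0, play_line_s=4.0)   # allowed: cut 1 starts at 10 s
```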

IBC365: How has Primer evolved over the years?
Ian Wagdin: The project was an early precursor to a number of current projects. We are using the output to inform current work, such as camera framing in AI in Media Production, as well as looking at the connectivity elements which inform our 5G in production work.

The concept of ‘nearly live’ also has an impact on cloud-based production and lightweight live workflows.

IBC365: How important are new production workflows using IP and cloud technologies? How do the production values differ from traditional workflows?
Ian Wagdin: They are very important. We will migrate to IP over time and the benefits of cloud-based and remote working enable us to scale to meet demand. This will play a part in how and what we cover, as it is simply not viable for us to cover the range of events that we would like to. The benefits of this technology allow potential coverage of any number of events – from council meetings to sport.

Production values are set by the editorial vision, as well as the skill and experience of the storyteller. As technology evolves, it enables them to tell more stories in different ways to a wider audience. It was therefore important that we used user-centric design in developing our prototypes.

Skilled professional craft is crucial in good-quality coverage, but it’s also a limited resource. One of our research goals was to amplify the effectiveness and confidence of people with less experience, for example by experimenting with features such as being able to go back and alter cuts. This doesn’t reduce the value of experienced professionals but might allow lower-profile performances to be covered at reasonable quality.

The editorial vision and budget may vary by either the delivery platform or event itself, but the aim is to enable more people to access live video as a tool.

IBC365: What are the issues with latency and synchronisation for live IP productions?
Ian Wagdin: One of the key drivers for this project was to remove latency from the equation when looking at live remote production. Latency has always been an issue for these types of workflows, so we set out to approach it in a different way. This meant capturing the live video streams and then applying software tools to create the finished product.

If you are happy that a concert, council meeting or a comedy show goes out half an hour after the event, then this is not a problem. Of course, one major issue was not being able to ‘direct’ camera operators. But, by using a single operator to follow the ‘action’ and combining this with fixed cameras, we were able to widen the types of events covered.

“One of the key drivers for this project was to remove latency from the equation when looking at live remote production”
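A minimal sketch of that ‘capture now, play out later’ approach, under assumed parameters such as the half-hour editing window, might look like the following; the names and structure are illustrative, not actual BBC code.

```python
# Minimal sketch of delayed, "nearly live" playout. The half-hour window,
# names and structure are illustrative assumptions, not BBC code.
import time
from collections import deque
from typing import List, Optional

EDIT_DELAY_S = 30 * 60                     # assumed editing window between capture and playout

captured: deque = deque()                  # (capture_time, segment) pairs, oldest first


def ingest(segment: bytes) -> None:
    """Store a captured segment together with its capture timestamp."""
    captured.append((time.time(), segment))


def release_ready_segments(now: Optional[float] = None) -> List[bytes]:
    """Return every segment whose editing window has elapsed, in capture order."""
    now = time.time() if now is None else now
    ready: List[bytes] = []
    while captured and now - captured[0][0] >= EDIT_DELAY_S:
        ready.append(captured.popleft()[1])
    return ready
```

The delay buys the operator time to switch, reframe and correct the cut before anything reaches the audience.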

We have many years of experience with latency and synchronisation in live IP, going back to our work on the 2012 Olympics and 2014 Commonwealth Games, and we used this knowledge to underpin this project as it evolved.

IBC365: What success has the project had in wider deployment beyond R&D?
Ian Wagdin: Working with our production colleagues in the BBC was a crucial enabler for the Nearly Live project and all our production workflow R&D. We took various iterations of the project to a number of events, and in 2018 we used the system to provide coverage for the Great Exhibition of the North in Newcastle.

BBC Scotland were amazing and gave us access to events over a number of years – from covering stages at T in the Park that would otherwise not have been covered, to venues at the Edinburgh Festival Fringe that could not have been covered with existing technologies.

Beyond the BBC, individual venues and performers, particularly at the Fringe, were really helpful and supportive.

“The outcomes from this project have formed vital knowledge in how this type of production may operate”

IBC365: What are the next steps?
Ian Wagdin: The outcomes from this project have formed vital knowledge in how this type of production may operate. We have seen similar commercial offerings emerge and therefore it does not make sense for us to continue prototyping in this area – our job is to stimulate thinking and investigate new ways of working, not to build a product that competes with the wider industry.

The learnings have gone on to inform our work across a number of areas, including AI in production, IP and cloud-based working, as well as 5G in production. We now know that these types of tools are useful, and we are working on how we can improve and scale them to enable the vision of full remote object-based production using software-based tools.