The PBS Short Film Festival has returned in virtual form, part of a multi-platform initiative to increase the reach and visibility of independent filmmakers from across the US. Michael Burns looks at how the Digital Innovation team at PBS developed it as a VR experience using a range of AWS services.

The annual PBS Short Film Festival is currently running at a virtual screen very near you. Every day until 31 August 2021, audiences can watch a curated series of stories from independent filmmakers via an immersive WebXR beta experience called Screen on the Green.

PBS Short Film Festival: Runs until 31 August

This comprises 25 short films that play back-to-back and is accessible via compatible VR headsets, including Oculus Quest, or through a web browser. Up to 300 participants can enter the experience at a time and choose from one of two environments, a daytime setting with cityscape views or a moonlit outdoor landscape.

At the centre of each space sits a large-scale outdoor cinema screen, on which all 25 films play consecutively. As participants join, they can see the space populate with avatars and explore hidden Easter eggs, such as film posters or ways to support the not-for-profit US public broadcaster.

In previous years, the festival has held in-person screenings in the Washington, DC area, but pandemic restrictions caused the organisers to consider an alternative VR experience.

To create this, the Digital Innovation team at PBS took advantage of a host of technology from Amazon Web Services (AWS), including AWS Media Services, Amazon Simple Storage Service (Amazon S3) and Amazon CloudFront.

“Screen on the Green celebrates stories that are often underrepresented in filmmaking in a unique format that allows participants to experience these remarkable, emotional moments that the filmmakers intended together,” says Mikey Centrella, director of product management, Digital Innovation Team at PBS.

“Filmmaking is an art form that’s hard to recreate in a linear format. Rather than broadcast the films, or stream them via a social platform, we wanted to transport audiences to the cinema.”

“The pandemic provided a unique opportunity for us to test a VR format for the festival, especially as remote work has helped audiences become more comfortable with being immersed in a video experience, and VR technology has improved,” he adds.


Screen on the Green: Provides an immersive experience

Centrella is fully aware that not everyone out there can view the ‘Screen’ with VR hardware.

“Access to a VR headset can be a financial barrier. So we built an experience that could be available to anyone with an internet connection and computer, using WebGL and WebXR,” he says.
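The dual delivery Centrella describes, immersive VR where a headset is available and a WebGL view for everyone else, comes down to a capability check in the browser. The sketch below is a hypothetical illustration of that check, not PBS's actual client code; `nav` stands in for the browser's `navigator` object so the logic can be exercised outside a browser.

```javascript
// Hypothetical sketch of WebXR feature detection with a WebGL fallback.
// `nav` is a stand-in for the browser's `navigator` object.
async function chooseRenderMode(nav) {
  // WebXR Device API: navigator.xr only exists in XR-capable browsers.
  if (nav.xr && typeof nav.xr.isSessionSupported === "function") {
    // isSessionSupported resolves to true when an immersive VR session
    // (e.g. on an Oculus Quest) can actually be started.
    const vrOk = await nav.xr.isSessionSupported("immersive-vr");
    if (vrOk) return "immersive-vr";
  }
  // Anyone with an internet connection and a browser gets the WebGL view.
  return "desktop-webgl";
}
```

In a real page the chosen mode would drive whether the client calls `navigator.xr.requestSession("immersive-vr")` or simply renders the scene to a WebGL canvas.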

“We are trying to introduce viewers to a new way of watching PBS content in an immersive, virtual environment through the web, and to solicit feedback to inform future releases or new products in VR/AR. This will help us gauge the appetite for viewing PBS content in XR and, if the community has interest, the team will be better prepared to evaluate future PBS digital products and technologies in this space.”

That said, the VR concept does go some way to replicating the feel of a film festival in the physical world.

“You would be amazed how far a hand wave or body tilt can go in communicating feelings in VR,” says Centrella.

“This experience is intentionally all about watching the films together virtually like in a cinema, where complete strangers or best friends can be in the experience together simultaneously.”

While this shouldn’t be seen as similar to the ‘watch party’ phenomenon that’s taken the sports world by storm, the concept could lead to greater audience involvement. “Just like in a real movie theatre, you can’t, or really shouldn’t, talk to one another when the movie is playing, so at the moment participants cannot speak to each other,” says Centrella.

“If the audience finds value in viewing PBS content in VR, the digital team may consider adding new features, such as reactions, polls, audio chat or a live host like a filmmaker, at a later date or for a future product.”

Building the Screen

The PBS Innovation team chose AWS Cloud Services to support the effort. “[We wanted to] ensure we could design a virtual environment for streaming the PBS Short Film Festival that would be remarkable and stable,” says Centrella.

“The team had already been using AWS Cloud Services for its own sandbox to help us experiment, power and deploy prototypes in artificial intelligence, voice assistants, machine learning, virtual reality, augmented reality, live streaming and a recommendation engine,” he adds.

“But recently new services like Media Tailor and Channel Assembly became available, and they are uniquely suited to the use case of the Film Festival VR Experience.”

“We wanted participants to have a sense of community and spatial awareness of others alongside them,” Mikey Centrella, PBS

According to Dave Levy, AWS vice president of US Government, Nonprofit and Healthcare, the cloud giant collaborated closely with the PBS Digital Innovation Team from inception to final delivery of its VR festival, “working to understand the team’s vision and provide the support required to execute it”.

“Early on in the project’s development, PBS and AWS recognised that recreating the in-person theatre experience in VR meant that PBS would need a way to continuously stream the festival videos in the virtual world to hundreds of users in a synchronised fashion,” continues Levy.

“With an AWS cloud infrastructure built on services like Amazon S3 already established, the PBS Innovation team envisioned a file-based transcoding and channel assembly solution using AWS Elemental MediaConvert and AWS Elemental MediaTailor.”

All Screen on the Green video content is housed in PBS’ proprietary Media Manager solution, which distributes each short film’s MP4 file to Amazon S3 and Amazon CloudFront.

To create the film festival stream, the file-based video transcoding service MediaConvert transforms the MP4 files into HLS playlists. Channel Assembly within MediaTailor then uses those HLS playlists to build a linear stream, delivered to the experience as a single output.
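The assembly step above maps onto two Channel Assembly calls: one to create a looping channel, and one per film to schedule it behind the previous short. The sketch below builds the request payloads that would be passed to the AWS SDK v3 `CreateChannelCommand` and `CreateProgramCommand` from `@aws-sdk/client-mediatailor`; every name and value here is illustrative, not PBS's real configuration.

```javascript
// Illustrative MediaTailor Channel Assembly request payloads (all names,
// e.g. "festival-s3", are made up, not PBS's actual setup).

// A looping linear channel whose HLS manifest the player pulls.
function buildChannel(channelName) {
  return {
    ChannelName: channelName,
    PlaybackMode: "LOOP", // replay the shorts back-to-back, endlessly
    Outputs: [{
      ManifestName: "festival",
      SourceGroup: "shorts",
      HlsPlaybackConfiguration: { ManifestWindowSeconds: 60 },
    }],
  };
}

// One program per short film; the VOD sources are the HLS renditions
// MediaConvert produced from the original MP4s.
function buildProgram(channelName, filmId, afterFilmId) {
  return {
    ChannelName: channelName,
    ProgramName: filmId,
    SourceLocationName: "festival-s3", // points at the S3/CloudFront origin
    VodSourceName: filmId,
    ScheduleConfiguration: {
      Transition: {
        Type: "RELATIVE",                  // sequence order, not wall clock
        RelativePosition: "AFTER_PROGRAM", // queue behind the previous short
        RelativeProgram: afterFilmId,
      },
    },
  };
}
```

Chaining 25 of these `AFTER_PROGRAM` transitions is what turns independent VOD files into a single consecutive channel without any manual stitching.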

The stream gets ported into the video player supporting the WebXR experience, which lives on a web page hosted from an Amazon S3 bucket.

“Without MediaConvert and Channel Assembly in MediaTailor, turning the short films into a linear channel in HLS format to make this possible would have proven labour intensive and cost prohibitive,” says Levy.

“AWS’s MediaTailor technology allowed us to realise the vision of a fairly quickly deployed linear channel that scales across potentially hundreds of viewers,” agrees Centrella.

“Our Media Manager solution was built in house and is part of the bigger services picture at PBS; we utilised it to feed media to MediaTailor to create streams and to provide metadata for users to enjoy additional context about the films during their viewing experience.

“It’s also worth noting that the client, the part that users experience in their browser, is a component unto itself made of many smaller pieces that had to be built.”

The team also ensured that any time a participant logged on, they would experience the same moment together without delay.

“Amazon Elastic Container Service (ECS) and Amazon ElastiCache for Redis enabled the team to scale up its backend infrastructure and find the desired processing power and throughput required to deliver real-time viewer avatar poses, which make for a more authentic experience as users navigate the world and see other avatars in the space,” says Levy.


Screen on the Green: Celebrates underrepresented stories

“ElastiCache was a great fit for the goal of keeping hundreds of users in sync at one time,” says Centrella.

“We wanted participants to have a sense of community and spatial awareness of others alongside them. So we needed a way to collect and broadcast viewers’ real-time positions in a scalable way, but hit a wall with performance. We moved to a setup that would spread the computations across multiple instances.

“During development and load testing, we kept track of the server’s ability to keep up with the 33ms that we expected it to handle in order to broadcast 30 updates per second to all users. As we added more users during load testing, we monitored the cadence and identified that it was able to consistently meet that 33ms tempo up to about 600 users/connections. To be safe, we set a cap at 500 users, so the experience would always remain stable.”
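The 30-updates-per-second cadence Centrella describes can be sketched as a fixed-interval broadcast loop that measures its own drift against the 33ms budget. This is a hypothetical illustration, not PBS's actual server code; in production the fan-out between ECS instances would go through ElastiCache for Redis pub/sub, whereas here `subscribers` simply stands in for the connected clients.

```javascript
// Hypothetical sketch of a 30Hz avatar-pose broadcast loop.
const TARGET_HZ = 30;
const TICK_MS = Math.round(1000 / TARGET_HZ); // ≈33ms per broadcast

// Latest known pose per user, updated as clients report their movements.
const poses = new Map(); // userId -> { x, y, z, yaw }

function updatePose(userId, pose) {
  poses.set(userId, pose);
}

// Pack every avatar's current pose into one payload for the next tick.
function packTick(tickNumber) {
  return JSON.stringify({
    tick: tickNumber,
    users: Object.fromEntries(poses),
  });
}

// Broadcast loop: the drift measurement mirrors the cadence check the
// team monitored during load testing (positive drift = falling behind).
function startBroadcast(subscribers) {
  let tick = 0;
  let last = Date.now();
  return setInterval(() => {
    const now = Date.now();
    const drift = now - last - TICK_MS;
    last = now;
    const payload = packTick(tick++);
    for (const send of subscribers) send(payload, drift);
  }, TICK_MS);
}
```

Watching that drift value climb as connections are added is exactly how a cap like the 500-user limit gets chosen: the ceiling is wherever the server can no longer hold the tempo.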

Working together

AWS provided support and guidance to PBS throughout the process, but there were challenges. According to Centrella, one issue was “finding the right combination of AWS services and leveraging them to provide a reasonably low-latency real-time experience while having an inkling at best of the traffic we’d get”.

“Trying out new emerging technology doesn’t have to be complicated or cost prohibitive,” Dave Levy, AWS

“The proper configuration of AWS services to be functional and provide the scale we needed was probably more time-consuming than the actual authoring of the code,” he adds. “The AWS side of the experience was not particularly code heavy. For context, the Node.js [web app] server code is under 100 lines.”

Other challenges concerned the VR components, such as control aspects of the media viewing experience, debugging within the VR context, as well as “developing for desktop and VR users and trying to come up with a user experience that suits two very different ways of interacting in a 3D world”, adds Centrella.

But the reaction has been positive.

“We’ve been asking participants to leave feedback on their way out,” says Centrella. “75% of respondents said they would use this experience if it was available all the time to watch PBS content, not just films. Other comments are along the lines of: ‘Wow, super cool that I could access this via my laptop and wish I had a VR headset to experience this. Love the short films and diverse avatars!’”

“Internally, the AWS team is thrilled with the result, and we expect PBS audiences will be pleased with it as well,” says Levy. “The collaboration is representative of our continued commitment to helping M&E customers worldwide innovate to transform the entertainment experience.”


PBS’ Digital Innovation team used a host of technology from AWS

“As PBS demonstrated with this project, trying out new emerging technology doesn’t have to be complicated or cost prohibitive,” he adds.

“Even with a small team of technology developers, the scalability and flexibility of the cloud provides a strong foundation for experimentation and innovation that can ultimately help drive the development of more engaging audience experiences.”

The PBS team certainly remains focused on rapidly building and testing ideas. “Even though PBS may not have a VR channel yet, we’re not writing off the idea for the future and a lot of the work we’re doing with AWS will provide the groundwork for future innovation in this arena,” says Centrella.

And what advice would PBS give to similar organisations wishing to try something of this nature?

“Start by determining the essence of the experience. Who is it meant for? What is the one thing you want them to do?” says Centrella. “Then build out a feature set from there. But keep the list small. Make sure every feature somehow connects back to the main essence and, if it doesn’t, put it in the backlog.

“The sky’s the limit in VR, but that can be a problem. Being boundless can often result in having a product that doesn’t deliver for the end user and has features that just don’t make sense.”