Authors: Matthias Ziegler, Joachim Keinert, Nina Holzer, Thorsten Wolf, Tobias Jaschke, Ron op het Veld, Faezeh Sadat Zakeri and Siegfried Foessel

ABSTRACT

The advent of high-resolution head-mounted displays (HMDs) enables new applications for virtual and augmented reality. The ability to move and look around freely in a scene with six degrees of freedom creates the feeling of being part of a fascinating story. With today’s technology, however, such six-degrees-of-freedom content is most easily built from computer-generated imagery.

In order to make the story even more convincing, we propose to add photorealistic footage to the scene. Ultimately, this will allow live-action elements, such as real actors and famous places, to be integrated for a more believable experience that lets the spectator dive into the story.

In this work we present a corresponding lightfield workflow consisting of multi-camera capture, post-production and integration of live-action video into a VR environment. Thanks to high-quality depth maps, visual effects such as virtual camera movements, depth-guided colour correction and the integration of CGI elements are easily achievable, making it possible to generate both 2D movies and VR content.

INTRODUCTION

Today, the use of Computer-Generated Imagery (CGI) is a key element of movie production. Techniques like matchmoving allow actors to be placed in a computer-generated 3D environment. This requires the virtual camera to follow the path of the real camera in order to obtain a consistent sequence composed of live-action and CG elements. Until recently, the final result of such compositing was always a fixed sequence of 2D or stereo images; a spectator had no way to change their point of view on the scene.
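To make this constraint concrete, the following is a minimal sketch of the matchmoving idea under a simple pinhole model: a CG element only lands in the right place in the frame when it is projected with the pose recovered by the camera tracker for that frame. All intrinsics, poses and coordinates below are illustrative assumptions, not values from our pipeline.

```python
# Minimal matchmoving sketch: project a CG point with the *tracked*
# pose of the real camera so it appears where it would have been
# photographed in the live-action frame. Numbers are assumed.
import numpy as np

K = np.array([[1000.0,    0.0, 960.0],   # assumed intrinsics: focal length
              [   0.0, 1000.0, 540.0],   # and principal point, in pixels
              [   0.0,    0.0,   1.0]])

def project(point_world, R, t):
    """Pinhole projection of a 3D world point with camera pose (R, t)."""
    p_cam = R @ point_world + t          # world -> camera coordinates
    p_img = K @ p_cam                    # camera -> image plane
    return p_img[:2] / p_img[2]          # perspective divide -> pixel coords

# Pose recovered by the camera tracker for one frame (assumed values).
R_tracked = np.eye(3)
t_tracked = np.array([0.0, 0.0, 4.0])    # camera 4 m from the scene origin

cg_prop = np.array([0.2, -0.1, 0.0])     # CG element placed in the set
print(project(cg_prop, R_tracked, t_tracked))   # -> pixel position in frame
```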

This situation changed when the first head-mounted displays (HMDs) appeared. For the first time it became possible for the audience to experience virtual reality (VR) with full six degrees of freedom (6-DOF). A spectator can move freely through the scene with a corresponding change in perspective. This effect is known as motion parallax.
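As a back-of-the-envelope illustration of the effect: under a pinhole model, a lateral head movement of dx metres shifts a point at depth Z by roughly f · dx / Z pixels, so near objects shift more than distant ones. The numbers below are assumed purely for illustration.

```python
# Illustrative parallax estimate (assumed numbers): image shift in pixels
# for a lateral head movement is approximately f * dx / Z.
f_px = 1000.0        # assumed focal length in pixels
head_shift = 0.10    # spectator moves 10 cm sideways
for depth_m in (2.0, 20.0):
    shift_px = f_px * head_shift / depth_m
    print(f"point at {depth_m:4.1f} m shifts by {shift_px:4.1f} px")
# -> 50 px at 2 m, 5 px at 20 m: this depth-dependent shift is motion parallax
```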

A VR compositing workflow needs to deliver content that supports such 6-DOF viewing. As before, live-action content and CG content need to be consistent. For a long time, 6-DOF content that could be presented on such HMDs was limited to CGI. Against this background, we propose a novel workflow that provides an immersive VR experience for live-action video. Our workflow incorporates a portable camera array that captures an actor.

The captured footage is processed using a set of specifically designed plug-ins for the compositing software NUKE. This processing reconstructs a dense lightfield from the multi-camera input. Finally, we import the 6-DOF live-action content into the Unreal Engine (UE), which serves as the playback platform. In combination with standard 3D elements, we can create a 6-DOF VR experience that features significant head motion parallax in the natural scene elements.
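For illustration, here is a minimal, self-contained sketch of the depth-image-based rendering principle that makes such motion parallax possible: each pixel is unprojected using its depth value, the viewpoint is shifted, and the resulting 3D point is reprojected into the new view. This is our own simplified illustration (translation only, naive splatting, no hole filling), not the actual NUKE plug-ins or Unreal integration.

```python
# Sketch of depth-image-based rendering: synthesize a novel view from one
# image plus a per-pixel depth map. Simplifications: translation-only head
# movement, nearest-pixel splatting, no z-buffer, no disocclusion filling.
import numpy as np

def warp_to_novel_view(image, depth, K, t_new):
    """Forward-warp `image` (H, W, 3) with `depth` (H, W, metres) to a
    camera translated by `t_new` (3,), same orientation (assumption)."""
    H, W = depth.shape
    K_inv = np.linalg.inv(K)
    out = np.zeros_like(image)
    us, vs = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([us, vs, np.ones_like(us)], axis=-1).reshape(-1, 3).T
    rays = K_inv @ pix                      # unproject pixels to rays
    pts = rays * depth.reshape(1, -1)       # 3D points in camera space
    pts_new = pts - t_new.reshape(3, 1)     # shift the viewpoint by t_new
    proj = K @ pts_new                      # reproject into the new view
    u2 = np.round(proj[0] / proj[2]).astype(int)
    v2 = np.round(proj[1] / proj[2]).astype(int)
    ok = (u2 >= 0) & (u2 < W) & (v2 >= 0) & (v2 < H) & (proj[2] > 0)
    out[v2[ok], u2[ok]] = image.reshape(-1, 3)[ok]   # naive splat
    return out
```

A production pipeline must additionally handle occlusions and the disocclusion holes that open up behind foreground objects, which the naive splat above ignores.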

Download the full technical paper below.