This paper describes the concepts and results of the European FP7 Dreamspace project.
Dreamspace is developing a new platform and tools for the collaborative virtual production of visual effects in film and TV, and for new immersive experiences.
The aim of the project is to enable creative professionals to combine live performances, video and computer-generated imagery in real time. In particular, the project has developed tools for on-set manipulation of 3D assets, live integration of video feeds from tracked cameras, and live compositing of either CGI content or background plates from panoramic video captured by omnidirectional video rigs.
The CGI content is lit using automatically captured studio lighting and a new real-time global illumination rendering system. Furthermore, Dreamspace is investigating the use of omnidirectional video and 3D assets in new immersive user experiences.
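The lighting step can be pictured as image-based lighting: the captured studio light model drives the shading of virtual surfaces so real and virtual elements match. The sketch below is illustrative only (the function and uniform-sphere sampling scheme are our assumptions, not the Dreamspace renderer): a Monte Carlo estimate of Lambertian irradiance from captured probe samples.

```python
import math

def diffuse_irradiance(normal, probe_samples):
    """Estimate diffuse (Lambertian) irradiance at a surface point from a
    captured light probe, given as (unit direction, radiance) samples drawn
    uniformly over the sphere. Illustrative sketch, not the project's renderer."""
    total = 0.0
    for direction, radiance in probe_samples:
        # Lambert's cosine term: only light arriving from the front hemisphere
        cos_theta = sum(n * d for n, d in zip(normal, direction))
        if cos_theta > 0.0:
            total += radiance * cos_theta
    # Monte Carlo estimate: average the samples, scale by sphere area 4*pi
    return 4.0 * math.pi * total / len(probe_samples) if probe_samples else 0.0

# One sample straight down the surface normal, with unit radiance
E = diffuse_irradiance((0.0, 0.0, 1.0), [((0.0, 0.0, 1.0), 1.0)])
```

A real renderer would importance-sample the probe and handle specular transport as well; the point here is only that a captured light model feeds the shading of CGI content directly.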
The film and TV industry is seeking ways of producing audio-visual media that combine the real world, CGI and 3D animation at ever-increasing quality but lower cost.
The use of CGI in movie and TV productions has reached the point where it is hard to tell which parts of a production are real and which are virtual. However, the traditional two-phase approach of on-set filming followed by later integration of visual effects in post-production has proven to be a major bottleneck in terms of creativity and cost-effectiveness.
The European FP7 Dreamspace project (1) has developed new techniques and workflows to provide full creative control over the virtual (computer-generated) components in production with real-time visualisation, and continuity of the data and creative decisions through to post-production.
This includes: intuitive on-set manipulation of 3D assets; live camera tracking and compositing with depth for visualisation; and capture of on-set lighting and real-time global illumination rendering to harmonise real and virtual elements.
Panoramic video complements these tools by providing low-cost photo-real environments. The project has also explored the impact on film production, and the cross-over to new immersive experiences for live performance and installation art using emerging head-mounted displays and head-tracked projection screens.
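The live-compositing step mentioned above can be illustrated with the standard "over" operator, which layers a rendered CGI foreground (with its alpha coverage) onto a background plate. A minimal per-pixel sketch, with illustrative names rather than the project's actual API:

```python
def over(fg, alpha, bg):
    """Composite a premultiplied foreground pixel over a background pixel:
    out = fg + (1 - alpha) * bg, applied per colour channel."""
    return [f + (1.0 - alpha) * b for f, b in zip(fg, bg)]

# An opaque CGI pixel (alpha = 1) fully replaces the plate ...
opaque = over([0.8, 0.0, 0.0], 1.0, [0.1, 0.2, 0.3])
# ... while a fully transparent pixel (alpha = 0) passes the plate through.
transparent = over([0.0, 0.0, 0.0], 0.0, [0.1, 0.2, 0.3])
```

Compositing with depth, as described above, extends this by comparing per-pixel depth from the tracked camera so virtual elements can pass both in front of and behind real ones.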
This paper introduces the concepts developed in Dreamspace. The project brings together seven partners: leading research and commercial organizations in imaging, visual production and creative experiences.
The text provides an overview, followed by more detailed descriptions of data capture and processing, the collaborative virtual production environment, and some of the creative virtual productions carried out in the project.
Dreamspace has created an end-to-end pipeline for data capture, processing and rendering with control of virtual elements to produce a visualisation environment for use on set in both film and TV. It can also be part of an immersive space for installation or performance art.
Figure 1 shows how the technologies fit into a conventional production pipeline: capture of video back-plates using omnidirectional camera rigs; on-set capture of light models to harmonise real and virtual lights in rendering; real-time integration of real and virtual elements through live camera tracking with depth capture; high-performance global illumination rendering and real-time compositing for live visualisation; collaborative editing tools to control digital content; and finally, data continuity through to post-production to deliver the final camera track and final composite.
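The stages above can be sketched as an ordered per-frame pipeline. Everything in the sketch (stage names, the frame dictionary) is illustrative scaffolding for the Figure 1 flow, not the project's actual software:

```python
from typing import Callable, Dict, List

Frame = Dict[str, str]

# Illustrative per-frame stages following Figure 1 (names are assumptions)
def track_camera(frame: Frame) -> Frame:
    frame["camera"] = "tracked pose + depth"
    return frame

def apply_light_model(frame: Frame) -> Frame:
    frame["lighting"] = "captured on-set light model"
    return frame

def render_gi(frame: Frame) -> Frame:
    frame["cgi"] = "global-illumination render"
    return frame

def composite(frame: Frame) -> Frame:
    frame["output"] = "live composite for on-set visualisation"
    return frame

PIPELINE: List[Callable[[Frame], Frame]] = [
    track_camera, apply_light_model, render_gi, composite,
]

def process(frame: Frame) -> Frame:
    # Each stage enriches the frame state and hands it to the next
    for stage in PIPELINE:
        frame = stage(frame)
    return frame

result = process({"plate": "omnidirectional back-plate"})
```

The same ordering carries through to post-production: the camera track and composite produced live on set become the starting point for the final deliverables rather than being redone from scratch.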
The technologies and prototypes focus on providing greater control, streamlining the workflow and pipeline, connecting the different phases of production and delivering high quality content at a lower cost.
These targets span virtual production and the performance arts, where the challenge is the same: how to enhance creativity and experimentation through real-time visualization and interaction with accessible tools and workflows.