This paper presents the software that Phillips Group Innovation developed for robust multi-camera calibration and calibration monitoring, together with experimental results for both artificial and real captured data.
Dense multi-camera setups consisting of hundreds of low-cost cameras positioned around a sports arena could provide spectacular look-around effects and close-up views using computational photography techniques. In addition, the availability of many viewing angles enables detailed computer vision and sports analytics.
However, a practical difficulty is calibrating these large-scale setups before a match and keeping the system calibrated, in real time, during a match. External factors such as a cheering crowd, wind, or a passing car can cause mechanical vibrations of the small cameras, and even the smallest rotation will cause errors in rendering or analysis.
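To see why even a tiny rotation matters, a rough back-of-the-envelope check (not from the paper; the focal length and vibration angle below are illustrative assumptions) shows the image-plane shift caused by a small camera rotation: for a point near the optical axis it is approximately the focal length in pixels times the tangent of the rotation angle.

```python
import math

def pixel_shift(focal_px: float, rot_deg: float) -> float:
    """Approximate image-plane shift (pixels) caused by a small camera rotation.

    For a point near the optical axis, rotating the camera by rot_deg about an
    axis perpendicular to the viewing direction shifts the point's projection
    by roughly focal_px * tan(rot_deg).
    """
    return focal_px * math.tan(math.radians(rot_deg))

# Hypothetical numbers: a 2000-pixel focal length (plausible for an HD camera
# with a moderate field of view) and a barely perceptible 0.1-degree vibration.
shift = pixel_shift(2000.0, 0.1)
print(f"{shift:.1f} px")  # about 3.5 px of displacement
```

A shift of several pixels is already far larger than the sub-pixel accuracy that view interpolation and 3D analysis typically require, which is why continuous calibration monitoring is needed.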
In this paper, we investigate how to best initially calibrate a dense multi-camera array and how to keep it calibrated while the system is live. We discuss the software that we developed for robust multi-camera calibration and calibration monitoring and present experimental results for both artificial and real captured data.
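A common way to monitor calibration quality is to track the reprojection error of each camera: project known 3D points through the current camera model and compare against their detected image positions. The sketch below is a minimal pinhole-model version of this idea, not the paper's actual implementation; the function names and the 1-pixel threshold are illustrative assumptions.

```python
import numpy as np

def reprojection_error(K, R, t, pts3d, pts2d):
    """Mean reprojection error (pixels) of a pinhole camera.

    K: 3x3 intrinsic matrix, R: 3x3 rotation, t: 3-vector translation,
    pts3d: (N, 3) world points, pts2d: (N, 2) observed image points.
    """
    cam = (R @ pts3d.T).T + t            # world -> camera coordinates
    proj = (K @ cam.T).T                 # camera -> homogeneous image coords
    proj = proj[:, :2] / proj[:, 2:3]    # perspective divide
    return float(np.mean(np.linalg.norm(proj - pts2d, axis=1)))

def needs_recalibration(err_px, threshold_px=1.0):
    # Flag a camera whose error has drifted above a pixel-level threshold.
    return err_px > threshold_px
```

In a live setup, `pts2d` would come from tracked scene features (field lines, markers), and a camera flagged by `needs_recalibration` would have its pose re-estimated while the others keep streaming.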
Placing many low-cost cameras around a sports field brings new possibilities for 3D sports analysis and immersive viewing. For instance, being able to select the best perspective from more than 100 cameras will lead to better analysis and judgement calls for critical moments in sports. For the consumer, the large number of cameras means that immersive look-around effects can be produced using depth image-based rendering. The result can be viewed on a smartphone or even on a virtual reality headset for an immersive experience. Figure 1 shows a system diagram with algorithms (left) and an experimental outdoor six-camera setup that we built (right).