Multi-camera systems are frequently used in applications such as panorama video creation, free-viewpoint rendering, and 3D reconstruction. A critical aspect of visual quality in these systems is that the cameras are closely synchronized. In our research, we require high-definition panorama videos generated in real time from several cameras operating in parallel. This is an essential part of our sports analytics system, Bagadus, which has several synchronization requirements. To stitch the individual camera streams into a panorama video (see Figure 1), the camera shutters must be synchronized; this was accomplished by building a custom trigger box. The cameras must also be synchronized with the sensor system to correctly identify the corresponding video frames and sensor records. Furthermore, the high-rate video streams must be transferred and processed in real time. Due to proper camera synchronization, the produced panoramas exhibit neither ghosting effects nor other visual inconsistencies at the seams. Each Bagadus installation combines the video from five 2K cameras into a single 50 fps cylindrical panorama video, and the system is currently in use for soccer games at the Alfheim stadium for Tromsø IL and at the Ullevaal stadium for the Norwegian national soccer team.

360° panorama video displayed through Virtual Reality (VR) glasses or on large screens offers immersive user experiences, but as such technology becomes commonplace, the need for efficient streaming of such high-bitrate videos arises. In this respect, 360° panorama video has recently received a great deal of attention, and many streaming methods have already been proposed. In this paper, we shed more light on the different trade-offs involved in saving bandwidth while preserving the video quality in the user's field-of-view (FoV), and we present an efficient end-to-end design and real-world implementation of such a 360° streaming system. Using 360° VR content delivered to a Gear VR head-mounted display with a Samsung Galaxy S7 and to a Huawei Q22 set-top box, we have tested various tiling schemes, analyzing the tile layout, the tiling and encoding overheads, mechanisms for faster quality switching beyond the DASH segment boundaries, and quality selection configurations. Furthermore, we go beyond the existing on-demand solutions and present a live streaming system that strikes a trade-off between bandwidth usage and the video quality in the user's FoV. We have created an architecture that combines RTP and DASH, and our system multiplexes a single HEVC hardware decoder to provide faster quality switching than at the traditional GOP boundaries. We demonstrate the performance and illustrate the trade-offs through real-world experiments, where we report bandwidth savings comparable to existing on-demand approaches, but with faster quality switches when the FoV changes.
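The FoV-dependent quality selection can be illustrated with a minimal sketch. Assuming the panorama is split into a grid of independently encoded tiles, each tile whose angular extent overlaps the viewer's current viewing direction is requested at high quality and the rest at low quality; the grid size, FoV values, and quality labels below are illustrative assumptions, not the configurations evaluated in the paper.

```python
def select_tile_qualities(yaw, pitch, hfov=100.0, vfov=100.0,
                          cols=8, rows=4, high="high", low="low"):
    """Pick a quality level per tile of a panorama, given the gaze direction.

    Tiles whose angular extent overlaps the viewer's field-of-view get the
    high-quality representation; all other tiles get the low-quality one.
    Grid layout, FoV, and quality labels are illustrative assumptions.
    """
    tile_w, tile_h = 360.0 / cols, 180.0 / rows
    choices = {}
    for r in range(rows):
        for c in range(cols):
            # Angular centre of the tile: yaw in [-180, 180), pitch in [-90, 90).
            t_yaw = -180.0 + (c + 0.5) * tile_w
            t_pitch = -90.0 + (r + 0.5) * tile_h
            # Shortest angular distance between tile centre and gaze direction.
            d_yaw = abs((t_yaw - yaw + 180.0) % 360.0 - 180.0)
            d_pitch = abs(t_pitch - pitch)
            visible = (d_yaw <= (hfov + tile_w) / 2.0 and
                       d_pitch <= (vfov + tile_h) / 2.0)
            choices[(r, c)] = high if visible else low
    return choices

# Example: viewer looking slightly left and up with a 100x100 degree FoV.
print(select_tile_qualities(yaw=-30.0, pitch=15.0))
```

In an on-demand DASH setting, a selection like this would normally be re-evaluated per segment; the point of the live architecture described above is to let a new selection take effect sooner than the next segment or GOP boundary.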
Even though the system was originally meant for the coaches, much of this functionality is also interesting for outside users. Our goal is to enable spectators in the stadium and supporters at home to use a part of the system with individual videos. This means that every user must be able to interact with the system in real time to generate their own personal view from a single viewpoint, and the scenario poses an array of challenges. Until recently, our virtual views were limited to cropping and scaling. For a better user experience, we need perspective correction, and to scale the system up to thousands of spectators in a stadium, we need a scalable way of generating and delivering the individual views.
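Perspective correction here means back-projecting the pixels of a virtual pinhole camera into the panorama instead of simply cropping a rectangle from it. The sketch below shows the general idea for a cylindrical panorama; the coordinate conventions, parameter names, and scaling are assumptions for illustration, not the Bagadus implementation.

```python
import numpy as np

def virtual_view_lookup(width, height, f, yaw, pitch, pano_w, pano_h, pano_f):
    """Map each pixel of a virtual perspective camera to cylindrical-panorama
    coordinates. Generic back-projection sketch; conventions are assumptions."""
    # Pixel grid of the virtual view, centred on the principal point.
    i, j = np.meshgrid(np.arange(width) - width / 2.0,
                       np.arange(height) - height / 2.0)
    # Rays through each pixel of a pinhole camera with focal length f.
    rays = np.stack([i, j, np.full_like(i, f)], axis=-1)

    # Rotate rays by the virtual camera's pitch (around x) and yaw (around y).
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    r_pitch = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    r_yaw = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rays = rays @ (r_yaw @ r_pitch).T

    # Cylindrical projection: azimuth angle and height on the unit cylinder.
    theta = np.arctan2(rays[..., 0], rays[..., 2])
    h = rays[..., 1] / np.hypot(rays[..., 0], rays[..., 2])

    # Scale to panorama pixel coordinates (panorama centre = forward direction).
    u = pano_f * theta + pano_w / 2.0
    v = pano_f * h + pano_h / 2.0
    return u, v  # sample the panorama at (u, v), e.g. with bilinear interpolation
```

A plain crop corresponds to reading the panorama pixels directly under the viewport; for wide or zoomed virtual views, a back-projection of this kind removes the curvature that a crop would show, at the cost of one coordinate lookup and interpolation per output pixel.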