Abstract
Despite the long history of image and video stitching research, existing academic and commercial solutions still produce strong artifacts. In this work, we propose a wide-baseline video stitching algorithm for linear camera arrays that is temporally stable and tolerant to strong parallax. Our key insight is that stitching can be cast as a problem of learning a smooth spatial interpolation between the input videos. To solve this problem, inspired by pushbroom cameras, we introduce a fast pushbroom interpolation layer and propose a novel pushbroom stitching network, which learns a dense flow field to smoothly align the multiple input videos for spatial interpolation. Our approach outperforms the state-of-the-art by a significant margin, as we show with a user study, and has immediate applications in many areas such as virtual reality, immersive telepresence, autonomous driving, and video surveillance.
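The abstract frames stitching as warping the input views with a learned dense flow field and interpolating between them in the manner of a pushbroom camera. The sketch below is a rough illustration of that idea only, not the paper's implementation: it warps two views with hypothetical network-predicted flows and blends them with a simple per-column weight; the function names (`warp`, `pushbroom_blend`) and the linear column weighting are assumptions made for illustration.

```python
# Illustrative sketch (assumed, not the authors' code): warp two overlapping views
# with dense flow fields, then blend column by column so the output sweeps from
# the left view to the right view, as a pushbroom camera would.
import torch
import torch.nn.functional as F


def warp(image: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
    """Backward-warp `image` (N, C, H, W) with a dense flow field (N, 2, H, W) in pixels."""
    n, _, h, w = image.shape
    ys, xs = torch.meshgrid(
        torch.arange(h, dtype=image.dtype),
        torch.arange(w, dtype=image.dtype),
        indexing="ij",
    )
    grid = torch.stack((xs, ys), dim=-1).unsqueeze(0).expand(n, -1, -1, -1)  # (N, H, W, 2)
    grid = grid + flow.permute(0, 2, 3, 1)  # displace sampling locations by the flow (x, y)
    # Normalize coordinates to [-1, 1] as required by grid_sample.
    grid_x = 2.0 * grid[..., 0] / max(w - 1, 1) - 1.0
    grid_y = 2.0 * grid[..., 1] / max(h - 1, 1) - 1.0
    return F.grid_sample(image, torch.stack((grid_x, grid_y), dim=-1), align_corners=True)


def pushbroom_blend(left: torch.Tensor, right: torch.Tensor,
                    flow_l: torch.Tensor, flow_r: torch.Tensor) -> torch.Tensor:
    """Blend two flow-warped views with a per-column weight sweeping from left to right."""
    warped_l = warp(left, flow_l)
    warped_r = warp(right, flow_r)
    _, _, _, w = warped_l.shape
    # Weight 1.0 at the leftmost column, 0.0 at the rightmost (assumed linear sweep).
    alpha = torch.linspace(1.0, 0.0, w, dtype=left.dtype).view(1, 1, 1, w)
    return alpha * warped_l + (1.0 - alpha) * warped_r
```

In the paper the flow fields are produced by the proposed pushbroom stitching network rather than supplied by hand; the sketch only shows how a dense flow plus a column-wise sweep yields a smooth spatial interpolation between the views.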
| Original language | English |
| --- | --- |
| Publication status | Published - 2020 |
| Event | 30th British Machine Vision Conference, BMVC 2019 - Cardiff, United Kingdom; Duration: 2019 Sept 9 → 2019 Sept 12 |
Conference
| Conference | 30th British Machine Vision Conference, BMVC 2019 |
| --- | --- |
| Country/Territory | United Kingdom |
| City | Cardiff |
| Period | 2019 Sept 9 → 2019 Sept 12 |
Bibliographical note
Publisher Copyright: © 2019. The copyright of this document resides with its authors.
All Science Journal Classification (ASJC) codes
- Computer Vision and Pattern Recognition