Establishing visual correspondence is one of the most fundamental tasks in many computer vision applications. In this paper, we propose a robust image-matching method to address affine variations between two images taken from different viewpoints. Unlike the conventional approach, which finds correspondences by local feature matching on fully affine-transformed images and thus produces many outliers through a time-consuming scheme, our approach first finds a single global correspondence and then uses local feature matching to estimate the most reliable inliers between the two images. To estimate the global image correspondence quickly while varying the affine transformation in the affine space of the reference and query images, we employ a Bhattacharyya similarity measure between the two images. Furthermore, an adaptive tree with an affine transformation model dramatically reduces the computational complexity. Our approach yields satisfactory results on severely affine-transformed images while requiring very little computation time. Experimental results show that the proposed affine-invariant image matching is at least twice as fast as state-of-the-art methods and provides better correspondence performance under viewpoint changes.
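As a concrete illustration of the similarity measure named in the abstract, the Bhattacharyya coefficient between two discrete distributions (e.g., image histograms) can be sketched as below. The 4-bin histograms and function name are hypothetical; the abstract does not specify the paper's actual feature representation or histogram binning.

```python
import math

def bhattacharyya_coefficient(p, q):
    """Bhattacharyya coefficient between two discrete distributions.

    p and q are histograms (non-negative counts of equal length); both are
    normalized to sum to 1 before comparison. The result lies in [0, 1],
    with 1 indicating identical distributions.
    """
    sp, sq = sum(p), sum(q)
    return sum(math.sqrt((a / sp) * (b / sq)) for a, b in zip(p, q))

# Hypothetical 4-bin grayscale histograms of a reference image and a
# warped query image (values chosen only for illustration).
ref_hist = [4, 8, 6, 2]
query_hist = [3, 9, 5, 3]

print(round(bhattacharyya_coefficient(ref_hist, ref_hist), 3))    # 1.0 (identical)
print(round(bhattacharyya_coefficient(ref_hist, query_hist), 3))  # 0.994 (very similar)
```

In a matching pipeline of this kind, the coefficient would be evaluated for candidate affine transformations of the query image, and the transformation maximizing the similarity would be kept as the global correspondence.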
Title of host publication: 2015 IEEE International Conference on Image Processing, ICIP 2015 - Proceedings
Publisher: IEEE Computer Society
Number of pages: 5
Publication status: Published - 2015 Dec 9
Event: IEEE International Conference on Image Processing, ICIP 2015 - Quebec City, Canada
Duration: 2015 Sep 27 → 2015 Sep 30
Series name: Proceedings - International Conference on Image Processing, ICIP
Bibliographical note: Publisher Copyright © 2015 IEEE.
All Science Journal Classification (ASJC) codes
- Computer Vision and Pattern Recognition
- Signal Processing