Establishing visual correspondence is one of the most fundamental tasks in many applications of computer vision. In this paper, we propose a robust image matching method that addresses affine variations between two images taken from different viewpoints. Unlike the conventional approach, which finds correspondences via local feature matching on fully affine-transformed images and thus produces many outliers at a high computational cost, our approach first finds a single global correspondence and then uses local feature matching to estimate the most reliable inliers between the two images. To estimate the global image correspondence quickly while varying the affine transformation between the reference and query images in affine space, we employ a Bhattacharyya similarity measure between the two images. Furthermore, an adaptive tree over the affine transformation model is employed to dramatically reduce the computational complexity. Our approach produces satisfactory results for severely affine-transformed images while requiring very little computation time. Experimental results show that the proposed affine-invariant image matching is at least twice as fast as state-of-the-art methods and provides better correspondence performance under viewpoint changes.
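As a minimal illustration of the similarity measure named above (not the authors' implementation, and making the common assumption that the measure is computed over normalized intensity histograms), the Bhattacharyya coefficient between two grayscale images can be sketched as:

```python
import numpy as np

def bhattacharyya_similarity(img_a, img_b, bins=64):
    """Bhattacharyya coefficient between the normalized intensity
    histograms of two grayscale images.

    Returns a value in [0, 1]; 1 means identical distributions.
    The bin count and the histogram-based formulation are
    illustrative assumptions, not taken from the paper.
    """
    ha, _ = np.histogram(img_a, bins=bins, range=(0, 256))
    hb, _ = np.histogram(img_b, bins=bins, range=(0, 256))
    pa = ha / ha.sum()  # normalize to probability distributions
    pb = hb / hb.sum()
    return float(np.sum(np.sqrt(pa * pb)))

# Hypothetical usage with a random "image": an image compared
# with itself has identical histograms, so the coefficient is 1.
rng = np.random.default_rng(0)
a = rng.integers(0, 256, size=(64, 64))
print(round(bhattacharyya_similarity(a, a), 3))  # → 1.0
```

In a matching pipeline of the kind described, such a score would be evaluated for candidate affine transformations of the query image, and the transformation maximizing the similarity would give the global correspondence.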