DASC: Robust Dense Descriptor for Multi-Modal and Multi-Spectral Correspondence Estimation

Seungryong Kim, Dongbo Min, Bumsub Ham, Minh N. Do, Kwanghoon Sohn

Research output: Contribution to journal › Article

8 Citations (Scopus)

Abstract

Establishing dense correspondences between multiple images is a fundamental task in many applications. However, finding reliable correspondences between multi-modal or multi-spectral images remains unsolved due to their challenging photometric and geometric variations. In this paper, we propose a novel dense descriptor, called dense adaptive self-correlation (DASC), to estimate dense multi-modal and multi-spectral correspondences. Based on the observation that self-similarity within an image is robust to imaging modality variations, we define the descriptor as a series of adaptive self-correlation similarity measures between patches sampled by randomized receptive field pooling, in which the sampling pattern is obtained through discriminative learning. The computational redundancy of dense description is dramatically reduced by applying fast edge-aware filtering. Furthermore, to address geometric variations including scale and rotation, we propose a geometry-invariant DASC (GI-DASC) descriptor that effectively leverages the DASC through a superpixel-based representation. For a quantitative evaluation of the GI-DASC, we build a novel multi-modal benchmark under varying photometric and geometric conditions. Experimental results demonstrate the outstanding performance of the DASC and GI-DASC in many cases of dense multi-modal and multi-spectral correspondence.
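The core self-correlation idea in the abstract can be illustrated with a minimal sketch: describe a pixel by the normalized cross-correlations between patch pairs sampled at offsets inside its local support window. The function names, the fixed offset pairs, and the plain NCC aggregation below are illustrative stand-ins for the paper's learned sampling pattern and edge-aware filtering, not the authors' implementation:

```python
import numpy as np

def ncc(p, q, eps=1e-8):
    """Zero-mean normalized cross-correlation between two patches."""
    p = p - p.mean()
    q = q - q.mean()
    denom = np.sqrt((p * p).sum() * (q * q).sum()) + eps
    return float((p * q).sum() / denom)

def self_correlation_descriptor(img, x, y, pairs, patch=3):
    """Descriptor at (x, y): NCC between patch pairs sampled at fixed
    offsets inside the local support window. The offsets in `pairs`
    are hypothetical; DASC learns them discriminatively."""
    r = patch // 2
    desc = []
    for (dx1, dy1), (dx2, dy2) in pairs:
        # Extract the two patches centered at the offset positions.
        p = img[y + dy1 - r:y + dy1 + r + 1, x + dx1 - r:x + dx1 + r + 1]
        q = img[y + dy2 - r:y + dy2 + r + 1, x + dx2 - r:x + dx2 + r + 1]
        desc.append(ncc(p, q))
    return np.array(desc)

# Example: a pixel on a smooth ramp, with two illustrative offset pairs.
img = np.arange(49, dtype=float).reshape(7, 7)
pairs = [((0, 0), (0, 0)), ((-1, 0), (1, 0))]
d = self_correlation_descriptor(img, 3, 3, pairs)
```

Because NCC is invariant to affine intensity changes (`a * img + b` yields the same values), such a descriptor is insensitive to the per-modality gain and bias variations the abstract targets; the paper's adaptive self-correlation measure adds edge-aware weighting on top of this.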

Original language: English
Article number: 7585122
Pages (from-to): 1712-1729
Number of pages: 18
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 39
Issue number: 9
DOI: 10.1109/TPAMI.2016.2615619
Publication status: Published - 1 Sep 2017

Fingerprint

  • Descriptors
  • Correspondence
  • Geometry
  • Invariant
  • Redundancy
  • Receptive Field
  • Multispectral Images
  • Pooling
  • Sampling
  • Quantitative Evaluation
  • Imaging techniques
  • Self-similarity
  • Similarity Measure
  • Leverage
  • Modality
  • Patch
  • Filtering
  • Imaging
  • Benchmark
  • Series

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Vision and Pattern Recognition
  • Computational Theory and Mathematics
  • Artificial Intelligence
  • Applied Mathematics

Cite this

@article{aeeb4f8a41f54ae0ad5e19578fe8359c,
title = "DASC: Robust Dense Descriptor for Multi-Modal and Multi-Spectral Correspondence Estimation",
abstract = "Establishing dense correspondences between multiple images is a fundamental task in many applications. However, finding reliable correspondences between multi-modal or multi-spectral images remains unsolved due to their challenging photometric and geometric variations. In this paper, we propose a novel dense descriptor, called dense adaptive self-correlation (DASC), to estimate dense multi-modal and multi-spectral correspondences. Based on the observation that self-similarity within an image is robust to imaging modality variations, we define the descriptor as a series of adaptive self-correlation similarity measures between patches sampled by randomized receptive field pooling, in which the sampling pattern is obtained through discriminative learning. The computational redundancy of dense description is dramatically reduced by applying fast edge-aware filtering. Furthermore, to address geometric variations including scale and rotation, we propose a geometry-invariant DASC (GI-DASC) descriptor that effectively leverages the DASC through a superpixel-based representation. For a quantitative evaluation of the GI-DASC, we build a novel multi-modal benchmark under varying photometric and geometric conditions. Experimental results demonstrate the outstanding performance of the DASC and GI-DASC in many cases of dense multi-modal and multi-spectral correspondence.",
author = "Kim, Seungryong and Min, Dongbo and Ham, Bumsub and Do, {Minh N.} and Sohn, Kwanghoon",
year = "2017",
month = "9",
day = "1",
doi = "10.1109/TPAMI.2016.2615619",
language = "English",
volume = "39",
pages = "1712--1729",
journal = "IEEE Transactions on Pattern Analysis and Machine Intelligence",
issn = "0162-8828",
publisher = "IEEE Computer Society",
number = "9",

}

DASC: Robust Dense Descriptor for Multi-Modal and Multi-Spectral Correspondence Estimation. / Kim, Seungryong; Min, Dongbo; Ham, Bumsub; Do, Minh N.; Sohn, Kwanghoon.

In: IEEE transactions on pattern analysis and machine intelligence, Vol. 39, No. 9, 7585122, 01.09.2017, p. 1712-1729.

Research output: Contribution to journal › Article

TY - JOUR

T1 - DASC

T2 - Robust Dense Descriptor for Multi-Modal and Multi-Spectral Correspondence Estimation

AU - Kim, Seungryong

AU - Min, Dongbo

AU - Ham, Bumsub

AU - Do, Minh N.

AU - Sohn, Kwanghoon

PY - 2017/9/1

Y1 - 2017/9/1

N2 - Establishing dense correspondences between multiple images is a fundamental task in many applications. However, finding reliable correspondences between multi-modal or multi-spectral images remains unsolved due to their challenging photometric and geometric variations. In this paper, we propose a novel dense descriptor, called dense adaptive self-correlation (DASC), to estimate dense multi-modal and multi-spectral correspondences. Based on the observation that self-similarity within an image is robust to imaging modality variations, we define the descriptor as a series of adaptive self-correlation similarity measures between patches sampled by randomized receptive field pooling, in which the sampling pattern is obtained through discriminative learning. The computational redundancy of dense description is dramatically reduced by applying fast edge-aware filtering. Furthermore, to address geometric variations including scale and rotation, we propose a geometry-invariant DASC (GI-DASC) descriptor that effectively leverages the DASC through a superpixel-based representation. For a quantitative evaluation of the GI-DASC, we build a novel multi-modal benchmark under varying photometric and geometric conditions. Experimental results demonstrate the outstanding performance of the DASC and GI-DASC in many cases of dense multi-modal and multi-spectral correspondence.

AB - Establishing dense correspondences between multiple images is a fundamental task in many applications. However, finding reliable correspondences between multi-modal or multi-spectral images remains unsolved due to their challenging photometric and geometric variations. In this paper, we propose a novel dense descriptor, called dense adaptive self-correlation (DASC), to estimate dense multi-modal and multi-spectral correspondences. Based on the observation that self-similarity within an image is robust to imaging modality variations, we define the descriptor as a series of adaptive self-correlation similarity measures between patches sampled by randomized receptive field pooling, in which the sampling pattern is obtained through discriminative learning. The computational redundancy of dense description is dramatically reduced by applying fast edge-aware filtering. Furthermore, to address geometric variations including scale and rotation, we propose a geometry-invariant DASC (GI-DASC) descriptor that effectively leverages the DASC through a superpixel-based representation. For a quantitative evaluation of the GI-DASC, we build a novel multi-modal benchmark under varying photometric and geometric conditions. Experimental results demonstrate the outstanding performance of the DASC and GI-DASC in many cases of dense multi-modal and multi-spectral correspondence.

UR - http://www.scopus.com/inward/record.url?scp=85029365917&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85029365917&partnerID=8YFLogxK

U2 - 10.1109/TPAMI.2016.2615619

DO - 10.1109/TPAMI.2016.2615619

M3 - Article

VL - 39

SP - 1712

EP - 1729

JO - IEEE Transactions on Pattern Analysis and Machine Intelligence

JF - IEEE Transactions on Pattern Analysis and Machine Intelligence

SN - 0162-8828

IS - 9

M1 - 7585122

ER -