Real-time disparity estimation using foreground segmentation for stereo sequences

Hansung Kim, Dong Bo Min, Shinwoo Choi, Kwanghoon Sohn

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)

Abstract

We propose a fast disparity estimation algorithm using background registration and object segmentation for stereo sequences from fixed cameras. Dense background disparity information is calculated in an initialization step, so that only the disparities of moving object regions are updated in the main process. We propose a real-time segmentation technique using background subtraction and interframe differences, and a hierarchical disparity estimation using a region-dividing technique and shape-adaptive matching windows. Experimental results show that the proposed algorithm provides accurate disparity vector fields with an average processing speed of 15 frames/s for 320 × 240 stereo sequences on an ordinary PC.
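The segmentation step described in the abstract combines background subtraction with interframe differences. A minimal sketch of that idea is shown below; the threshold values, the logical-OR combination rule, and the function name are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def segment_moving_objects(curr, prev, background, bg_thresh=25, fd_thresh=15):
    """Return a boolean mask of candidate moving-object pixels.

    curr, prev, background: grayscale frames as uint8 numpy arrays.
    bg_thresh, fd_thresh: illustrative thresholds (assumptions, not from the paper).
    """
    # Background subtraction: pixels that differ from the registered background.
    bg_mask = np.abs(curr.astype(np.int16) - background.astype(np.int16)) > bg_thresh
    # Interframe difference: pixels that changed since the previous frame.
    fd_mask = np.abs(curr.astype(np.int16) - prev.astype(np.int16)) > fd_thresh
    # Combine the two cues (a logical OR here; the paper's exact rule may differ).
    return bg_mask | fd_mask
```

Only pixels flagged by such a mask would then have their disparities re-estimated, which is what keeps the per-frame cost low once the dense background disparity map has been computed during initialization.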

Original language: English
Article number: 037402
Journal: Optical Engineering
Volume: 45
Issue number: 3
DOIs
Publication status: Published - 2006 Mar

Bibliographical note

Funding Information:
We would like to thank Dr. D. Scharstein and Dr. R. Szeliski for supplying the ground truth data on their home page, and Dr. Y. Ohta and Dr. Y. Nakamura for the imagery from the University of Tsukuba. This work was partly supported by the Ministry of Information and Communication, Korea, under the Information Technology Research Center support program supervised by the Institute of Information Technology Assessment and partly supported by the Ministry of Education and Human Resources Development, the Ministry of Commerce, Industry and Energy, and the Ministry of Labor through the fostering project of the Lab of Excellency.

All Science Journal Classification (ASJC) codes

  • Atomic and Molecular Physics, and Optics
  • Engineering(all)
