Learning Semantic Correspondence Exploiting an Object-Level Prior

Junghyup Lee, Dohyung Kim, Wonkyung Lee, Jean Ponce, Bumsub Ham

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

We address the problem of semantic correspondence, that is, establishing a dense flow field between images depicting different instances of the same object or scene category. We propose to use images annotated with binary foreground masks and subjected to synthetic geometric deformations to train a convolutional neural network (CNN) for this task. Using these masks as part of the supervisory signal provides an object-level prior for the semantic correspondence task and offers a good compromise between semantic flow methods, where the amount of training data is limited by the cost of manually selecting point correspondences, and semantic alignment ones, where the regression of a single global geometric transformation between images may be sensitive to image-specific details such as background clutter. We propose a new CNN architecture, dubbed SFNet, which implements this idea. It leverages a new and differentiable version of the argmax function for end-to-end training, with a loss that combines mask and flow consistency with smoothness terms. Experimental results demonstrate the effectiveness of our approach, which significantly outperforms the state of the art on standard benchmarks.
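The abstract's key technical ingredient is a differentiable version of the argmax function, which lets gradients flow through the matching step during end-to-end training. The exact formulation used in SFNet is not given here; below is a minimal sketch of the standard soft-argmax idea it builds on, where a hard argmax over a correlation map is replaced by a softmax-weighted average of coordinates (the function name, the `beta` temperature parameter, and the toy input are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def soft_argmax(corr, beta=50.0):
    """Differentiable surrogate for argmax over a 2-D correlation map.

    Instead of picking the single best-matching position (non-differentiable),
    take the softmax of the scores and return the expected (x, y) coordinates.
    `beta` controls sharpness: large values approach a hard argmax.
    This is an illustrative sketch, not the exact SFNet formulation.
    """
    h, w = corr.shape
    # Softmax over all spatial positions, stabilized by subtracting the max.
    e = np.exp(beta * (corr - corr.max()))
    p = e / e.sum()
    # Coordinate grids: ys[i, j] = i, xs[i, j] = j.
    ys, xs = np.mgrid[0:h, 0:w]
    # Expected coordinates under the softmax distribution.
    return float((p * xs).sum()), float((p * ys).sum())

# Toy example: a 5x5 correlation map peaked at row 2, column 3.
corr = np.zeros((5, 5))
corr[2, 3] = 1.0
x, y = soft_argmax(corr)  # close to (3.0, 2.0) for large beta
```

Because the output is a smooth function of the correlation scores, losses defined on the resulting flow field (such as the mask- and flow-consistency terms mentioned above) can be backpropagated through the matching step.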

Original language: English
Pages (from-to): 1399-1414
Number of pages: 16
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 44
Issue number: 3
DOIs
Publication status: Published - 2022 Mar 1

Bibliographical note

Publisher Copyright:
© 1979-2012 IEEE.

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Vision and Pattern Recognition
  • Computational Theory and Mathematics
  • Artificial Intelligence
  • Applied Mathematics
