Few-shot viewpoint estimation

Hung-Yu Tseng, Shalini De Mello, Jonathan Tremblay, Sifei Liu, Stan Birchfield, Ming-Hsuan Yang, Jan Kautz

Research output: Contribution to conference › Paper › peer-review

3 Citations (Scopus)

Abstract

Viewpoint estimation for known categories of objects has been improved significantly thanks to deep networks and large datasets, but generalization to unknown categories is still very challenging. With an aim towards improving performance on unknown categories, we introduce the problem of category-level few-shot viewpoint estimation. We design a novel framework to successfully train viewpoint networks for new categories with few examples (10 or fewer). We formulate the problem as one of learning to estimate category-specific 3D canonical shapes, their associated depth estimates, and semantic 2D keypoints. We apply meta-learning to learn weights for our network that are amenable to category-specific few-shot fine-tuning. Furthermore, we design a flexible meta-Siamese network that maximizes information sharing during meta-learning. Through extensive experimentation on the ObjectNet3D and Pascal3D+ benchmark datasets, we demonstrate that our framework, which we call MetaView, significantly outperforms fine-tuning state-of-the-art models with few examples, and that the specific architectural innovations of our method are crucial to achieving good performance.
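The abstract's core idea, meta-learning weights that are amenable to category-specific few-shot fine-tuning, follows the general inner-loop/outer-loop pattern of first-order MAML-style training. The sketch below illustrates that pattern on a deliberately tiny stand-in problem (1-D linear regression, one task per "category"); it is not the paper's MetaView architecture, and all names, learning rates, and the toy task distribution are assumptions for illustration only.

```python
import numpy as np

# Hypothetical toy setup: each "category" (task) is a 1-D regression
# y = a * x with its own slope a. The real method meta-trains a
# viewpoint network, not this toy linear model.
rng = np.random.default_rng(0)

def loss_grad(w, x, y):
    """MSE loss gradient for the linear model y_hat = w * x."""
    err = w * x - y
    return 2.0 * np.mean(x * err)

def maml_step(w, slopes, inner_lr=0.05, outer_lr=0.1):
    """One first-order MAML meta-update over a batch of tasks."""
    meta_grad = 0.0
    for a in slopes:
        x = rng.uniform(-1, 1, size=10)    # few-shot support set
        y = a * x
        w_fast = w - inner_lr * loss_grad(w, x, y)  # per-task fine-tune
        xq = rng.uniform(-1, 1, size=10)   # query set
        yq = a * xq
        meta_grad += loss_grad(w_fast, xq, yq)      # first-order meta-grad
    return w - outer_lr * meta_grad / len(slopes)

w = 0.0
for _ in range(200):
    slopes = rng.uniform(1.5, 2.5, size=4)  # task slopes centered on 2
    w = maml_step(w, slopes)
# After meta-training, w sits near the center of the task distribution,
# so a few inner-loop gradient steps adapt it to any new slope quickly.
```

After meta-training, `w` ends up near 2.0, the mean of the task distribution: the meta-objective rewards an initialization from which one fine-tuning step reaches any individual task, which is the property the paper exploits when adapting to a new object category from 10 or fewer examples.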

Original language: English
Publication status: Published - 2020
Event: 30th British Machine Vision Conference, BMVC 2019 - Cardiff, United Kingdom
Duration: 2019 Sept 9 – 2019 Sept 12

Conference

Conference: 30th British Machine Vision Conference, BMVC 2019
Country/Territory: United Kingdom
City: Cardiff
Period: 19/9/9 – 19/9/12

Bibliographical note

Publisher Copyright:
© 2019. The copyright of this document resides with its authors.

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition
