Regularizing Meta-learning via Gradient Dropout

Hung-Yu Tseng, Yi-Wen Chen, Yi-Hsuan Tsai, Sifei Liu, Yen-Yu Lin, Ming-Hsuan Yang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

With the growing attention on learning-to-learn new tasks using only a few examples, meta-learning has been widely used in numerous problems such as few-shot classification, reinforcement learning, and domain generalization. However, meta-learning models are prone to overfitting when there are insufficient training tasks for the meta-learners to generalize. Although existing approaches such as Dropout are widely used to address the overfitting problem, these methods are typically designed for regularizing models of a single task in supervised training. In this paper, we introduce a simple yet effective method to alleviate the risk of overfitting for gradient-based meta-learning. Specifically, during the gradient-based adaptation stage, we randomly drop the gradient in the inner-loop optimization of each parameter in deep neural networks, such that the augmented gradients improve generalization to new tasks. We present a general form of the proposed gradient dropout regularization and show that this term can be sampled from either the Bernoulli or Gaussian distribution. To validate the proposed method, we conduct extensive experiments and analysis on numerous computer vision tasks, demonstrating that the gradient dropout regularization mitigates the overfitting problem and improves the performance upon various gradient-based meta-learning frameworks.
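The core idea in the abstract, randomly masking per-parameter gradients during the inner-loop adaptation step, can be sketched in a few lines. This is a minimal illustration, not the authors' exact formulation: the `dropgrad` helper, the inverted-dropout rescaling of the Bernoulli mask, and the choice of Gaussian mask parameters are assumptions made here for clarity.

```python
import numpy as np

def dropgrad(grad, p=0.1, noise="bernoulli", rng=None):
    """Perturb an inner-loop gradient with a random multiplicative mask.

    A sketch of gradient dropout: each gradient entry is multiplied by a
    mask drawn from either a Bernoulli or a Gaussian distribution, so the
    adaptation direction varies across sampled tasks. The masks are scaled
    to have unit mean (an inverted-dropout convention assumed here), so the
    expected gradient is unchanged.
    """
    rng = np.random.default_rng() if rng is None else rng
    if noise == "bernoulli":
        # Keep each entry with probability 1-p, rescale survivors by 1/(1-p).
        mask = rng.binomial(1, 1.0 - p, size=grad.shape) / (1.0 - p)
    elif noise == "gaussian":
        # Gaussian mask with the same mean (1) and variance (p/(1-p))
        # as the rescaled Bernoulli mask.
        mask = rng.normal(1.0, np.sqrt(p / (1.0 - p)), size=grad.shape)
    else:
        raise ValueError(f"unknown noise type: {noise}")
    return grad * mask

# One inner-loop adaptation step with dropped gradients:
theta = np.zeros(4)                       # task-specific parameters
grad = np.array([0.5, -0.2, 0.1, 0.3])    # inner-loop gradient for one task
theta_adapted = theta - 0.01 * dropgrad(grad, p=0.2,
                                        rng=np.random.default_rng(0))
```

In a gradient-based meta-learning framework such as MAML, this masking would be applied only in the inner loop; the outer meta-update uses the resulting adapted parameters as usual.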

Original language: English
Title of host publication: Computer Vision – ACCV 2020 - 15th Asian Conference on Computer Vision, 2020, Revised Selected Papers
Editors: Hiroshi Ishikawa, Cheng-Lin Liu, Tomas Pajdla, Jianbo Shi
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 218-234
Number of pages: 17
ISBN (Print): 9783030695378
DOIs
Publication status: Published - 2021
Event: 15th Asian Conference on Computer Vision, ACCV 2020 - Virtual, Online
Duration: 2020 Nov 30 – 2020 Dec 4

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12625 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 15th Asian Conference on Computer Vision, ACCV 2020
City: Virtual, Online
Period: 20/11/30 – 20/12/4

Bibliographical note

Publisher Copyright:
© 2021, Springer Nature Switzerland AG.

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science (all)
