Confidence Calibration for Incremental Learning

Dongmin Kang, Yeonsik Jo, Yeongwoo Nam, Jonghyun Choi

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Class incremental learning is an online learning paradigm in which the classes to be recognized are gradually increased under limited memory, storing only a partial set of examples from past tasks. At a task transition, we observe an unintentional imbalance of confidence, or likelihood, between the classes of past tasks and those of the new task. We argue that this imbalance aggravates catastrophic forgetting in class incremental learning. We propose a simple yet effective learning objective that balances the confidence of old-task and new-task classes in the class incremental learning setup. In addition, we compare various sample memory configuration strategies and propose a novel sample memory management policy to further alleviate forgetting. The proposed method outperforms the state of the art on many evaluation metrics, including accuracy and forgetting F, by a large margin (up to 5.71% in A10 and 17.1% in F10) in extensive empirical validations on multiple visual recognition datasets such as CIFAR100, TinyImageNet, and a subset of ImageNet.
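The abstract does not reproduce the objective's exact form. As a purely illustrative sketch (function names and the squared-gap formulation are assumptions, not taken from the paper), one simple way to quantify a confidence imbalance between old-task and new-task classes is to compare the average softmax probability mass each group receives:

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax over class logits.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def confidence_balance_penalty(logits, n_old):
    """Hypothetical penalty (not the paper's actual objective):
    squared gap between the mean probability mass assigned to
    old-task classes (first n_old columns) and new-task classes."""
    probs = softmax(logits)                       # (batch, n_classes)
    old_mass = probs[:, :n_old].sum(axis=1).mean()
    new_mass = probs[:, n_old:].sum(axis=1).mean()
    return (old_mass - new_mass) ** 2

# Balanced logits yield near-zero penalty; logits skewed toward
# new-task classes yield a large penalty.
balanced = confidence_balance_penalty(np.zeros((4, 4)), n_old=2)
skewed = confidence_balance_penalty(
    np.array([[0.0, 0.0, 5.0, 5.0]] * 4), n_old=2)
```

Minimizing such a term alongside the usual classification loss would push the network toward comparable confidence for old and new classes at a task transition, which is the balancing behavior the abstract describes.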

Original language: English
Article number: 9133417
Pages (from-to): 126648-126660
Number of pages: 13
Journal: IEEE Access
Volume: 8
Publication status: Published - 2020

Bibliographical note

Funding Information:
This work was supported by Institute for Information & communications Technology Promotion (IITP) grant funded by the Korea government (MSIT) (No.2019-0-01351, Development of Ultra Low-Power Mobile Deep Learning Semiconductor With Compression/Decompression of Activation/Kernel Data).

Publisher Copyright:
© 2013 IEEE.

All Science Journal Classification (ASJC) codes

  • Computer Science (all)
  • Materials Science (all)
  • Engineering (all)
