Abstract
We propose to leverage a large, continuous stream of unlabeled data in the wild to alleviate catastrophic forgetting in class-incremental learning. Experimental results on the CIFAR and ImageNet datasets demonstrate that the proposed method outperforms prior approaches: compared to the state-of-the-art method, it achieves up to 14.9% higher accuracy and 45.9% less forgetting.
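The abstract only summarizes the idea at a high level. A common way to exploit an unlabeled external stream against forgetting is knowledge distillation: keep the updated model's predictions on unlabeled samples close to the previous model's soft outputs. The sketch below is a minimal illustration under that assumption — function names, temperature value, and shapes are illustrative, not the paper's actual implementation.

```python
import numpy as np

def softmax(logits, temperature=2.0):
    """Temperature-scaled softmax over the last axis (stable form)."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(old_logits, new_logits, temperature=2.0):
    """Cross-entropy between the previous model's soft targets and the
    current model's predictions on *unlabeled* samples; minimizing it
    keeps the new model's outputs close to the old model's, which is
    one way to reduce forgetting of earlier classes."""
    targets = softmax(old_logits, temperature)
    log_probs = np.log(softmax(new_logits, temperature) + 1e-12)
    return -(targets * log_probs).sum(axis=-1).mean()

# Toy usage on a batch of 8 unlabeled samples and 5 old classes:
rng = np.random.default_rng(0)
old = rng.normal(size=(8, 5))                  # previous model's logits
loss_same = distillation_loss(old, old)        # minimal: H(p,p) = entropy
loss_diff = distillation_loss(old, old + rng.normal(size=(8, 5)))
```

Since the cross-entropy H(p, q) = H(p) + KL(p || q), `loss_diff` can never fall below `loss_same`; the gap measures how far the new model has drifted from the old one on the unlabeled batch.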
Original language | English |
---|---|
Title of host publication | Proceedings - 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2019 |
Publisher | IEEE Computer Society |
Pages | 29-32 |
Number of pages | 4 |
ISBN (Electronic) | 9781728125060 |
Publication status | Published - 2019 Jun |
Event | 32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2019 - Long Beach, United States. Duration: 2019 Jun 16 → 2019 Jun 20 |
Publication series
Name | IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops |
---|---|
Volume | 2019-June |
ISSN (Print) | 2160-7508 |
ISSN (Electronic) | 2160-7516 |
Conference
Conference | 32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2019 |
---|---|
Country/Territory | United States |
City | Long Beach |
Period | 2019 Jun 16 → 2019 Jun 20 |
Bibliographical note
Publisher Copyright: © 2019 IEEE Computer Society. All rights reserved.
All Science Journal Classification (ASJC) codes
- Computer Vision and Pattern Recognition
- Electrical and Electronic Engineering