We propose to leverage a large, continuous stream of unlabeled data in the wild to alleviate catastrophic forgetting in class-incremental learning. Our experimental results on the CIFAR and ImageNet datasets demonstrate the superiority of the proposed method over prior methods: compared to the state-of-the-art method, ours achieves up to 14.9% higher accuracy and 45.9% less forgetting.
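The abstract does not spell out the mechanism, but a standard way to use unlabeled data against catastrophic forgetting is knowledge distillation: the frozen old model's soft predictions on unlabeled samples regularize the new model while it learns new classes. The sketch below is a generic, hypothetical illustration of that idea (the function names, the temperature `T`, and the weight `lam` are assumptions, not the paper's exact formulation):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def distillation_loss(old_logits, new_logits, T=2.0):
    """KL divergence between the frozen old model's soft predictions
    (teacher) and the new model's predictions on unlabeled samples."""
    p = softmax(old_logits, T)  # teacher targets, fixed during training
    q = softmax(new_logits, T)
    eps = 1e-12  # avoid log(0)
    return float(np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=1)))

def incremental_loss(ce_new, old_logits_u, new_logits_u, lam=1.0, T=2.0):
    """Total objective: cross-entropy on labeled new-class data plus a
    distillation penalty on the unlabeled stream to preserve old knowledge."""
    return ce_new + lam * distillation_loss(old_logits_u, new_logits_u, T)
```

The distillation term is zero when the new model reproduces the old model's predictions on the unlabeled stream and grows as the models diverge, which is what discourages forgetting of previously learned classes.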
Title of host publication: Proceedings - 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2019
Publisher: IEEE Computer Society
Number of pages: 4
Publication status: Published - June 2019
Event: 32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2019 - Long Beach, United States
Duration: 16 June 2019 → 20 June 2019
Series name: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
Bibliographical note: Publisher Copyright © 2019 IEEE Computer Society. All rights reserved.
All Science Journal Classification (ASJC) codes
- Computer Vision and Pattern Recognition
- Electrical and Electronic Engineering