Learning from Better Supervision: Self-distillation for Learning with Noisy Labels

Kyungjune Baek, Seungho Lee, Hyunjung Shim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The remarkable performance of deep neural networks relies heavily on large-scale datasets with high-quality annotations. Because data collection processes such as web crawling naturally introduce unreliable supervision (i.e., noisy labels), handling samples with noisy labels has been actively studied. Existing methods for learning with noisy labels (LNL) either 1) develop sampling strategies to filter out noisy labels or 2) devise loss functions that are robust to noisy labels. As a result of these efforts, existing LNL models achieve impressive performance, recording an accuracy higher than the ratio of clean samples in the dataset. Based on this observation, we propose a self-distillation framework that utilizes the predictions of existing LNL models and further improves performance via rectified distillation: hard pseudo labels and feature distillation. Our rectified distillation can be easily applied to existing LNL models, so it directly benefits from their state-of-the-art performance. Extensive evaluations confirm that our model is effective on both synthetic and real noisy datasets, achieving state-of-the-art performance on four benchmark datasets.
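The abstract describes rectified distillation only at a high level; a minimal sketch of how such a loss could be composed (hard pseudo labels taken from a confident teacher LNL model, plus a feature-matching term) is given below. This is an illustrative reconstruction, not the authors' code: the function name, the confidence threshold `tau`, and the weighting `alpha` are assumptions.

```python
import torch
import torch.nn.functional as F

def rectified_distillation_loss(student_logits, student_feats,
                                teacher_logits, teacher_feats,
                                tau=0.9, alpha=1.0):
    """Sketch of self-distillation with rectified targets (assumed form).

    - Hard pseudo labels: the teacher's argmax prediction replaces the
      (possibly noisy) dataset label, kept only where the teacher's
      softmax confidence reaches the threshold tau.
    - Feature distillation: the student's features are pulled toward the
      teacher's via an L2 penalty, weighted by alpha.
    tau and alpha are illustrative hyperparameters, not values from the paper.
    """
    with torch.no_grad():
        probs = F.softmax(teacher_logits, dim=1)
        conf, pseudo = probs.max(dim=1)            # hard pseudo labels
        mask = (conf >= tau).float()               # keep confident samples only

    # Cross-entropy against the teacher's hard pseudo labels, masked
    ce = F.cross_entropy(student_logits, pseudo, reduction="none")
    ce_loss = (mask * ce).sum() / mask.sum().clamp(min=1.0)

    # Feature distillation toward the (frozen) teacher representation
    feat_loss = F.mse_loss(student_feats, teacher_feats.detach())
    return ce_loss + alpha * feat_loss
```

In this reading, any pretrained LNL model can serve as the teacher, which matches the paper's claim that the method plugs into existing LNL models and inherits their performance.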

Original language: English
Title of host publication: 2022 26th International Conference on Pattern Recognition, ICPR 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1829-1835
Number of pages: 7
ISBN (Electronic): 9781665490627
DOIs
Publication status: Published - 2022
Event: 26th International Conference on Pattern Recognition, ICPR 2022 - Montreal, Canada
Duration: 2022 Aug 21 - 2022 Aug 25

Publication series

Name: Proceedings - International Conference on Pattern Recognition
Volume: 2022-August
ISSN (Print): 1051-4651

Conference

Conference: 26th International Conference on Pattern Recognition, ICPR 2022
Country/Territory: Canada
City: Montreal
Period: 22/8/21 - 22/8/25

Bibliographical note

Funding Information:
This research was supported by the Basic Science Research Program through the NRF Korea funded by the MSIP (NRF-2022R1A2C3011154), an IITP grant funded by the Korea government (MSIT) and a KEIT grant funded by the Korea government (MOTIE) (No. 2022-0-00680, 2022-0-01045), and the Korea Medical Device Development Fund grant funded by the Korean government (Project Number: 202011D06).

Publisher Copyright:
© 2022 IEEE.

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition
