The remarkable performance of deep neural networks relies heavily on large-scale datasets with high-quality annotations. Since data collection processes such as web crawling naturally involve unreliable supervision (i.e., noisy labels), handling samples with noisy labels has been actively studied. Existing methods in learning with noisy labels (LNL) 1) develop sampling strategies for filtering out noisy labels or 2) devise loss functions robust to noisy labels. As a result of these efforts, existing LNL models achieve impressive performance, recording a higher accuracy than the ratio of clean samples in the dataset. Based on this observation, we propose a self-distillation framework that utilizes the predictions of existing LNL models and further improves performance via rectified distillation: hard pseudo labels and feature distillation. Our rectified distillation can be easily applied to existing LNL models, so we can build on their state-of-the-art performance. From extensive evaluations, we confirm that our model is effective on both synthetic and real noisy datasets, achieving state-of-the-art performance on four benchmark datasets.
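The two distillation terms named in the abstract can be sketched as a combined loss. This is a minimal NumPy illustration, not the paper's implementation: the weighting factor `alpha` and the plain L2 form of the feature term are assumptions, and in practice the teacher would be a pretrained LNL model producing the logits and features.

```python
import numpy as np

def rectified_distillation_loss(teacher_logits, student_logits,
                                teacher_feat, student_feat, alpha=0.5):
    """Sketch of a rectified-distillation objective: cross-entropy on the
    teacher's hard pseudo labels plus an L2 feature-distillation term.
    `alpha` is a hypothetical balancing hyperparameter."""
    # Hard pseudo labels: argmax over the (pretrained) LNL teacher's logits.
    pseudo = teacher_logits.argmax(axis=1)
    # Numerically stable softmax over the student's logits.
    z = student_logits - student_logits.max(axis=1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    # Cross-entropy against the hard pseudo labels.
    ce = -np.log(probs[np.arange(len(pseudo)), pseudo] + 1e-12).mean()
    # Feature distillation: mean squared error between feature maps.
    fd = ((teacher_feat - student_feat) ** 2).mean()
    return ce + alpha * fd
```

With a confident teacher and a matching student, the cross-entropy term dominates and the feature term vanishes; tuning `alpha` trades label imitation against feature imitation.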
|Title of host publication||2022 26th International Conference on Pattern Recognition, ICPR 2022|
|Publisher||Institute of Electrical and Electronics Engineers Inc.|
|Number of pages||7|
|Publication status||Published - 2022|
|Event||26th International Conference on Pattern Recognition, ICPR 2022 - Montreal, Canada|
Duration: 2022 Aug 21 → 2022 Aug 25
|Name||Proceedings - International Conference on Pattern Recognition|
|Conference||26th International Conference on Pattern Recognition, ICPR 2022|
|Period||22/8/21 → 22/8/25|
Bibliographical note
Funding Information:
This research was supported by the Basic Science Research Program through the NRF of Korea funded by the MSIP (NRF-2022R1A2C3011154), the IITP grant funded by the Korea government (MSIT), the KEIT grant funded by the Korea government (MOTIE) (No. 2022-0-00680, 2022-0-01045), and the Korea Medical Device Development Fund grant funded by the Korean government (Project Number: 202011D06).
© 2022 IEEE.
All Science Journal Classification (ASJC) codes
- Computer Vision and Pattern Recognition