Corrected Contrastive Robust Representation for Learning with Noisy Labels
Abstract
Contrastive learning has a powerful ability to embed data into a latent representation space, and the learned representations are useful for a variety of downstream tasks. Learning with noisy labels is one of the most practical yet challenging problems. How to combine contrastive representation learning with classification to reduce the negative impact of noisy labels remains an unresolved problem. To our knowledge, this is the first attempt to alleviate noisy labels by incorporating the power of contrastive learning with label correction strategies. This work proposes a novel holistic framework, Corrected Contrastive Learning, which learns robust representations and copes with noisy labels in both the representation space and the label space. The framework consists of two main components. The first is robust representation learning, which produces better feature embeddings to guide the classification task in the label space. The second is differential label correction, which obtains relatively large confident sets and thereby yields more confident sample pairs for the representation learning component. Experiments on multiple noisily labeled datasets demonstrate the superiority of our framework and provide a positive step toward combating noisy labels.
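The abstract describes confident-set selection feeding sample pairs into contrastive representation learning. As a minimal illustrative sketch only (the paper's actual losses and selection rule are not given here), the interplay could look like the following, assuming a simple confidence-threshold rule for the confident set and a supervised contrastive loss in which samples sharing a (corrected) label act as positives; the function names and the threshold are hypothetical:

```python
import numpy as np

def select_confident(probs, labels, threshold=0.9):
    # Hypothetical confident-set rule: keep samples whose predicted
    # probability for their given label exceeds a threshold. The paper's
    # differential label correction may use a different criterion.
    conf = probs[np.arange(len(labels)), labels]
    return conf >= threshold

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    # Supervised contrastive loss over L2-normalized embeddings:
    # samples sharing a label are treated as positive pairs.
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature
    n = len(labels)
    self_mask = np.eye(n, dtype=bool)
    sim = np.where(self_mask, -np.inf, sim)  # exclude self-similarity
    # log-softmax over all other samples for each anchor
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    pos = (labels[:, None] == labels[None, :]) & ~self_mask
    valid = pos.sum(axis=1) > 0  # anchors with at least one positive
    per_anchor = -np.where(pos, log_prob, 0.0).sum(axis=1)[valid] / pos.sum(axis=1)[valid]
    return per_anchor.mean()
```

In such a sketch, the confident mask restricts which samples contribute pairs to the contrastive loss, so a larger confident set directly yields more positive pairs for representation learning, matching the motivation stated in the abstract.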
Article Details
This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.