Noisy label learning, semi-supervised learning, and contrastive learning are three different strategies for designing learning processes that reduce annotation cost.
However, many existing methods perform poorly when the noise ratio is high. To address this, the authors fuse these three strategies and propose CSSL, a unified Contrastive Semi-Supervised Learning algorithm, and CoDiM (Contrastive DivideMix), a novel algorithm for learning with noisy labels.
Contrastive learning
Contrastive Learning (CL) approaches (Chen et al. 2020a; He et al. 2020; Chen et al. 2020b,c) have shown great potential for learning good representations by training a feature extractor and a projector such that, in the projection space, similar samples are pulled close together while dissimilar samples are pushed far apart.
Contrastive learning is currently used in two ways: to initialize neural network parameters, and as a label corrector built on unsupervised pre-training. Contrastive self-supervised learning can learn better representations.
In an unsupervised setting, some methods treat different views of the same source as positive pairs and views from different sources as negative pairs (Chen et al. 2020a). In a supervised setting, with label supervision, views from the same class are treated as positive pairs and views from different classes as negative pairs (Khosla et al. 2020).
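To make the unsupervised pair construction concrete, here is a minimal sketch of an NT-Xent (SimCLR-style) contrastive loss. It assumes `z1` and `z2` are the projector outputs for two augmented views of the same batch; the function name and temperature value are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """z1, z2: (N, d) projections of two augmented views of the same N samples."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, d), unit-norm
    sim = z @ z.t() / temperature                        # (2N, 2N) cosine similarities
    # Mask self-similarity so a sample is never its own negative.
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float('-inf'))
    # The positive for view i is the other view of the same sample (i+n or i-n);
    # all remaining 2N-2 views act as negatives.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)
```

The supervised variant (Khosla et al. 2020) differs only in the target construction: all views sharing a class label count as positives rather than just the two views of one sample.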
Semi-supervised learning
Typical semi-supervised learning methods perform self-training by pseudo-labeling unlabeled data and add extra regularization objectives.
Two regularization objectives (see the sketch after this list):
consistency regularization: encourages the model to generate consistent predictions on source data and its randomly augmented views.
entropy minimization: encourages the model to output confident, low-entropy predictions.
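A minimal sketch of how pseudo-labeling combines with these two regularizers, using a FixMatch-style weak/strong augmentation pairing as one concrete instantiation. The function name, confidence threshold, and augmentation scheme are illustrative assumptions, not the paper's exact method.

```python
import torch
import torch.nn.functional as F

def semi_supervised_losses(model, x_weak, x_strong, conf_threshold=0.95):
    """x_weak, x_strong: weakly/strongly augmented views of the same unlabeled batch."""
    # Pseudo-labeling: predict on the weak view and keep only confident labels.
    with torch.no_grad():
        probs = F.softmax(model(x_weak), dim=1)
        conf, pseudo = probs.max(dim=1)
        keep = conf >= conf_threshold
    logits_strong = model(x_strong)
    # Consistency regularization: the strong view should match the weak
    # view's pseudo-label.
    consistency = (F.cross_entropy(logits_strong[keep], pseudo[keep])
                   if keep.any() else logits_strong.sum() * 0.0)
    # Entropy minimization: push predictions toward confident, low-entropy outputs.
    p = F.softmax(logits_strong, dim=1)
    entropy = -(p * p.clamp_min(1e-8).log()).sum(dim=1).mean()
    return consistency, entropy
```

In practice the two losses are weighted and added to the supervised loss on labeled data; the weights are hyperparameters.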