[Paper Reading] Pseudo-Labeling and Confirmation Bias in Deep Semi-Supervised Learning

This paper studies semi-supervised learning for image classification and proposes learning from unlabeled data via soft pseudo-labels generated from the network's own predictions. The authors find that naive pseudo-labeling overfits to incorrect labels because of confirmation bias. To counter this, the paper introduces mixup augmentation and a minimum number of labeled samples per mini-batch as effective regularization techniques that reduce confirmation bias, achieving state-of-the-art results on CIFAR-10/100, SVHN, and Mini-ImageNet and even outperforming consistency-regularization methods.

Paper download
GitHub
bib:

@INPROCEEDINGS{arazo2020pseudo,
  title     = {Pseudo-Labeling and Confirmation Bias in Deep Semi-Supervised Learning},
  author    = {Eric Arazo and Diego Ortego and Paul Albert and Noel E O'Connor and Kevin McGuinness},
  booktitle = {IJCNN},
  year      = {2020},
  pages     = {1--8}
}

1. Abstract

Semi-supervised learning, i.e. jointly learning from labeled and unlabeled samples, is an active research topic due to its key role on relaxing human supervision.

An overview of semi-supervised learning.

In the context of image classification, recent advances to learn from unlabeled samples are mainly focused on consistency regularization methods that encourage invariant predictions for different perturbations of unlabeled samples.

This points to consistency regularization as the dominant approach in semi-supervised classification.

We, conversely, propose to learn from unlabeled data by generating soft pseudo-labels using the network predictions.

This states that the paper instead uses pseudo-labeling, specifically soft pseudo-labels.
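As a rough illustration of what "soft pseudo-labels from network predictions" means, here is a minimal PyTorch sketch (not the authors' code; `model` and `unlabeled_loader` are assumed to exist, and the loader is assumed to yield raw input batches):

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def soft_pseudo_labels(model, unlabeled_loader, device="cpu"):
    """Return the network's softmax outputs on unlabeled data.

    No argmax and no confidence threshold: the full predicted
    distribution is kept as the (soft) pseudo-label.
    """
    model.eval()
    out = []
    for x in unlabeled_loader:          # assumes the loader yields input tensors
        logits = model(x.to(device))
        out.append(F.softmax(logits, dim=1).cpu())
    return torch.cat(out)
```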

We show that a naive pseudo-labeling overfits to incorrect pseudo-labels due to the so-called confirmation bias and demonstrate that mixup augmentation and setting a minimum number of labeled samples per mini-batch are effective regularization techniques for reducing it.

The core contribution. The paper names the problem confirmation bias and demonstrates that mixup augmentation and setting a minimum number of labeled samples per mini-batch are effective regularization techniques for reducing it (see the sketch below).
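For reference, mixup (Zhang et al., 2018) trains on convex combinations of random sample pairs. A minimal sketch, assuming batched inputs `x` and soft or one-hot targets `y` (names are illustrative, not taken from the paper's code):

```python
import torch

def mixup(x, y, alpha=1.0):
    """Mix a batch with a shuffled copy of itself (Zhang et al., 2018).

    x: inputs of shape (B, ...); y: soft or one-hot targets of shape (B, C).
    """
    # lam ~ Beta(alpha, alpha); alpha = 1.0 gives a uniform mixing coefficient
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    x_mix = lam * x + (1.0 - lam) * x[perm]
    y_mix = lam * y + (1.0 - lam) * y[perm]
    return x_mix, y_mix
```

Because the mixed targets are themselves soft, mixup combines naturally with soft pseudo-labels.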

The proposed approach achieves state-of-the-art results in CIFAR-10/100, SVHN, and Mini-ImageNet despite being much simpler than other methods.

These results demonstrate that pseudo-labeling alone can outperform consistency regularization methods, while the opposite was supposed in previous work.

This is the surprising part: a pure pseudo-labeling method outperforms consistency-regularization methods. I have not read the full text yet; presumably FixMatch and FlexMatch had not appeared at the time.

2. Algorithm Description

| Symbol | Meaning |
| --- | --- |
| $D_l = \{(x_i, y_i)\}_{i=1}^{N_l}$ | labeled data |
| $D_u = \{x_i\}_{i=1}^{N_u}$ | unlabeled data |
| $\widetilde{D}_u = \{(x_i, \widetilde{y}_i)\}_{i=1}^{N}$ | unlabeled data paired with soft pseudo-labels $\widetilde{y}_i$ |
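The second regularizer from the abstract, a minimum number of labeled samples per mini-batch, can be pictured as below. This is a hedged sketch under assumed toy data, not the authors' implementation; the datasets, shapes, and the value of `k` are all illustrative:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-ins for D_l and D~_u; shapes and sizes are illustrative.
labeled_set = TensorDataset(torch.randn(500, 3, 32, 32),
                            torch.eye(10)[torch.randint(10, (500,))])  # one-hot y
pseudo_set = TensorDataset(torch.randn(4500, 3, 32, 32),
                           torch.full((4500, 10), 0.1))                # soft y~

k, batch = 16, 100  # guarantee exactly k labeled samples per batch of size `batch`
loader_l = DataLoader(labeled_set, batch_size=k, shuffle=True, drop_last=True)
loader_u = DataLoader(pseudo_set, batch_size=batch - k, shuffle=True, drop_last=True)

for (x_l, y_l), (x_u, y_u) in zip(loader_l, loader_u):
    x = torch.cat([x_l, x_u])   # mini-batch: first k samples are labeled
    y = torch.cat([y_l, y_u])   # ground-truth labels followed by soft pseudo-labels
    # ... forward pass, mixup, soft-label cross-entropy ...
```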