Consistency Regularization Helps Mitigate Robust Overfitting
1. Algorithm Introduction
1.1 The PGD-AT+MT Algorithm
The detailed steps of the PGD-AT+MT algorithm are as follows:
Algorithm 1. PGD-AT+MT
Require: training set D; batch size m; learning rate lr; number of training epochs T; PGD step size α; number of PGD steps K; max perturbation budget ϵ; EMA smoothing parameter η; EMA start epoch Es
Ensure: robust model weights θt
1: Randomly initialize θs
2: for t = 1 to T do
3:   for i = 1 to m do
4:     Sample (xi, yi) from D
5:     x′i ← xi + ϵδ, where δ ∼ Uniform(−1, 1)
6:     for k = 1 to K do
7:       if t < Es then
8:         x′i ← Πϵ(x′i + α · sign(∇x′i Lce(f(x′i; θs), yi)))
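Since the listing above breaks off inside the inner PGD loop, the following is a minimal PyTorch sketch of the PGD-AT step it describes for epochs t < Es, where the inner maximization uses the cross-entropy loss. The classifier model, the optimizer, the default hyperparameter values, and the helper names pgd_attack / pgd_at_step are illustrative assumptions, not part of the original algorithm.

```python
# A minimal sketch of one PGD-AT training step (t < Es branch), assuming an
# image classifier `model` with inputs in [0, 1] and an L-inf threat model.
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8/255, alpha=2/255, steps=10):
    """Generate adversarial examples x' inside the eps-ball around x."""
    # Step 5: random start x' = x + eps * delta, with delta ~ Uniform(-1, 1)
    x_adv = x + eps * (2 * torch.rand_like(x) - 1)
    x_adv = torch.clamp(x_adv, 0.0, 1.0)

    for _ in range(steps):                       # Step 6: K PGD steps
        x_adv = x_adv.detach().requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)  # L_ce(f(x'; θs), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        # Step 8: gradient-sign ascent step, then projection Π_ϵ onto the ball
        x_adv = x_adv.detach() + alpha * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps)
        x_adv = torch.clamp(x_adv, 0.0, 1.0)
    return x_adv.detach()

def pgd_at_step(model, optimizer, x, y, eps=8/255, alpha=2/255, steps=10):
    """Outer minimization: update θs on the generated adversarial examples."""
    model.eval()
    x_adv = pgd_attack(model, x, y, eps, alpha, steps)
    model.train()
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch the attack is run in eval mode so that batch-norm statistics are not perturbed by the adversarial search, a common but not universal design choice in adversarial-training code.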