1 Proximal gradient descent
Consider an unconstrained problem whose cost function splits into two components:

$$\min_x f(x) = g(x) + h(x)$$
where
- g(x): convex, differentiable, with dom(g) = \mathbb{R}^n;
- h(x): convex, possibly nondifferentiable, but with an inexpensive proximal operator, defined as

$$\mathrm{prox}_h(x) = \arg\min_u \left( h(u) + \frac{1}{2}\|u - x\|_2^2 \right)$$
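As a concrete instance of this definition, when h(u) = λ‖u‖₁ the proximal operator has the well-known closed form of elementwise soft-thresholding. A minimal sketch (the function name `prox_l1` is our own choice, not from the original text):

```python
import numpy as np

def prox_l1(x, lam):
    """Proximal operator of h(u) = lam * ||u||_1.

    Solves argmin_u ( lam*||u||_1 + 0.5*||u - x||^2 ),
    whose closed-form solution is elementwise soft-thresholding:
    shrink each entry toward zero by lam, clipping at zero.
    """
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)
```

For example, `prox_l1(np.array([3.0, -0.5, 1.0]), 1.0)` shrinks the first entry to 2.0 and zeroes out the entries whose magnitude is below the threshold.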

This article introduces the proximal gradient descent algorithm for unconstrained problems whose objective splits into a convex, differentiable part g(x) and a convex, nondifferentiable part h(x). When h(x) = 0, the algorithm reduces to ordinary gradient descent; when h(x) = I_A(x), the indicator function of a set A, the proximal step becomes projection onto A; and when h(x) is the L1 norm, the update rule involves soft-thresholding.