Mini-Batch Gradient Descent
1. What is Mini-Batch Gradient Descent?
Mini-Batch Gradient Descent is an algorithm that sits between Batch Gradient Descent and Stochastic Gradient Descent. Concretely, it uses a subset of M examples (not one, and not all N) for each iteration.
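As a minimal sketch of that idea (assuming NumPy arrays `X`, `y` holding the N training examples and a mini-batch size `M`; all the names here are ours, not from the original):

```python
import numpy as np

def iterate_minibatches(X, y, M, rng):
    """Yield successive mini-batches of (up to) M examples."""
    N = X.shape[0]
    order = rng.permutation(N)         # shuffle once per pass
    for start in range(0, N, M):
        idx = order[start:start + M]   # the last batch may be smaller
        yield X[idx], y[idx]
```

With M = 1 this degenerates to Stochastic Gradient Descent, and with M = N to Batch Gradient Descent.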
2. Compute Effort
The compute time of this algorithm depends on how many examples each iteration uses. It is not fixed, but the worst case (M = N) is the same as Batch Gradient Descent: O(N²).
The table below shows the differences among these three Gradient Descent variants:
| Batch Gradient Descent | Mini-Batch Gradient Descent | Stochastic Gradient Descent |
|---|---|---|
| uses all N examples in each iteration | uses M examples in each iteration | uses 1 example in each iteration |
| relatively compute-intensive per iteration | somewhere in between | relatively cheap per iteration |
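As a rough sketch of why this ordering holds (assuming d features, so one gradient evaluation touches d numbers per example; the symbols are ours), the per-update cost grows with the number of examples summed over:

$$\underbrace{O(N\,d)}_{\text{batch}} \;>\; \underbrace{O(M\,d)}_{\text{mini-batch}} \;>\; \underbrace{O(d)}_{\text{stochastic}}, \qquad 1 < M < N$$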
3. Gradient Descent Formula
For all $\theta_i$, repeat for each mini-batch:

$$\theta_i := \theta_i - \alpha \, \frac{1}{M} \sum_{k=j}^{j+M-1} \left( h_\theta\big(x^{(k)}\big) - y^{(k)} \right) x_i^{(k)}$$

where $j$ is the index of the first example in the current mini-batch and $x_0^{(k)} = 1$ by convention.

E.g., two parameters $\theta_0, \theta_1$ → $h_\theta(x) = \theta_0 + \theta_1 x_1$

For $i = 0$:

$$\theta_0 := \theta_0 - \alpha \, \frac{1}{M} \sum_{k=j}^{j+M-1} \left( h_\theta\big(x^{(k)}\big) - y^{(k)} \right)$$

For $i = 1$:

$$\theta_1 := \theta_1 - \alpha \, \frac{1}{M} \sum_{k=j}^{j+M-1} \left( h_\theta\big(x^{(k)}\big) - y^{(k)} \right) x_1^{(k)}$$
Note that the dataset needs to be shuffled before each pass over it.
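A minimal end-to-end sketch of the update above for the two-parameter example $h_\theta(x) = \theta_0 + \theta_1 x_1$ (the function name, hyperparameter values, and synthetic data below are our own assumptions):

```python
import numpy as np

def minibatch_gd(x, y, M=32, alpha=0.5, epochs=200, seed=0):
    """Mini-batch gradient descent for h(x) = theta0 + theta1 * x."""
    rng = np.random.default_rng(seed)
    theta0, theta1 = 0.0, 0.0
    N = x.shape[0]
    for _ in range(epochs):
        order = rng.permutation(N)              # shuffle before each pass
        for start in range(0, N, M):
            idx = order[start:start + M]
            xb, yb = x[idx], y[idx]
            error = theta0 + theta1 * xb - yb   # h_theta(x) - y over the batch
            # Simultaneous update of both parameters (the i = 0 and i = 1 rules)
            grad0 = error.mean()                # (1/M) * sum of errors
            grad1 = (error * xb).mean()         # (1/M) * sum of errors * x1
            theta0 -= alpha * grad0
            theta1 -= alpha * grad1
    return theta0, theta1

# Usage on synthetic data drawn from y = 2 + 3x + noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 1000)
y = 2 + 3 * x + rng.normal(0, 0.1, 1000)
print(minibatch_gd(x, y))   # should print values close to (2, 3)
```

Using `.mean()` instead of dividing by a hard-coded M keeps the update correct even when the last mini-batch of a pass is smaller than M.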