Mini-Batch Gradient Descent

1. What is Mini-Batch Gradient Descent?

Mini-Batch Gradient Descent is an algorithm that sits between Batch Gradient Descent and Stochastic Gradient Descent. Concretely, it uses a subset of M examples (not one, and not all) in each iteration.
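As a minimal sketch of the idea (the helper name `minibatches` and the arrays `X`, `y` are assumptions chosen for illustration, not from the original post), one pass over the data in chunks of M examples looks like this in Python:

```python
def minibatches(X, y, M):
    """Yield successive mini-batches of M examples from (X, y)."""
    for start in range(0, len(X), M):
        yield X[start:start + M], y[start:start + M]
```

Note that setting M = 1 recovers Stochastic Gradient Descent, while M = len(X) recovers Batch Gradient Descent.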

2. Compute Cost

The compute time of this algorithm depends on the mini-batch size M, so it is not fixed; in the worst case (M equal to the full dataset size N) it behaves like Batch Gradient Descent: O(N²).

The table below shows the differences among these three variants of Gradient Descent:

                         Batch Gradient Descent   Mini-Batch Gradient Descent   Stochastic Gradient Descent
Examples per iteration   all N examples           M examples (1 < M < N)        1 example
Relative compute cost    intensive                in between                    light

3. Gradient Descent Formula

For every parameter $\theta_i$:

$$\frac{\partial J(\theta)}{\partial \theta_i} = \frac{1}{M}\sum_{k=1}^{M}\left[h_\theta(x^{(k)}) - y^{(k)}\right]x_i^{(k)}$$

where the sum runs over the $M$ examples of the current mini-batch.

E.g., with two parameters $\theta_0, \theta_1$, the hypothesis is $h_\theta(x) = \theta_0 + \theta_1 x_1$.

For $i = 0$ (where $x_0^{(k)} = 1$):

$$\frac{\partial J(\theta)}{\partial \theta_0} = \frac{1}{M}\sum_{k=1}^{M}\left[h_\theta(x^{(k)}) - y^{(k)}\right]x_0^{(k)}$$

For $i = 1$:

$$\frac{\partial J(\theta)}{\partial \theta_1} = \frac{1}{M}\sum_{k=1}^{M}\left[h_\theta(x^{(k)}) - y^{(k)}\right]x_1^{(k)}$$
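As a rough NumPy sketch of these two partial derivatives on one mini-batch (the names `minibatch_step`, `X_batch`, `y_batch`, `theta`, and the learning rate `alpha` are hypothetical, chosen here for illustration):

```python
import numpy as np

def minibatch_step(theta, X_batch, y_batch, alpha=0.01):
    """One update of theta = (theta_0, theta_1) on a mini-batch of M examples.

    X_batch: shape (M,), the x_1 value of each example
    y_batch: shape (M,), the targets y^(k)
    """
    h = theta[0] + theta[1] * X_batch       # h_theta(x^(k)) for the whole batch
    error = h - y_batch                     # [h_theta(x^(k)) - y^(k)]
    grad0 = error.mean()                    # gradient for theta_0 (x_0^(k) = 1)
    grad1 = (error * X_batch).mean()        # gradient for theta_1 (times x_1^(k))
    return theta - alpha * np.array([grad0, grad1])
```

The `.mean()` calls implement the $\frac{1}{M}\sum$ factor, and the minus sign in the return line is the usual gradient-descent update direction.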

Note that the dataset needs to be shuffled before each pass over it, as in the sketch below.
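Putting the pieces together, here is a sketch of one training epoch that shuffles first and then steps through mini-batches; it reuses the hypothetical `minibatch_step` from above and assumes `X` and `y` are NumPy arrays:

```python
rng = np.random.default_rng(0)

def train_epoch(theta, X, y, M=32, alpha=0.01):
    """Shuffle the dataset, then run one minibatch_step per chunk of M examples."""
    order = rng.permutation(len(X))         # shuffle before iterating
    X, y = X[order], y[order]
    for start in range(0, len(X), M):
        theta = minibatch_step(theta, X[start:start + M],
                               y[start:start + M], alpha)
    return theta
```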
