GD: Feature Scaling & Mean Normalization

We can speed up gradient descent by having each of our input features take values in roughly the same range. This is because θ descends quickly on small ranges and slowly on large ranges, and so it oscillates inefficiently down to the optimum when the feature scales are very uneven.

Feature Scaling: divide each input value by the range of that feature, i.e. x_i := x_i / (max − min), so every feature falls within a unit-wide interval.
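A minimal sketch of range-based feature scaling with NumPy; the feature matrix `X` and its values are made-up examples:

```python
import numpy as np

# Hypothetical data: each row is a sample, each column a feature
# (e.g. house size in square feet, number of bedrooms).
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 4.0]])

# Divide each feature by its range (max - min), computed per column,
# so every feature spans an interval of width 1.
ranges = X.max(axis=0) - X.min(axis=0)
X_scaled = X / ranges
```

After this step each column of `X_scaled` spans exactly one unit, so no single feature dominates the gradient updates.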

Mean Normalization: subtract the average value μ_i of a feature from each of its values, then divide by s_i:

x_i := (x_i − μ_i) / s_i

where s_i can be either the range (max − min) or the standard deviation of the feature.
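A sketch of mean normalization showing both choices of s_i, again on a made-up feature matrix `X`:

```python
import numpy as np

# Hypothetical data: rows are samples, columns are features.
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 4.0]])

# Per-feature mean mu_i, computed down each column.
mu = X.mean(axis=0)

# Option 1: s_i is the range (max - min) of the feature.
s_range = X.max(axis=0) - X.min(axis=0)
X_norm = (X - mu) / s_range

# Option 2: s_i is the standard deviation of the feature
# (this variant is often called standardization).
s_std = X.std(axis=0)
X_std = (X - mu) / s_std
```

Either way, every feature ends up centered at zero; with the standard-deviation variant each feature also has unit variance.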
