2019-01-06[Stay Sharp] AdaBoost

What is AdaBoost?

AdaBoost, short for Adaptive Boosting, is a machine-learning meta-algorithm formulated by Yoav Freund and Robert Schapire, winners of the 2003 Gödel Prize. It combines several other learning algorithms (weak learners) into a weighted sum that forms the boosted classifier, improving overall performance. It is often written as:

$$F(x) = \sum_{m=1}^{M} \alpha_m h_m(x)$$

where $h_m$ stands for the $m$-th weak learner and $\alpha_m$ is the corresponding weight.
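As a quick illustration of this weighted-sum ensemble in practice, here is a minimal sketch using scikit-learn's `AdaBoostClassifier` (the synthetic dataset and hyperparameters are my own illustrative choices, not from the original post):

```python
# Minimal AdaBoost example with scikit-learn.
# By default the weak learners h_m are depth-1 decision trees (stumps),
# and their weighted votes form the boosted classifier F(x).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = AdaBoostClassifier(n_estimators=50, random_state=0)  # 50 weak learners
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```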

How does AdaBoost work?

[Figure: the AdaBoost training procedure, from Wikipedia]

In short, each AdaBoost iteration:

  • re-trains the weak learner on the training set, with sample weights adjusted according to the errors of the previous round
  • assigns the weak classifier a weight based on its current accuracy.

Among all the equations above, the two most important are the sample-weight update

$$D_{t+1}(i) = \frac{D_t(i)\, e^{-\alpha_t y_i h_t(x_i)}}{Z_t}$$

for step 1, and the classifier weight

$$\alpha_t = \frac{1}{2} \ln \frac{1 - \epsilon_t}{\epsilon_t}$$

for step 2, where $\epsilon_t$ is the weighted error of the weak learner $h_t$ and $Z_t$ is a normalization factor.

Pros and Cons

Pros

  • reduces both bias and variance

Cons

  • sensitive to noisy data and outliers
  • likely to overfit
  • training is inherently sequential: each weak learner depends on the weights from the previous round, so it is hard to parallelize

References

https://medium.com/machine-learning-101/https-medium-com-savanpatel-chapter-6-adaboost-classifier-b945f330af06

https://en.wikipedia.org/wiki/AdaBoost
