Machine Learning: 5.1 Model Combination

@[toc]

Bias & Variance Decomposition

  • Sample a training dataset $D$ from the data-generating process $y = f(x) + \varepsilon$ and learn a model $\hat{f}_D$ from it

  • Evaluate the generalization error on a new data point $(x, y)$:
    $$\begin{aligned} E_{D,\varepsilon}\big[(y-\hat{f}_D(x))^2\big] &= E\Big[\big((f-E_D[\hat{f}_D])-(\hat{f}_D -E_D[\hat{f}_D])+\varepsilon\big)^2\Big]\\ &= (f-E_D[\hat{f}_D])^2+E_D\big[(\hat{f}_D-E_D[\hat{f}_D])^2\big]+\sigma^2\\ &= \mathrm{Bias}[\hat{f}_D]^2+\mathrm{Var}[\hat{f}_D]+\sigma^2 \end{aligned}$$
    The cross terms vanish in expectation, and $\sigma^2 = E[\varepsilon^2]$ is the irreducible noise variance.
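
To make the decomposition concrete, here is a minimal simulation sketch. The setup is assumed, not from the notes: $f(x)=\sin x$, Gaussian noise with $\sigma=0.3$, and a degree-3 polynomial fit as $\hat{f}_D$. It resamples $D$ many times, averages the fitted predictors, and checks that bias², variance, and $\sigma^2$ add up to the measured generalization error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setup (not from the notes): f(x) = sin(x), noise std sigma = 0.3,
# model = degree-3 polynomial fit via np.polyfit.
f = np.sin
sigma = 0.3
degree = 3
n_train, n_datasets = 30, 500
x_test = np.linspace(0, 2 * np.pi, 50)

preds = np.empty((n_datasets, x_test.size))
for d in range(n_datasets):
    # Draw a fresh dataset D from y = f(x) + eps
    x = rng.uniform(0, 2 * np.pi, n_train)
    y = f(x) + rng.normal(0, sigma, n_train)
    coef = np.polyfit(x, y, degree)       # learn f_hat_D
    preds[d] = np.polyval(coef, x_test)   # evaluate on fixed test inputs

f_bar = preds.mean(axis=0)                # E_D[f_hat_D(x)]
bias2 = (f(x_test) - f_bar) ** 2          # Bias[f_hat_D]^2
var = preds.var(axis=0)                   # Var[f_hat_D]

# Generalization error averaged over D and independent label noise
y_test = f(x_test) + rng.normal(0, sigma, (n_datasets, x_test.size))
gen_err = ((y_test - preds) ** 2).mean(axis=0)

print("mean bias^2 :", bias2.mean())
print("mean var    :", var.mean())
print("sigma^2     :", sigma ** 2)
print("mean gen err:", gen_err.mean(), "vs", (bias2 + var + sigma ** 2).mean())
```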

Reduce Bias & Variance

  • Reduce bias
    • A more complex model
      • e.g. increase layers, hidden units of MLP
      • Boosting, Stacking
  • Reduce variance
    • A simpler model
      • e.g. regularization
      • Bagging, Stacking
  • Reduce noise $\sigma^2$
    • Improve data
  • Ensemble learning: train and combine multiple models to improve predictive performance (a minimal bagging sketch follows this list)
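
As an illustration of the variance-reduction route, here is a minimal bagging sketch under an assumed toy setup (not from the notes): bootstrap one training set, fit a full-depth decision tree (low bias, high variance) on each resample with scikit-learn's `DecisionTreeRegressor`, and average the predictions. The averaged model typically gets a lower test MSE than a single tree.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Assumed toy data: y = sin(x) + Gaussian noise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 2 * np.pi, (200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.3, 200)
X_test = np.linspace(0, 2 * np.pi, 100)[:, None]

def bagged_predict(X, y, X_test, n_estimators=50):
    preds = []
    for _ in range(n_estimators):
        idx = rng.integers(0, len(X), len(X))                 # bootstrap resample of D
        tree = DecisionTreeRegressor().fit(X[idx], y[idx])    # low-bias, high-variance learner
        preds.append(tree.predict(X_test))
    return np.mean(preds, axis=0)                             # averaging reduces variance

single = DecisionTreeRegressor().fit(X, y).predict(X_test)
bagged = bagged_predict(X, y, X_test)
truth = np.sin(X_test[:, 0])
print("single-tree MSE:", np.mean((single - truth) ** 2))
print("bagged MSE     :", np.mean((bagged - truth) ** 2))
```

Boosting and stacking follow the same "train and combine" idea but target bias as well: boosting fits each new model to the residual errors of the current ensemble, and stacking learns a second-level model on top of the base models' predictions.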
