Lecture 1: Machine Learning by 李宏毅 (Hung-yi Lee)


Regression

Step 1: Model

y = b + w · x_cp, a linear model with weight w and bias b; x_cp is the input feature (in the lecture's example, a Pokémon's current CP).

Step 2: Goodness of function

Loss function L: measures how bad a function is (the larger the loss, the worse the function).
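One concrete form is the squared-error loss used for this regression setup (the superscript n indexing training examples is my notation, not from the original note):

$$
L(w, b) = \sum_{n=1}^{N} \left( \hat{y}^n - \left( b + w \cdot x_{cp}^n \right) \right)^2
$$

Here $\hat{y}^n$ is the true target value of the n-th training example, so the loss sums the squared prediction errors over the whole training set.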


Step 3: Best function

Pick the function that minimizes L: the best function is the one with the smallest loss.
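In symbols (standard arg-min notation, added here for clarity):

$$
f^* = \arg\min_{f} L(f), \qquad
w^*, b^* = \arg\min_{w,\, b} L(w, b)
$$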


[Figure 1]

Gradient descent: an iterative way to minimize the loss; each step updates the parameters using the partial derivatives (the gradient) of the loss.
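A minimal sketch of gradient descent for the one-feature linear model above. The data values, learning rate, and number of steps are illustrative assumptions, not numbers from the lecture:

```python
import numpy as np

# Illustrative training data (made-up values): x is the input feature,
# y_true is the target we want to predict.
x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
y_true = np.array([25.0, 45.0, 62.0, 85.0, 105.0])

w, b = 0.0, 0.0   # initial parameters
lr = 1e-4         # learning rate (assumed value)

for step in range(10_000):
    y_pred = b + w * x
    error = y_pred - y_true
    # Partial derivatives of the squared-error loss sum((y_true - (b + w*x))**2)
    grad_w = 2.0 * np.sum(error * x)
    grad_b = 2.0 * np.sum(error)
    # Move each parameter a small step in the negative gradient direction
    w -= lr * grad_w
    b -= lr * grad_b

print(f"learned w = {w:.3f}, b = {b:.3f}")
```

The learning rate controls the step size: too large and the updates diverge, too small and convergence is slow.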

[Figure 2]
Linear regression does not need to worry about this issue: its loss is convex, so gradient descent cannot get stuck in a bad local minimum.
[Figure 3]

Overfitting:A more complex model does not always lead to better performance on testing data.
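"More complex" here means, for instance, adding higher-order terms of the same input (the degrees below are illustrative):

$$
y = b + w_1 x_{cp} + w_2 x_{cp}^2 + \cdots + w_k x_{cp}^k
$$

A higher-degree model can always fit the training data at least as well, but past some degree the testing error starts to grow, which is the overfitting effect described above.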

Hidden factor: some other input (in the lecture's Pokémon example, the species) also affects the output →

Back to Step 1: redesign the model.
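One sketch of such a redesign, giving each value of the hidden factor (each species) its own linear function; the indicator notation δ(·) is mine:

$$
y = \sum_{s} \delta(x_s = s)\,\big(b_s + w_s \cdot x_{cp}\big)
$$

where δ(x_s = s) is 1 when the example belongs to species s and 0 otherwise, so exactly one per-species linear function is active for each input.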

Back to Step 2: regularization.
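Regularization adds a penalty on the weights to the loss; λ controls how strongly smooth functions are favored (the squared-weight penalty below is the standard choice for this):

$$
L = \sum_{n} \left( \hat{y}^n - \left( b + \sum_i w_i x_i^n \right) \right)^2 + \lambda \sum_i w_i^2
$$

The bias b is typically left out of the penalty, since shifting the function up or down does not affect its smoothness.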

The smoother the function, the less sensitive it is to noise. A smaller error on the training data does not necessarily mean a smaller error on the testing data.


[Figure 4]
