Machine Learning (林轩田) Chapter 8: Noise & Error


8-1: VC Bound works under noise
[P(x,y) - joint probability; P(y|x) - target distribution]
[About PLA/Pocket (Pocket = modified PLA): http://www.jianshu.com/p/9e4f4bb27476]
As long as the examples are still drawn i.i.d. from P(x, y), the VC bound works under noise: A can make Ein small while keeping Ein ≈ Eout.
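For reference, a rough restatement of the noisy setting and the bound the lecture relies on (notation written out here, not copied verbatim from the slides):

```latex
% With noise, labels come from a target distribution P(y|x) instead of a
% deterministic f; examples (x_n, y_n) are drawn i.i.d. from P(x, y).
E_{\text{in}}(h)  = \frac{1}{N}\sum_{n=1}^{N} \big[\!\big[\, h(x_n) \ne y_n \,\big]\!\big], \qquad
E_{\text{out}}(h) = \mathop{\mathbb{E}}_{(x,y)\sim P} \big[\!\big[\, h(x) \ne y \,\big]\!\big]

% The VC bound keeps the same form, so a small E_in still implies a small E_out:
\mathbb{P}\big[\, \exists h \in \mathcal{H}: |E_{\text{in}}(h) - E_{\text{out}}(h)| > \epsilon \,\big]
  \le 4\, m_{\mathcal{H}}(2N)\, \exp\!\big(-\tfrac{1}{8}\epsilon^{2} N\big)
```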

8-2: Error Measure
Three properties of the error measure E(g, f): out of sample (evaluated on unseen x); pointwise (evaluated on each point); classification (right or wrong on each point, i.e. 0/1 error).
err: the pointwise error measure (we mainly discuss pointwise err from here on).
If there is noise, the deterministic target value f(x_n) is replaced by a label y drawn from P(y | x_n).

Two important pointwise error measures:
0/1 error: err(g(x), y) = [g(x) ≠ y], used for classification
squared error: err(g(x), y) = (g(x) - y)^2, used for regression

How does err 'guide' machine learning? Each err has its own ideal mini-target f(x):
For 0/1 error: f(x) = argmax_y P(y|x), the most likely label
For squared error: f(x) = Σ_y y · P(y|x), the conditional mean of y
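A minimal Python sketch that checks these two mini-targets numerically; the conditional distribution P(y|x) below is made up purely for illustration:

```python
import numpy as np

# Hypothetical conditional distribution P(y|x) for one fixed x, over
# labels y in {1, 2, 3}; the numbers are chosen only for illustration.
ys = np.array([1.0, 2.0, 3.0])
p  = np.array([0.2, 0.7, 0.1])   # P(y=1|x), P(y=2|x), P(y=3|x)

def expected_01_error(y_hat):
    # E_y[ [y_hat != y] ] = 1 - P(y = y_hat | x)
    return np.sum(p * (ys != y_hat))

def expected_squared_error(y_hat):
    # E_y[ (y_hat - y)^2 ]
    return np.sum(p * (ys - y_hat) ** 2)

# Ideal mini-target for 0/1 error: the most likely label, argmax_y P(y|x)
f_01 = ys[np.argmax(p)]

# Ideal mini-target for squared error: the conditional mean, sum_y y * P(y|x)
f_sq = np.sum(ys * p)

print("0/1 error:     f(x) =", f_01, " expected err =", expected_01_error(f_01))
print("squared error: f(x) =", f_sq, " expected err =", expected_squared_error(f_sq))

# Sanity check: these predictions are no worse than any alternative.
assert all(expected_01_error(f_01) <= expected_01_error(c) for c in ys)
assert all(expected_squared_error(f_sq) <= expected_squared_error(c) + 1e-12
           for c in np.linspace(0.5, 3.5, 301))
```

The argmax prediction minimizes the expected 0/1 error, while the probability-weighted mean minimizes the expected squared error, matching the two formulas above.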

Actually, the extended VC theory/'philosophy' works for most hypothesis sets H and most error measures err.

8-3: Choice of Error Measure
Two types of error: false accept (a y = -1 example accepted as +1) and false reject (a y = +1 example rejected as -1).

Which one costs more depends on the application: in the lecture's fingerprint example, a supermarket mostly cares about false rejects (annoyed customers), while the CIA cares far more about false accepts. So err is application/user-dependent.

Algorithmic error measures êrr: ideally the user tells us the true err they need; when that is not possible, A optimizes an alternative êrr that is either plausible or friendly (a sketch of the 'friendly' case follows this list).
- true: just err itself
- plausible:
  - 0/1: minimizes 'flipping noise', but the exact minimum is really hard (NP-hard) to find
  - squared: minimizes Gaussian noise
- friendly: easy for A to optimize, i.e. to make Ein small, e.g.
  - closed-form solution, very easy to solve
  - convex objective function, can be minimized via derivatives
  - more ...
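As a small illustration of 'friendly' (a sketch under assumptions, not part of the course notes themselves): with squared error and a linear hypothesis, minimizing Ein has a closed-form solution via the pseudo-inverse, which 0/1 error does not offer. The toy data here is randomly generated:

```python
import numpy as np

# Sketch: why squared error is an algorithmically "friendly" choice.
# Minimizing the squared E_in over linear hypotheses w has a closed-form
# solution (pseudo-inverse), whereas exactly minimizing 0/1 error would
# require a hard combinatorial search.
rng = np.random.default_rng(0)
N, d = 100, 3
X = np.c_[np.ones(N), rng.normal(size=(N, d))]   # add constant feature x0 = 1
w_true = rng.normal(size=d + 1)
y = X @ w_true + 0.1 * rng.normal(size=N)        # noisy linear targets (toy data)

# Closed-form minimizer of the squared in-sample error:
#   w_lin = argmin_w (1/N) * ||X w - y||^2  =  pinv(X) y
w_lin = np.linalg.pinv(X) @ y

E_in_sq = np.mean((X @ w_lin - y) ** 2)
print("closed-form w:", np.round(w_lin, 3))
print("squared E_in :", E_in_sq)
```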

8-4: Weighted Classification
Naive thoughts:
PLA: if the data is linearly separable, the weights do not matter, since PLA reaches Ein = 0 anyway.
Pocket: change the w kept in the pocket whenever a w with a smaller weighted Ein is found; but the original Pocket guarantee is about 0/1 Ein, so this modification needs justification.

Connect E_in^w (weighted) and E_in^{0/1}: a weighted example can be treated as if it were virtually copied according to its weight.
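Concretely, with the lecture's CIA-style cost matrix (1000 for a false accept, 1 for a false reject), the weighted in-sample error and the virtual-copying equivalence look roughly like this:

```latex
% Weighted in-sample error: cost 1000 on false accepts (y = -1 misclassified)
% and cost 1 on false rejects (y = +1 misclassified).
E_{\text{in}}^{w}(h) = \frac{1}{N}\sum_{n=1}^{N}
  \begin{cases}
    1    \cdot \big[\!\big[\, h(x_n) \ne y_n \,\big]\!\big], & y_n = +1 \\
    1000 \cdot \big[\!\big[\, h(x_n) \ne y_n \,\big]\!\big], & y_n = -1
  \end{cases}

% Equivalence: up to normalization, this equals the plain 0/1 error E_in^{0/1}
% on a virtual data set where every (x_n, y_n = -1) example is copied 1000 times,
% so the original Pocket guarantee carries over to the weighted setting.
```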


Weighted PLA/Pocket: instead of physically copying the y = -1 examples 1000 times, increase the probability of visiting a y = -1 example to 1000 times that of a y = +1 example, then keep iterating and updating w as usual, comparing candidates by their weighted Ein.
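A minimal sketch of this idea (the toy data and the 1000:1 weighting are assumptions for illustration, not the course's reference implementation):

```python
import numpy as np

# Weighted Pocket sketch: visit y = -1 examples 1000x more often, and keep
# in the "pocket" the weights with the smallest *weighted* E_in.
rng = np.random.default_rng(1)

# Toy, not-quite-separable data (purely illustrative).
N = 200
X = np.c_[np.ones(N), rng.normal(size=(N, 2))]           # x0 = 1 bias feature
y = np.sign(X[:, 1] - 0.5 * X[:, 2] + 0.3 * rng.normal(size=N))

cost = np.where(y == -1, 1000.0, 1.0)                    # 1000x penalty on y = -1 mistakes

def weighted_ein(w):
    mistakes = np.sign(X @ w) != y
    return np.sum(cost * mistakes) / np.sum(cost)

# Sampling distribution proportional to the weights:
# each y = -1 example is 1000x more likely to be visited.
visit_prob = cost / np.sum(cost)

w = np.zeros(X.shape[1])
w_pocket, best = w.copy(), weighted_ein(w)

for _ in range(2000):
    n = rng.choice(N, p=visit_prob)                      # weighted "random visit"
    if np.sign(X[n] @ w) != y[n]:
        w = w + y[n] * X[n]                              # ordinary PLA update
        e = weighted_ein(w)
        if e < best:                                     # pocket keeps the best weighted E_in
            w_pocket, best = w.copy(), e

print("best weighted E_in:", best)
```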
