Bayesian Linear Regression

Notes on the Bayesian Linear Regression YouTube video

  • Why not use MLE?  Overfitting!
  • How to solve?
    • Why not use MAP? No representation of our uncertainty in w and Y.
[Figure 1: MLE, the blue point is the prediction]
[Figure 2: Bayesian linear regression, the green lines indicate uncertainty]
  • Why Bayesian? It lets us optimize the loss function.
    • Predictive distribution: gives us p(y|x,D), which is what we really want (see the formula after this list).
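
For completeness, the predictive distribution mentioned above is obtained by integrating out w against its posterior; this standard identity is not spelled out in the original notes:

```latex
p(y \mid x, D) = \int p(y \mid x, w)\, p(w \mid D)\, dw
```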

Problem setup

D=((x_{1},y_{1}),...,(x_{n},y_{n})),x_{i}\in \mathbb{R}^{d},y_{i}\in \mathbb{R}

Model: y_{1},...,y_{n} are independent given w, with y_{i}\sim N(w^{T}x_{i},a^{-1}), where a = \frac{1}{\sigma^{2}} is the noise precision and \sigma^{2} is the noise variance.

Prior: w\sim N(0,b^{-1}I), b>0, with w=(w_{1},...,w_{d}). Assume a and b are known, so the only unknown parameter is \theta=w.
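
As a concrete illustration of this generative model, here is a minimal sketch in NumPy that draws w from the prior and then samples the y_{i}; the values of n, d, a, and b are placeholder assumptions, not taken from the notes.

```python
import numpy as np

# Assumed sizes and hyperparameters (placeholders, not from the notes):
# n = 50 data points, d = 2 features, noise precision a = 25, prior precision b = 1.
rng = np.random.default_rng(0)
n, d = 50, 2
a, b = 25.0, 1.0

# Draw the weights from the prior w ~ N(0, b^{-1} I).
w = rng.normal(0.0, np.sqrt(1.0 / b), size=d)

# Arbitrary inputs x_i in R^d.
X = rng.uniform(-1.0, 1.0, size=(n, d))

# y_i ~ N(w^T x_i, a^{-1}), independent given w.
y = X @ w + rng.normal(0.0, np.sqrt(1.0 / a), size=n)
```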

Remark: replace x_{i} by \varphi(x_{i}) = (\varphi_{1}(x_{i}),...,\varphi_{k}(x_{i})), i.e. capture non-linearities in x through basis functions (a small sketch follows).
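
One common choice of basis functions is a polynomial feature map; the sketch below is only an illustration, and the degree k = 4 is an arbitrary assumption.

```python
import numpy as np

# Polynomial feature map phi(x) = (1, x, x^2, ..., x^{k-1}) for scalar inputs x.
def phi(x, k=4):
    x = np.asarray(x, dtype=float)
    # One column per basis function; one row per data point.
    return np.vander(x, N=k, increasing=True)

# Example: three scalar inputs mapped to k = 4 features each.
features = phi([0.5, 1.0, 2.0])   # shape (3, 4)
```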

Likelihood:

p(D|w)\propto \exp(-\frac{a}{2}(y-Aw)^{T}(y-Aw))

where A = \begin{pmatrix} x_{1}^{T} \\ \vdots \\ x_{n}^{T} \end{pmatrix} is the "design matrix" (its i-th row is x_{i}^{T}) and y=(y_{1},...,y_{n})^{T}.
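
To make the notation concrete, here is a minimal sketch that treats the rows of X as the x_{i}^{T} (so X itself plays the role of A) and evaluates the unnormalized log-likelihood; the function name and arguments are my own, not from the notes.

```python
import numpy as np

# Unnormalized log-likelihood: log p(D|w) = -a/2 * ||y - A w||^2 + const.
def log_likelihood_unnormalized(w, X, y, a):
    A = np.asarray(X, dtype=float)    # design matrix: row i is x_i^T
    resid = y - A @ w                 # residual vector y - A w
    return -0.5 * a * resid @ resid   # -a/2 (y - Aw)^T (y - Aw)
```

With the X, y, w, and a from the earlier sampling sketch, log_likelihood_unnormalized(w, X, y, a) returns a scalar that grows as the fit improves.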

 
