AI Notes Week 10: Machine Learning

This week you should finish Lesson 7, Machine Learning, and read Chapters 18.6-18.11 and 20.3 in Russell & Norvig.

Assignment 4: Decision Trees
Due: October 29 at 11:59PM UTC-12 (Anywhere on Earth time)


Boosting

[Figure: Boosting]
[Figure: Boosting example]
  • Open question: how are the round errors e2 and e3 in this example calculated? (See the AdaBoost sketch at the end of this section.)


[Figure: Boosting example (continued)]
[Figure: Boosting quiz]

Neural nets

[Figure: Neural nets]
[Figure: A simple equation can do generalized computation]

Quiz: Neural Nets Quiz

[Figure: Neural nets quiz]

Fill in the truth table for NOR and find weights such that:

a = true if w0 + i1·w1 + i2·w2 > 0, otherwise false

Truth table
Enter 1 for True, and 0 (or leave blank) for False in each cell.
All combinations of i1 and i2 must be specified.
Weights
Each weight must be a number between 0.0 and 1.0, accurate to one or two decimal places.
w1 and w2 are the input weights corresponding to i1 and i2 respectively.
w0 is the bias weight.
Activation function
Choose the simplest activation function that can be used to capture this relationship.
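A small check of candidate weights against the NOR truth table, assuming the step activation defined above. The candidate at the end is purely illustrative (it uses negative input weights, which a strict > 0 threshold requires for NOR), not necessarily the quiz's intended answer:

```python
def activate(w0, w1, w2, i1, i2):
    # Step activation from the quiz: output true iff the weighted sum exceeds 0.
    return 1 if w0 + i1 * w1 + i2 * w2 > 0 else 0

def is_nor(w0, w1, w2):
    # NOR is true only when both inputs are false.
    truth_table = {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 0}
    return all(activate(w0, w1, w2, i1, i2) == out
               for (i1, i2), out in truth_table.items())

# Hypothetical candidate: positive bias, negative input weights.
print(is_nor(0.5, -1.0, -1.0))  # True
```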

Multilayer Nets

[Figure: Multilayer nets]
  • Neural nets only make sense when the activation functions are nonlinear. If they are linear, the whole network can be reduced to a single linear function, losing the expressive power of the network (a quick demonstration follows).
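A quick numerical demonstration of that claim; the layer shapes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)   # layer 1: 3 -> 4
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)   # layer 2: 4 -> 2
x = rng.normal(size=3)

# Two linear layers with no nonlinearity in between...
two_layer = W2 @ (W1 @ x + b1) + b2
# ...are exactly one linear layer with W = W2 @ W1 and b = W2 @ b1 + b2.
one_layer = (W2 @ W1) @ x + (W2 @ b1 + b2)

print(np.allclose(two_layer, one_layer))  # True
```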

Perceptron Learning

[Figure: Perceptron learning]
  • A single-layer perceptron can only generate linear decision boundaries.
[Figure: Comparison between decision trees and perceptrons]
  • The performance of perceptrons is not always better than that of other methods (e.g., decision trees); it can, however, be improved by adding more layers. The learning rule itself is sketched below.
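A minimal sketch of the classic perceptron learning rule, trained on hypothetical OR-gate data; the weights move only when a prediction is wrong:

```python
def train_perceptron(data, epochs=20, lr=0.1):
    """Perceptron learning rule: w <- w + lr * (y - y_hat) * x.

    data -- list of ((i1, i2), label) pairs with labels 0/1.
    """
    w0 = w1 = w2 = 0.0
    for _ in range(epochs):
        for (i1, i2), y in data:
            y_hat = 1 if w0 + i1 * w1 + i2 * w2 > 0 else 0
            err = y - y_hat                      # 0 if correct, +/-1 if wrong
            w0 += lr * err                       # bias input is always 1
            w1 += lr * err * i1
            w2 += lr * err * i2
    return w0, w1, w2

# Learns OR (linearly separable) but would never converge on XOR.
or_data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
print(train_perceptron(or_data))
```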

Multilayer Perceptrons

[Figure: Multilayer perceptrons]

Back-Propagation

[Figure: Back-propagation]
  • Back-propagation is the standard way to train neural nets: it computes each weight's contribution to the output error by propagating the error backward through the layers.
  • The harder a problem is, the longer the algorithm takes to converge. The figures below show examples of how fast it converges; a minimal training sketch follows them.
[Figure: Convergence examples]
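A minimal back-propagation sketch: a one-hidden-layer sigmoid network trained on XOR by gradient descent. The hidden width, learning rate, and iteration count are arbitrary choices, not values from the lecture:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR: not linearly separable, so a single perceptron cannot learn it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden (4 units)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output
lr = 1.0

for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: error signal at the output, propagated to the hidden layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent weight updates.
    W2 -= lr * (h.T @ d_out);  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);    b1 -= lr * d_h.sum(axis=0)

print(out.round(2).ravel())  # typically approaches [0, 1, 1, 0]
```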

Deep Learning

  • Neural nets have limitations: they need significant computational power and large training sets, and even then they can be limited in the types of problems they suit.

Unsupervised Learning

  • Sometimes described as classification without labels: an unsupervised learning algorithm divides the data into sub-classes and figures out which class each case fits in.

k-Means and EM

[Figure: k-Means]
  • k-Means starts by randomly initializing the means and generating decision boundaries that separate the data set. The means of the separated groups are then recalculated, and new decision boundaries are generated to classify the data again. The process repeats until the classification no longer changes (a minimal sketch follows this list).
  • This is the expectation-maximization (EM) procedure.
  • For data that converges slowly, or to avoid local optima, the random-restart technique can be used.
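A bare-bones k-means implementation of that loop; the 2-D points are made up for illustration:

```python
import random

def k_means(points, k, iters=100):
    """Plain k-means: alternate nearest-mean assignment and mean recomputation."""
    means = random.sample(points, k)             # random initial means
    for _ in range(iters):
        # Assignment step: each point goes to its nearest mean.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, means[j])))
            clusters[j].append(p)
        # Update step: move each mean to the centroid of its cluster.
        new_means = [tuple(sum(c) / len(c) for c in zip(*cl)) if cl else means[j]
                     for j, cl in enumerate(clusters)]
        if new_means == means:                   # converged: assignments stable
            break
        means = new_means
    return means

# Hypothetical 2-D data with two obvious clumps.
pts = [(0, 0), (0, 1), (1, 0), (9, 9), (9, 10), (10, 9)]
print(sorted(k_means(pts, 2)))
```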

EM and Mixture of Gaussians

[Figure: EM and mixture of Gaussians]
  • Instead of plain means, we can fit k Gaussians with the EM procedure to do the classification; a 1-D sketch follows.
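A minimal EM sketch for a mixture of two 1-D Gaussians; the initialization strategy and the sample data are naive placeholders:

```python
import math

def em_two_gaussians(xs, iters=50):
    """EM for a 1-D mixture of two Gaussians (soft version of k-means)."""
    mu = [min(xs), max(xs)]                      # crude initial means
    sigma = [1.0, 1.0]
    mix = [0.5, 0.5]                             # mixing weights
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in xs:
            p = [mix[j] / (sigma[j] * math.sqrt(2 * math.pi))
                 * math.exp(-((x - mu[j]) ** 2) / (2 * sigma[j] ** 2))
                 for j in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate parameters from the soft assignments.
        for j in range(2):
            nj = sum(r[j] for r in resp)
            mu[j] = sum(r[j] * x for r, x in zip(resp, xs)) / nj
            sigma[j] = math.sqrt(sum(r[j] * (x - mu[j]) ** 2
                                     for r, x in zip(resp, xs)) / nj) or 1e-6
            mix[j] = nj / len(xs)
    return mu, sigma, mix

# Hypothetical sample drawn from clumps near 0 and 10.
data = [-0.5, 0.0, 0.3, 0.8, 9.2, 9.9, 10.1, 10.6]
print(em_two_gaussians(data))
```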

Readings on EM and Mixture Models

  • AIMA: Chapter 20.3
  • PRML: Chapter 9.0-9.2 Mixture Models and EM
  • *PRML = Pattern Recognition and Machine Learning, Christopher Bishop
Research articles
  • Using GPS to Learn Significant Locations and Predict Movement Across Multiple Users, Daniel Ashbrook and Thad Starner
2017-11-01 First draft
