LDA (Linear Discriminant Analysis)

Linear Discriminant Analysis (LDA) is similar to PCA, but it focuses on maximizing the separability among known categories.

Differences between PCA and LDA

  • Both create new axes and rank them by importance
  • Both try to reduce dimensions
    • PCA chooses the axes with the most variation, which can perform poorly at separating the categories.
    • LDA focuses on maximizing the separability of the categories.
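A minimal sketch of this contrast, assuming scikit-learn is available: two elongated, parallel clusters are built so that the high-variance direction is *not* the direction that separates the classes. PCA picks the variance direction, while LDA uses the labels to pick the separating one.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Two classes, both stretched along x (std 5) but thin along y (std 0.5),
# offset from each other only in y: variance says "x", separability says "y".
X0 = rng.normal(size=(100, 2)) * [5.0, 0.5] + [0, -2]
X1 = rng.normal(size=(100, 2)) * [5.0, 0.5] + [0, 2]
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

pca = PCA(n_components=1).fit(X)                       # unsupervised: ignores y
lda = LinearDiscriminantAnalysis(n_components=1).fit(X, y)  # supervised: uses y

print("PCA axis:", pca.components_[0])   # dominated by the x (high-variance) direction
print("LDA axis:", lda.scalings_[:, 0])  # dominated by the y (separating) direction
```

Projecting onto the PCA axis mixes the two classes together; projecting onto the LDA axis keeps them apart, which is exactly the difference the bullets above describe.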

A simple example: reducing 2-D data to 1-D

LDA creates the new axis according to two criteria, considered simultaneously.

  • maximize the distance between the two means (one for each category)
  • at the same time, minimize the scatter within each category, i.e. the variation of the points around their own category mean.

Considering both the means and the scatter together usually gives a good result.
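The two-class criterion above can be computed from scratch with NumPy. This is a sketch of Fisher's classic solution: the best 1-D axis is w proportional to Sw⁻¹(m1 − m0), where Sw is the within-class scatter, which maximizes the distance between the projected means relative to the projected scatter. (The data here is synthetic, just for illustration.)

```python
import numpy as np

rng = np.random.default_rng(1)
X0 = rng.normal(loc=[0, 0], scale=1.0, size=(200, 2))  # category 0
X1 = rng.normal(loc=[4, 1], scale=1.0, size=(200, 2))  # category 1

m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
# Within-class scatter: sum of (unnormalized) per-class covariances.
Sw = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
w = np.linalg.solve(Sw, m1 - m0)   # Fisher direction: Sw^{-1} (m1 - m0)
w /= np.linalg.norm(w)

# Separation of the projected means relative to the projected scatter.
sep = abs((X1 @ w).mean() - (X0 @ w).mean()) / np.sqrt((X0 @ w).var() + (X1 @ w).var())
print("separation along w:", sep)
```

A large separation value means the two criteria were both satisfied: projected means far apart, projected scatter small.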

OK, now consider LDA with 3 categories. What changes?

  • first, find the central point of all the data points.
  • then measure the distance between this main central point and the central point of each of the three categories.
  • next, maximize the distances between each category mean and the main central point while minimizing the scatter (the dispersion) within each category.

To summarize: LDA creates a new axis that maximizes the distances between the category means while minimizing the scatter within each category.
