Plotting the SVM Separating Hyperplane and What SVC.decision_function() Does

The support vector machine chapter of Li Hang's "Statistical Learning Methods" has a worked example:
given the sample points x1 = (3, 3), x2 = (4, 3), x3 = (1, 1) with labels (1, 1, -1), find the separating hyperplane.

  • First, what decision_function() does: it computes each sample's functional distance to the separating hyperplane.
    • Yes, the functional distance, not the geometric one (dividing it by ||w|| gives the geometric distance; see the book for details).
    • Plugging x1 = (3, 3), x2 = (4, 3), x3 = (1, 1) into the decision function, i.e. the separating hyperplane (1/2)x^(1) + (1/2)x^(2) - 2 = 0, gives the functional distances 1, 1.5, -1. The values 1 and -1 lie exactly on the margin, so x1 and x3 are the support vectors (a small verification sketch follows the code below).
  • Below is the code that computes and plots the separating hyperplane:
"""
=========================================
SVM: Maximum margin separating hyperplane
=========================================

Plot the maximum margin separating hyperplane within a two-class
separable dataset using a Support Vector Machine classifier with
linear kernel.
"""
print(__doc__)

import numpy as np
import matplotlib.pyplot as plt
from sklearn import svm

# the three training points from the textbook example

X = np.array([[3,3],[4,3],[1,1]])
Y = np.array([1,1,-1])

# fit the model
clf = svm.SVC(kernel='linear')
clf.fit(X, Y)

# get the separating hyperplane
w = clf.coef_[0]
a = -w[0] / w[1]
xx = np.linspace(-5, 5)
yy = a * xx - (clf.intercept_[0]) / w[1]

# plot the parallels to the separating hyperplane that pass through the
# support vectors
b = clf.support_vectors_[0]
yy_down = a * xx + (b[1] - a * b[0])
b = clf.support_vectors_[-1]
yy_up = a * xx + (b[1] - a * b[0])

# plot the line, the points, and the nearest vectors to the plane
plt.plot(xx, yy, 'k-')
plt.plot(xx, yy_down, 'k--')
plt.plot(xx, yy_up, 'k--')

plt.scatter(clf.support_vectors_[:, 0], clf.support_vectors_[:, 1],
            s=80, facecolors='none')
plt.scatter(X[:, 0], X[:, 1], c=Y, cmap=plt.cm.Paired)

plt.axis('tight')
plt.show()

# functional distances of the three samples: [ 1.   1.5 -1. ]
print(clf.decision_function(X))
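To tie the code back to the textbook answer, here is a minimal verification sketch (assuming the default SVC with C=1.0 recovers the hard-margin solution on this separable toy set): the learned coefficients should be approximately w = (1/2, 1/2) and b = -2, decision_function is just w·x + b (the functional distance), and dividing by ||w|| converts it into the geometric distance.

import numpy as np
from sklearn import svm

X = np.array([[3, 3], [4, 3], [1, 1]])
Y = np.array([1, 1, -1])

clf = svm.SVC(kernel='linear')
clf.fit(X, Y)

w = clf.coef_[0]       # expected: approximately [0.5, 0.5]
b = clf.intercept_[0]  # expected: approximately -2.0

# decision_function(x) is w . x + b, i.e. the functional distance
manual = X.dot(w) + b
print(manual)                      # [ 1.   1.5 -1. ]
print(clf.decision_function(X))    # same values as the manual computation

# dividing by ||w|| turns the functional distance into the geometric distance
print(manual / np.linalg.norm(w))  # approximately [ 1.41  2.12 -1.41 ]

# x1 and x3 lie exactly on the margin, so they are the support vectors
print(clf.support_)          # indices of the support vectors (x3 and x1)
print(clf.support_vectors_)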

[Figure 1: the plot produced by the code above — the separating hyperplane, the two margin lines, and the circled support vectors]
