Advantages:
Simple and easy to implement
Disadvantages:
Only converges when the data are linearly separable, and the resulting separating hyperplane is not unique
If the data are linearly separable and the task is binary classification, the relationship between input and output can be represented by the following function model:

$$f(x) = \operatorname{sign}(w \cdot x + b)$$
The sum of the distances from all misclassified points to the hyperplane is taken as the cost function:

$$-\frac{1}{\|w\|} \sum_{x_i \in M} y_i (w \cdot x_i + b)$$

Dropping the constant factor $\frac{1}{\|w\|}$, we obtain the perceptron cost function:

$$L(w, b) = -\sum_{x_i \in M} y_i (w \cdot x_i + b)$$

where $M$ is the set of misclassified points.
Note: Li Hang's book uses L(w, b) to denote the cost function, while Ng's tutorial uses J(·).
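The code below minimizes this cost by stochastic gradient descent: at each step one sample is drawn at random, and the parameters are updated only if that sample is misclassified. As a worked step (a sketch following the standard derivation, which the text above skips), the gradients over the misclassified set and the single-sample update with learning rate $\eta$ are:

$$\nabla_w L(w, b) = -\sum_{x_i \in M} y_i x_i, \qquad \nabla_b L(w, b) = -\sum_{x_i \in M} y_i$$

$$w \leftarrow w + \eta\, y_i x_i, \qquad b \leftarrow b + \eta\, y_i \quad \text{for a misclassified } (x_i, y_i)$$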
# @Author: Tianze Tang
# @Date: 2017-07-10
# @Email: [email protected]
# @Last modified by: Tianze Tang
# @Last modified time: 2017-07-10
import numpy as np
import matplotlib.pyplot as plt
import random
class Perceptron(object):
    def __init__(self):
        self.learning_step = 1
        self.max_iteration = 100000

    # f(x) = sign(w*x + b); labels are +1 / -1
    def sign(self, x):
        if x >= 0:
            return 1
        else:
            return -1

    # w*x + b
    def threshold(self, w, b, x):
        result = np.dot(w, x) + b
        return result

    # train: stochastic gradient descent over randomly drawn samples
    def train(self, x, y):
        w = np.zeros(len(x[0]))
        b = 0
        i = 0
        index = len(x)
        while i < self.max_iteration:
            # pick a random training sample
            random_number = random.randint(0, index - 1)
            # update only if the sample is misclassified
            if y[random_number] * self.threshold(w, b, x[random_number]) <= 0:
                w = w + self.learning_step * y[random_number] * x[random_number]
                b = b + self.learning_step * y[random_number]
            i = i + 1
        return w, b
# Toy data from Li Hang's book: two positive points and one negative point
x = np.array([[3, 3], [4, 3], [1, 1]], dtype=int)
y = np.array([1, 1, -1])
plt.plot([3, 4], [3, 3], 'rx')
plt.plot([1], [1], 'b*')
plt.axis([0, 6, 0, 6])

test = Perceptron()
w, b = test.train(x, y)

# Separating hyperplane w*x + b = 0: solve for x2 when x1 = 1, and for x1 when x2 = 1
x2_at_x1 = (-b - w[0] * 1) / w[1]   # x2 coordinate of the line where x1 = 1
x1_at_x2 = (-b - w[1] * 1) / w[0]   # x1 coordinate of the line where x2 = 1
plt.plot([1, x1_at_x2], [x2_at_x1, 1], 'g')
plt.show()
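As a quick sanity check (not part of the original script; it assumes the Perceptron instance `test` and the data `x`, `y` defined above), one can verify that the learned w and b classify all three training points correctly:

# every training point should satisfy sign(w*x + b) == y
for xi, yi in zip(x, y):
    prediction = test.sign(test.threshold(w, b, xi))
    print(xi, yi, prediction, prediction == yi)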
[1] Li Hang, Statistical Learning Methods (《统计学习方法》), Chapter 2: implementing the perceptron model in Python (MNIST dataset)
[2] Matplotlib Pyplot tutorial