Support Vector Machines: the Hard-Margin Support Vector Machine

To make the steps of the hard-margin SVM easier to follow, this post works through binary classification for input dimensions d = 1 and d = 2, implemented in MATLAB with quadprog.
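For reference (the standard textbook formulation, which the code below implements), the hard-margin SVM primal and its Lagrangian dual are:

```latex
% Primal: maximize the margin subject to correct classification
\min_{w,b}\ \tfrac{1}{2}\lVert w\rVert^2
\quad \text{s.t.}\quad y_i\,(w^\top x_i + b) \ge 1,\ i=1,\dots,m

% Dual: the QP that quadprog solves below
\min_{\alpha}\ \tfrac{1}{2}\,\alpha^\top H\,\alpha - \mathbf{1}^\top\alpha,
\qquad H_{ij} = y_i\,y_j\,x_i^\top x_j,
\quad \text{s.t.}\quad y^\top\alpha = 0,\ \ \alpha \ge 0

% Recover the primal solution from the multipliers:
w = \sum_{i=1}^{m} \alpha_i\,y_i\,x_i,
\qquad b = y_s - w^\top x_s \ \ \text{for any support vector } s\ (\alpha_s > 0)
```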

1.  d = 1

x = [-1; 0; 1];       % one-dimensional samples
y = [ 1; -1; -1];     % class labels

[m,n] = size(x);
gscatter(x(:,1),zeros(m,1),y)   % plot the samples along the x-axis

[Figure 1: scatter plot of the d = 1 samples]

%% Build the dual QP: H(i,j) = y_i * y_j * x_i' * x_j

yy = y*y';
xx = x*x';
h = yy.*xx;            % quadratic term of the dual objective
f = -1*ones(m,1);      % linear term: minimize 0.5*a'*h*a - sum(a)
Aeq = y';              % equality constraint: sum_i y_i*alpha_i = 0
Beq = 0;
lb = zeros(m,1);       % alpha_i >= 0
options = optimoptions('quadprog',...
    'Algorithm','interior-point-convex','Display','off');



%% Solve for the Lagrange multipliers alpha

[alpha,fval,exitflag,output,lambda] = ...
    quadprog(h,f,[],[],Aeq,Beq,lb,[],[],options);


%% Compute the weight vector w = sum_i alpha_i*y_i*x_i
w = sum(repmat(alpha.*y,1,n).*x,1);

% Compute the bias b, averaged over the support vectors (alpha_i > 0)
S = find(alpha > 0.001);
b = 0;
for i = 1:length(S)
    b = b + y(S(i)) - w*x(S(i),:)';
end
b = b/length(S)
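As a cross-check (a sketch, not part of the original post), the same dual QP can be solved in Python. SciPy has no direct quadprog equivalent, so this uses the general constrained minimizer scipy.optimize.minimize (SLSQP) on the identical objective and constraints:

```python
import numpy as np
from scipy.optimize import minimize

x = np.array([-1.0, 0.0, 1.0]).reshape(-1, 1)  # same d = 1 samples
y = np.array([1.0, -1.0, -1.0])                # same labels
m = len(y)
H = np.outer(y, y) * (x @ x.T)                 # H(i,j) = y_i*y_j*x_i'*x_j

res = minimize(lambda a: 0.5 * a @ H @ a - a.sum(),  # dual objective
               np.zeros(m),
               constraints={'type': 'eq', 'fun': lambda a: a @ y},
               bounds=[(0, None)] * m)          # alpha_i >= 0
alpha = res.x
w = (alpha * y) @ x                             # weight vector
S = alpha > 1e-3                                # support vectors
b = np.mean(y[S] - x[S] @ w)                    # bias, averaged over SVs
print(w, b)                                     # w ≈ [-2], b ≈ -1
```

The multipliers of the two support vectors x = -1 and x = 0 come out equal (≈ 2 each), and the decision boundary sits at x = -b/w = -0.5, halfway between them.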


figure
gscatter(x(:,1),zeros(m,1),y)
hold on

%% Draw the line y = w*x + b; it crosses zero at the decision
% boundary x = -b/w
refline(w,b)


[Figure 2: d = 1 samples with the line w*x + b]


2. d = 2

x = [1,1; 2,1; 0,6; 1,5];   % two-dimensional samples
y = [1; 1; -1; -1];         % class labels
[m,n] = size(x);
gscatter(x(:,1),x(:,2),y)

The data points to be classified:


[Figure 3: scatter plot of the d = 2 samples]

%% Compute w and b (same dual QP as in the d = 1 case)

yy = y*y';
xx = x*x';
h = yy.*xx;
f = -1*ones(m,1);
Aeq = y';
Beq = 0;
lb = zeros(m,1);
options = optimoptions('quadprog',...
    'Algorithm','interior-point-convex','Display','off');




[alpha,fval,exitflag,output,lambda] = ...
quadprog(h,f,[],[],Aeq,Beq,lb,[],[],options);




w = sum(repmat(alpha.*y,1,n).*x,1);   % weight vector
S = find(alpha > 0.001);              % support vector indices
b = 0;
for i = 1:length(S)
    b = b + y(S(i)) - w*x(S(i),:)';
end
b = b/length(S)
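The d = 2 case can be cross-checked the same way (again a sketch using scipy.optimize.minimize in place of quadprog, since the QP and its constraints are identical):

```python
import numpy as np
from scipy.optimize import minimize

x = np.array([[1.0, 1.0], [2.0, 1.0], [0.0, 6.0], [1.0, 5.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
m = len(y)
H = np.outer(y, y) * (x @ x.T)                 # Gram matrix scaled by labels

res = minimize(lambda a: 0.5 * a @ H @ a - a.sum(),
               np.zeros(m),
               constraints={'type': 'eq', 'fun': lambda a: a @ y},
               bounds=[(0, None)] * m)
alpha = res.x
w = (alpha * y) @ x                             # weight vector (2-D)
S = alpha > 1e-3
b = np.mean(y[S] - x[S] @ w)                    # bias over support vectors
print(w, b)                                     # w ≈ [0, -0.5], b ≈ 1.5
```

The resulting separating line w'*x + b = 0 is horizontal (x2 = 3), midway between the closest opposite-class points (1,1) and (1,5).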


%% Plot the decision boundary w'*x + b = 0 and the margins w'*x + b = +/-1

figure
gscatter(x(:,1),x(:,2),y)
px  = 0:0.001:6;
py0 = (-b   - w(1,1)*px)/w(1,2);   % decision boundary
py1 = (-1-b - w(1,1)*px)/w(1,2);   % margin w'*x + b = -1
py2 = ( 1-b - w(1,1)*px)/w(1,2);   % margin w'*x + b = +1

hold on
plot(px,py0,'k')
plot(px,py1,'r')
plot(px,py2,'r')

[Figure 4: d = 2 samples with the decision boundary and margins]
