Identification Algorithm for the ARMAX Model
Homework
Recursive extended least squares identification
In this section, I focus on the following ARMAX model
$$A(z^{-1})\,y(t) = B(z^{-1})\,u(t) + D(z^{-1})\,v(t),$$
where $\{u(t)\}$ and $\{y(t)\}$ are the input and output series, respectively, $v(t)$ is zero-mean white noise, and $A$, $B$, $D$ are polynomials in the backward shift operator $z^{-1}$:
$$A(z^{-1}) = 1 + a_1 z^{-1} + a_2 z^{-2}, \qquad B(z^{-1}) = b_1 z^{-1} + b_2 z^{-2}, \qquad D(z^{-1}) = 1 + d_1 z^{-1} + d_2 z^{-2},$$
with $u(t) = y(t) = v(t) = 0$ for $t \le 0$.
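Equivalently, the model can be written in regression form (the parameter ordering here matches the estimate vector s in the code below):
$$
y(t) = \varphi(t)^{\top}\theta + v(t), \qquad
\theta = [\,a_1\;\; a_2\;\; b_1\;\; b_2\;\; d_1\;\; d_2\,]^{\top}, \qquad
\varphi(t) = [\,-y(t-1)\;\; -y(t-2)\;\; u(t-1)\;\; u(t-2)\;\; v(t-1)\;\; v(t-2)\,]^{\top}.
$$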
I use the residual-based recursive extended least squares (RELS) algorithm to estimate $\theta$.
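The noise values $v(t-1)$, $v(t-2)$ entering $\varphi(t)$ are not measurable, so the algorithm replaces them with the a-posteriori residuals $\hat{v}$. In standard notation (not part of the original listing), with $\hat{\theta}$, $\hat{\varphi}$, $K$, $P$ corresponding to the code variables s, q, L, p, the recursion implemented below is
$$
\begin{aligned}
\hat{\varphi}(t) &= [\,-y(t-1)\;\; -y(t-2)\;\; u(t-1)\;\; u(t-2)\;\; \hat{v}(t-1)\;\; \hat{v}(t-2)\,]^{\top},\\
K(t) &= \frac{P(t-1)\,\hat{\varphi}(t)}{1+\hat{\varphi}(t)^{\top}P(t-1)\,\hat{\varphi}(t)},\\
\hat{\theta}(t) &= \hat{\theta}(t-1) + K(t)\,\bigl[\,y(t)-\hat{\varphi}(t)^{\top}\hat{\theta}(t-1)\,\bigr],\\
P(t) &= \bigl[\,I - K(t)\,\hat{\varphi}(t)^{\top}\,\bigr]P(t-1),\\
\hat{v}(t) &= y(t) - \hat{\varphi}(t)^{\top}\hat{\theta}(t),
\end{aligned}
$$
started from $\hat{\theta}(0)=0$ and $P(0)=10^{5}I$.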
The MATLAB implementation is as follows:
clc; clear;
t=500;                      % number of samples
u=ones(1,t);                % step input u(k)=1
v=randn(1,t)/100;           % white noise with standard deviation 0.01
% simulate the true system (a1=-1.5, a2=0.7, b1=1, b2=0.5, d1=-1, d2=0.2),
% with u=y=v=0 for k<=0
y=zeros(1,t);
y(1)=v(1);
y(2)=1.5*y(1)+u(1)+v(2)-v(1);
y(3)=1.5*y(2)-0.7*y(1)+u(2)+0.5*u(1)+v(3)-v(2)+0.2*v(1);
for k=4:t
    y(k)=1.5*y(k-1)-0.7*y(k-2)+u(k-1)+0.5*u(k-2)+v(k)-v(k-1)+0.2*v(k-2);
end
% recursive extended least squares (RELS)
% s: parameter estimates [a1 a2 b1 b2 d1 d2]', q: regressors, p: covariance matrix, v1: residuals
s=zeros(6,t);
q=zeros(6,t);
p=100000*eye(6);                 % large initial covariance
v1=zeros(1,t);
v1(1)=y(1);                      % residual at k=1 (regressor and initial estimate are zero)
% k=2: terms with index <=0 are zero
q(1:6,2)=[-y(1) 0 u(1) 0 v1(1) 0]';
L=p*q(1:6,2)/(1+q(1:6,2)'*p*q(1:6,2));
s(1:6,2)=s(1:6,1)+L*(y(2)-q(1:6,2)'*s(1:6,1));
p=(eye(6)-L*q(1:6,2)')*p;
v1(2)=y(2)-q(1:6,2)'*s(1:6,2);
for k=3:t
    q(1:6,k)=[-y(k-1) -y(k-2) u(k-1) u(k-2) v1(k-1) v1(k-2)]';
    L=p*q(1:6,k)/(1+q(1:6,k)'*p*q(1:6,k));              % gain vector
    s(1:6,k)=s(1:6,k-1)+L*(y(k)-q(1:6,k)'*s(1:6,k-1));  % parameter update
    p=(eye(6)-L*q(1:6,k)')*p;                           % covariance update
    v1(k)=y(k)-q(1:6,k)'*s(1:6,k);                      % a-posteriori residual
end
% true parameter values, repeated over time for plotting
a1=-1.5*ones(1,t); a2=0.7*ones(1,t); b1=ones(1,t);
b2=0.5*ones(1,t);  d1=-ones(1,t);    d2=0.2*ones(1,t);
i=1:t;
figure(1)
subplot(311);
plot(i,a1,'b',i,s(1,1:t),'r'); legend('true a1','estimated a1');
subplot(312);
plot(i,a2,'b',i,s(2,1:t),'r'); legend('true a2','estimated a2');
subplot(313);
plot(i,b1,'b',i,s(3,1:t),'r'); legend('true b1','estimated b1');
figure(2)
subplot(311);
plot(i,b2,'b',i,s(4,1:t),'r'); legend('true b2','estimated b2');
subplot(312);
plot(i,d1,'b',i,s(5,1:t),'r'); legend('true d1','estimated d1');
subplot(313);
plot(i,d2,'b',i,s(6,1:t),'r'); legend('true d2','estimated d2');
% relative estimation error of each parameter at every step
e1=zeros(1,t);e2=zeros(1,t);e3=zeros(1,t);e4=zeros(1,t);e5=zeros(1,t);e6=zeros(1,t);
for k=1:t
    e1(k)=norm(s(1,k)+1.5,2)/norm(-1.5,2);
    e2(k)=norm(s(2,k)-0.7,2)/norm(0.7,2);
    e3(k)=norm(s(3,k)-1,2)/norm(1,2);
    e4(k)=norm(s(4,k)-0.5,2)/norm(0.5,2);
    e5(k)=norm(s(5,k)+1,2)/norm(-1,2);
    e6(k)=norm(s(6,k)-0.2,2)/norm(0.2,2);