Time Series Prediction with Neural Networks

Original source: "Time Series Prediction with Neural Networks" (神经网络来进行时间序列预测), by 争气的败家子.
1. Responding to any key press by redrawing a plot
set(gcf,'KeyPressFcn','fplot(''sin'',[0 6])');  % pressing any key in the current figure replots sin over [0, 6]
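The callback can also be supplied as a function handle, which receives the figure and the key-press event; a small sketch of that variant (not in the original post):

% Sketch: function-handle form of the same key-press callback (illustration only).
figure;
set(gcf,'KeyPressFcn',@(src,evt) fplot('sin',[0 6]));  % any key replots sin on [0, 6]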
2. A program that uses a neural network for time-series prediction
Problem:
Given the time series u=[17.6 17.7 17.7 17.7 17.8 17.8 17.9 18.0 18.1 18.2 18.4 18.6 ...
  18.7 18.9 19.1 19.3 19.6 19.9 20.2 20.6 21.0 21.5 22.0 22.7];
we want a neural network to predict the next few values.
Idea: use every 4 consecutive values as one input vector and the value immediately after them as the target. Taking the first 23 values as the series (with 22.7 as the target of the last window) gives 20 training samples.
Program:
clear
u=[17.6 17.7 17.7 17.7 17.8 17.8 17.9 18.0 18.1 18.2 18.4 18.6 ...
  18.7 18.9 19.1 19.3 19.6 19.9 20.2 20.6 21.0 21.5 22.0];
%the expected value following the last window is 22.7
n=length(u);
for i=1:(n-3)
  p(i,:)=u(i:i+3);   % row i of p: the four consecutive values u(i..i+3)
  if i~=n-3
    t(i)=u(i+4);     % target: the value that immediately follows the window
  else
    t(i)=22.7;       % the last window's target is the known next value
  end
end
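The same input windows and targets can also be built without the loop; a sketch using hankel (not in the original program; X corresponds to p' and Y to t above):

% Sketch: loop-free construction of the sliding windows (illustration only).
full=[u 22.7];                      % append the known next value
X=hankel(full(1:4),full(4:end-1));  % 4x20 matrix, column j = full(j:j+3), equal to p'
Y=full(5:end);                      % 1x20 targets, equal to t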

net=newff(minmax(p'),[n-3 20 1],{'logsig','logsig','purelin'},'trainlm'); % 4 inputs -> hidden layers of n-3=20 and 20 logsig neurons, 1 purelin output, trained with Levenberg-Marquardt
net.trainParam.show=20;        % display training progress every 20 iterations
net.trainParam.epochs=300;     % maximum number of training epochs
net.trainParam.goal=1e-3;      % performance goal
net.trainParam.time=inf;       % no wall-clock time limit
net.trainParam.mem_reduc=1;    % trainlm memory/speed trade-off
% The following settings from the original code belong to other training
% functions (lr/mc to the gradient-descent trainers, delt_* to trainrp) and
% are not recognized by trainlm, so they are left commented out:
% net.trainParam.lr=0.01;
% net.trainParam.mc=0.99;
% net.trainParam.delt_inc=1.2;
% net.trainParam.delt_dec=0.3;
% net.trainParam.deltamax=50;
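trainlm controls its step size through the mu parameters rather than a learning rate or momentum; a sketch of the corresponding settings (the values below are the usual defaults, shown purely for illustration):

net.trainParam.mu=0.001;     % initial Levenberg-Marquardt mu
net.trainParam.mu_dec=0.1;   % factor applied to mu after a successful step
net.trainParam.mu_inc=10;    % factor applied to mu after a failed step
net.trainParam.mu_max=1e10;  % training stops if mu exceeds this value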
net.performFcn='sse';          % use sum-squared error as the performance function
% The original code assigned 'trainlm' to several initFcn fields, but 'trainlm'
% is a training function, not an initialization function. newff already sets
% each layer's initFcn to 'initnw' (Nguyen-Widrow), so no extra initFcn
% settings are needed here.

net=train(net,p',t);           % train on the 4x20 input matrix and 1x20 targets
sim(net,[20.6 21.0 21.5 22]')  % predict the value following the last window

Result:

22.7000
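To forecast several values beyond the end of the series, as the problem statement asks, the prediction can be fed back in as the newest input. A minimal sketch of this rolling forecast (the horizon of 3 steps is an assumption, not part of the original post):

% Sketch: iterative multi-step forecasting with the trained network (illustration only).
window=[21.0 21.5 22.0 22.7]';            % last four known values of the series
nAhead=3;                                 % assumed number of future points to predict
forecast=zeros(1,nAhead);
for k=1:nAhead
  forecast(k)=sim(net,window);            % predict the next value
  window=[window(2:end); forecast(k)];    % slide the window forward by one step
end
forecast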

3. BP neural network example: a data-fusion program
A neural-network data-fusion example. The raw input signals are:

P1=[0.000,1.002,2.004,3.006,4.004,5.000,3.996,2.994,1.996,0.998,-0.001,1.001,2.003,3.005,4.003,4.999, ...
    3.995,2.993,1.995,0.9970,-0.002,1.000,2.002,3.004,4.002,4.998,3.994,2.992,1.994,0.996,-0.003];
P2=[0.000,0.101,0.202,0.301,0.401,0.500,0.399,0.299,0.198,0.099,-0.001,0.102,0.203,0.300,0.402,0.499, ...
    0.400,0.298,0.199,0.100,0.000,0.100,0.200,0.300,0.400,0.500,0.400,0.300,0.200,0.100,0.000];
P3=[0.000,0.501,1.002,1.503,2.001,2.500,1.999,1.497,0.998,0.499,-0.001,0.500,1.001,1.502,2.000,2.499, ...
    1.998,1.498,0.997,0.498,0.001,0.500,1.002,1.500,2.000,2.500,1.999,1.498,1.000,0.498,0.000];
The desired (fused) output samples for training are:
T=[0 1 2 3 4 5 4 3 2 1 0 1 2 3 4 5 4 3 2 1 0 1 2 3 4 5 4 3 2 1 0];
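The program below works with a pre-normalized 20-sample set P0 and targets T; the post does not show the preprocessing step. As a rough illustration of how raw signals are commonly rescaled before training (an assumption, not the author's actual preprocessing):

% Sketch (assumed preprocessing, not shown in the original post):
% mapminmax rescales each row of a matrix into a chosen range, here [0, 1].
Praw=[P1; P2; P3];             % one raw signal per row, 3 x 31
[Pn,ps]=mapminmax(Praw,0,1);   % Pn holds the normalized signals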

Program:

%normalized input samples (20 samples, 3 features per sample, one sample per row)
P0=[0.1449, 0.03807, 0.4615;
    0.1457, 0.03817, 0.4600;
    0.2115, 0.04599, 0.4459;
    0.3555, 0.05962, 0.4517;
    0.4316, 0.06570, 0.4615;
    0.4784, 0.06917, 0.4449;
    0.6349, 0.07968, 0.4517;
    0.7907, 0.08892, 0.4478;
    0.6529, 0.08080, 0.4658;
    0.8178, 0.09043, 0.4658;
    0.6331, 0.0796,  0.4663;
    0.3035, 0.0551,  0.4615;
    0.4749, 0.0689,  0.4629;
    0.3946, 0.0628,  0.4595;
    0.6624, 0.0814,  0.4610;
    0.8303, 0.0911,  0.4620;
    0.3026, 0.0550,  0.4488;
    0.3492, 0.0591,  0.4639;
    0.5277, 0.0726,  0.4512;
    0.1469, 0.0383,  0.4678];
P=P0';   % transpose so that each column is one sample
%T=[0.4925 0.4921 0.4897 0.5153 0.5300 0.5290 0.5566 0.5741 0.5661 0.5894 0.5656 0.5129 0.5391 0.5248 0.5642 0.5927 0.5044 0.5229 0.5372 0.4973];
T=[0.5144,0.5134,0.5148,0.5376,0.5419,0.5504,0.5703,0.5860,0.5732,0.5941,0.5747,0.5342,0.5500,0.5433,0.5742,0.5951,0.5305,0.5391,0.5560,0.5144]; % training targets, one per sample in P0
%T_test=[0.5253 0.5875 0.5129 0.5419 0.5248 0.5357 0.5186 0.5737 0.5490 0.4949 0.5376 0.5609];
T_test=[0.5405 0.5922 0.5300 0.5519 0.5405 0.5357 0.5808 0.5656 0.5153 0.5552 0.5666]; % targets for the 11 test samples in input_test below
rand('state',0);   % fix the random seed so weight initialization is reproducible
net=newff(minmax(P),[9,1],{'tansig','purelin'},'trainlm'); % create the BP network: 9 tansig hidden neurons, 1 purelin output
net.trainParam.epochs=1000;  % maximum number of training epochs
net.trainParam.goal=0.00001; % training performance goal
net.trainParam.lr=0.001;     % learning rate (not actually used by trainlm)
net=train(net,P,T);          % train the network
input_test=[0.3672, 0.0606, 0.4658;   % test samples (11 samples, 3 features each)
            0.8172, 0.0904, 0.4600;
            0.3102, 0.0557, 0.4576;
            0.4865, 0.0697, 0.4658;
            0.3901, 0.0625, 0.4600;
            0.3730, 0.0611, 0.4561;
            0.6911, 0.0831, 0.4683;
            0.6072, 0.0779, 0.4537;
            0.2094, 0.0458, 0.4537;
            0.5098, 0.0714, 0.4537;
            0.6262, 0.0791, 0.4629];
Out=sim(net,P);              % network output on the training inputs
Out=Out'                     % display the fitted outputs as a column vector
error=Out-T'                 % fitting error on the training samples (note: shadows MATLAB's error function)
res=norm(error);             % overall fitting error (Euclidean norm)
output=sim(net,input_test'); % network output on the test samples
errorout=output-T_test;      % error on the test samples
a=errorout'                  % display the test error as a column vector
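To judge the fusion result visually, the fitted and test outputs can be plotted against their targets; a short sketch (not part of the original program):

% Sketch: compare network outputs with the targets (illustration only).
figure;
subplot(2,1,1); plot(T,'b-o'); hold on; plot(Out','r-*');
title('training samples'); legend('target T','network output');
subplot(2,1,2); plot(T_test,'b-o'); hold on; plot(output,'r-*');
title('test samples'); legend('target T\_test','network output');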
