This network is used to model a sinusoidal function; the implementation is as follows:
% http://blog.csdn.net/superdont  "I think, therefore I am"
clc;
clear;

P  = -1:0.1:1;     % training inputs; they are put through a series of transformations below to illustrate how the ANN behaves
P2 = -1:0.1:1;     % copy of the inputs, used for the forward pass with random weights

% An earlier, noisier set of targets, kept commented out:
% T=[0.8 0.66 0.461 0.1336 ...
%    -0.1 -0.201 -0.434 -0.5 -0.393 -0.1647 0.0988 0.3072 ...
%    0.396 0.449 0.5816 0.6312 0.7183 0.9201 ...
%    0.86 0.77 0.65];

T = [0.5403 0.6216 0.6967 0.7648 0.8253 0.8776 0.9211 0.9553 0.9801 0.9950 1.0000 ...
     0.9950 0.9801 0.9553 0.9211 0.8776 0.8253 0.7648 0.6967 0.6216 0.5403];
% T holds the target values: the cosine of the corresponding points in P (a sinusoidal curve)

% Plot the target points; 'r+' draws them as red plus signs
plot(P,T,'r+');

% Forward pass through a randomly initialised network (before any training)
%%%%% random-weight forward pass: begin
[R,Q]  = size(P);                        % R = 1 input, Q = 21 samples
[S2,Q] = size(T);                        % S2 = 1 output neuron
S1 = 5;                                  % S1 = 5 hidden neurons
W1 = rands(S1,R);   B1 = rands(S1,1);    % random hidden-layer weights and biases in [-1,1]
W2 = rands(S2,S1);  B2 = rands(S2,1);    % random output-layer weights and biases in [-1,1]
b1 = B1*ones(1,21);                      % replicate the bias vectors across the 21 samples
b2 = B2*ones(1,21);
a2 = W2*tansig(W1*P2+b1)+b2;             % hidden layer: hyperbolic tangent sigmoid transfer function
A2 = purelin(a2);                        % output layer: linear transfer function
%%%%% random-weight forward pass: end

hold on
% Plot the output of the untrained (random-weight) network in green
plot(P,A2,'g')
hold off

% disp('Press any key to continue')
% pause

net = newcf(minmax(P),[5,1],{'tansig','purelin'},'traingd');
% Create a two-layer cascade-forward backpropagation network
% newcf(PR,[S1 S2...SNl],{TF1 TF2...TFNl},BTF,BLF,PF) takes:
%   PR  -- R x 2 matrix of min and max values for R input elements
%   Si  -- Size of ith layer, for Nl layers
%   TFi -- Transfer function of ith layer, default = 'tansig'
%   BTF -- Backpropagation network training function, default = 'traingd'
%   BLF -- Backpropagation weight/bias learning function, default = 'learngdm'
%   PF  -- Performance function, default = 'mse'

net.trainParam.epochs = 7000;         % maximum number of training epochs
net.trainParam.goal = 9.5238e-005;    % mse goal (corresponds to an sse of about 0.002 over the 21 samples)
% With a goal of 1e-1, 1e-2 or 1e-3 training is fast but the error is large (1e-1 the worst);
% 1e-4 is a reasonable compromise; 1e-5 and smaller is almost unbearably slow, although the fit is more accurate.
net.trainParam.lr = 0.15;             % learning rate: too large is unstable, too small makes training slow
% lr = 0.2:  Elapsed time is 112.687000 seconds.
% lr = 0.15: Elapsed time is 117.610000 seconds.

% Start timing
tic;
[net,tr] = train(net,P,T);   % train the network
% net -- trained network
% tr  -- training record (epoch and performance)

Y = sim(net,P);              % network output on the training inputs
plot(P,Y,'r-')
hold on
plot(P,T,'r+');
hold off
toc
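As a quick check of how the trained network interpolates between the 21 training points, here is a minimal sketch that evaluates it on a denser grid and overlays the result on the targets. It assumes the variables net, P and T from the script above are still in the workspace; Pfine and Yfine are names introduced here purely for illustration.

% Sketch: evaluate the trained network between the training points
% (assumes net, P and T from the script above exist in the workspace)
Pfine = -1:0.01:1;         % denser grid than the 21 training inputs
Yfine = sim(net,Pfine);    % trained network output on the dense grid

figure;
plot(P,T,'r+');            % training targets, as in the script above
hold on
plot(Pfine,Yfine,'b-');    % network output between the training points
hold off
legend('training targets','network output');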