MATLAB Neural Network Functions

1. First, open the neural network fitting GUI (nnstart).

[Figure 1]

2. Click Fitting app to enter the main window.

[Figure 2]

3. Create the network and acquire the data.

[Figure 3]

4. Import the data (here, one of MATLAB's own example data sets is used).
The samples can then be divided into training, validation, and test sets.

[Figure 4]
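
If you prefer the command line, the same data can be loaded without the GUI. A minimal sketch, assuming the abalone example data set used in the screenshots and its toolbox loader abalone_dataset:

% Load MATLAB's built-in abalone example data set (assumed here to be the
% data selected in the GUI); x is an 8-by-N input matrix, t a 1-by-N target row.
[x, t] = abalone_dataset;
whos x t   % check the sizes before choosing the 70/15/15 split in the app
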
5. Configure the network structure.
The window is divided into three parts: Hidden Layer, Recommendation, and Neural Network.

[Figure 5]
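
The same structure can be set up programmatically. A small sketch, using the app's default of one hidden layer with 10 neurons (the value shown in the Hidden Layer panel):

% Fitting network with one hidden layer of 10 neurons.
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize);
view(net)   % shows the diagram from the Neural Network panel
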
6. At this step you can train the network on the data.
[Figure 6]
7. Continue training the neural network.

[Figure 7]
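
Steps 6 and 7 correspond to calling train; pressing Retrain in the app is roughly equivalent to re-initializing the weights and training again. A sketch, assuming the net, x and t from the previous steps:

% Train the network; tr records the data division and performance history.
[net, tr] = train(net, x, t);

% Retrain: re-initialize weights and biases, then train again.
net = init(net);
[net, tr] = train(net, x, t);
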
8. Perform regression analysis: examine the R values and how the outputs track the expected (target) values.

[Figure 8]
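
The R values in the regression plots can also be computed directly. A sketch using the trained network from above:

% Network outputs for all samples.
y = net(x);

% Overall correlation coefficient between targets and outputs
% (the R value of the "All" regression plot).
R = regression(t, y)

% Reproduce the regression plot itself.
figure, plotregression(t, y)
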
9. Next, perform error analysis to obtain the error histogram.

[Figure 9]
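
The histogram in this step is the ploterrhist plot of the output errors. A minimal sketch:

y = net(x);              % network outputs
e = t - y;               % errors (same as gsubtract(t, y) for ordinary matrices)
figure, ploterrhist(e)   % error histogram, as shown in the app
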
10. Then evaluate the network's performance.

[Figure 10]
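
Network evaluation amounts to checking the mean squared error on each data division. A sketch that recomputes these from the training record tr, using subset indices instead of the mask approach taken by the advanced script below:

% Overall MSE of the trained network.
y = net(x);
overallMSE = perform(net, t, y)

% MSE per data division, using the sample indices stored in tr.
trainMSE = perform(net, t(:, tr.trainInd), y(:, tr.trainInd))
valMSE   = perform(net, t(:, tr.valInd),   y(:, tr.valInd))
testMSE  = perform(net, t(:, tr.testInd),  y(:, tr.testInd))
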
11. Finally, save the results.

[Figure 11]
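
Besides exporting the scripts below, the trained network itself can be saved to a MAT-file and reloaded later. A sketch; the file name fitnet_abalone.mat is only an example:

% Save the trained network and training record for later use.
save('fitnet_abalone.mat', 'net', 'tr');

% Reload and apply the network without retraining.
S = load('fitnet_abalone.mat');
net2 = S.net;
yNew = net2(x);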

The saved script files are as follows:
1. Simple Script

% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created 28-Mar-2022 16:05:51
%
% This script assumes these variables are defined:
%
%   abaloneInputs - input data.
%   abaloneTargets - target data.

x = abaloneInputs;
t = abaloneTargets;

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. Suitable in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt backpropagation.

% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize,trainFcn);

% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;

% Train the Network
[net,tr] = train(net,x,t);

% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, ploterrhist(e)
%figure, plotregression(t,y)
%figure, plotfit(net,x,t)


2. Advanced Script

% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created 28-Mar-2022 16:18:23
%
% This script assumes these variables are defined:
%
%   abaloneInputs - input data.
%   abaloneTargets - target data.

x = abaloneInputs;
t = abaloneTargets;

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. Suitable in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt backpropagation.

% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize,trainFcn);

% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows','mapminmax'};
net.output.processFcns = {'removeconstantrows','mapminmax'};

% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivision
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'sample';  % Divide up every sample
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;

% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean Squared Error

% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
    'plotregression', 'plotfit'};

% Train the Network
[net,tr] = train(net,x,t);

% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)

% Recalculate Training, Validation and Test Performance
trainTargets = t .* tr.trainMask{1};
valTargets = t .* tr.valMask{1};
testTargets = t .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, ploterrhist(e)
%figure, plotregression(t,y)
%figure, plotfit(net,x,t)

% Deployment
% Change the (false) values to (true) to enable the following code blocks.
% See the help for each generation function for more information.
if (false)
    % Generate MATLAB function for neural network for application
    % deployment in MATLAB scripts or with MATLAB Compiler and Builder
    % tools, or simply to examine the calculations your trained neural
    % network performs.
    genFunction(net,'myNeuralNetworkFunction');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a matrix-only MATLAB function for neural network code
    % generation with MATLAB Coder tools.
    genFunction(net,'myNeuralNetworkFunction','MatrixOnly','yes');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a Simulink diagram for simulation or deployment with
    % Simulink Coder tools.
    gensim(net);
end
