I. Introduction
Deep learning is widely used for image recognition. These notes cover using pretrained networks in MATLAB and adapting them to new problems via transfer learning.
II. Using Pretrained Networks
1. Load and display an image
img1 = imread('file01.jpg'); % read the image file into an array
imshow(img1) % display the image
2. Predict
deepnet = alexnet; % load the pretrained AlexNet network
pred1 = classify(deepnet, img1); % predict the class of img1
3. Get other pretrained networks
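Other pretrained networks can be loaded the same way, provided the corresponding support packages are installed (a sketch; the exact set of available networks depends on your MATLAB release):
net = googlenet; % GoogLeNet
net = vgg16;     % VGG-16
net = resnet18;  % ResNet-18
insz = net.Layers(1).InputSize % the required input size differs between networks (e.g. 224-by-224-by-3)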
4. Examine network layers
deepnet = alexnet; % load the pretrained network
ly = deepnet.Layers; % array of network layers
inlayer = ly(1); % input layer
insz = inlayer.InputSize; % image size the input layer expects
outlayer = ly(end); % output layer
categorynames = outlayer.Classes; % classes the output layer can predict
5. Investigating predictions
The classify function returns the predicted class of the input image, but is there a way to know how "confident" the network is in that classification? Taking this confidence into account can be important when deciding what to do with the output.
To classify an input into one of n classes, a neural network has an output layer of n neurons, one for each class. Passing an input through the network computes a numeric value for each of these neurons; the values represent the network's predicted probability that the input belongs to each class.
img = imread('file01.jpg');
imshow(img)
net = alexnet;
categorynames = net.Layers(end).ClassNames;
[pred, scores] = classify(net, img); % get the predicted class and the score for every class
bar(scores); %Display scores
highscores = scores > 0.01; %Threshold scores
bar(scores(highscores)); %Display thresholded scores
xticklabels(categorynames(highscores)); %Add tick labels
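Beyond thresholding, the scores can also be sorted to see the network's top few guesses; a small extra sketch, assuming maxk is available (R2017b or later):
[topScores, idx] = maxk(scores, 3); % three highest scores
topClasses = categorynames(idx)     % corresponding class names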
III. Managing Collections of Data
1. Creating a datastore
ls *.jpg
net = alexnet;
imds = imageDatastore('file*.jpg'); % create a datastore from the matching files
fname = imds.Files; % list the file names in the datastore
img = readimage(imds, 7); % read the 7th image
preds = classify(net, imds); % classify every image in the datastore
2. Preparing Images to Use as Input: Adjust input images
Process Images for Classification
img = imread('file01.jpg');
imshow(img);
sz = size(img); % size of the image array
net = alexnet;
insz = net.Layers(1).InputSize; % image size the input layer expects
img = imresize(img, [227, 227]); % resize the image to the network's input size
imshow(img);
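Since insz was already read from the network's input layer, the target size does not have to be hard-coded; a small variation, not part of the course task:
img = imresize(img, insz(1:2)); % first two elements of InputSize are height and width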
3. Processing Images in a Datastore: (2/3) Creating an augmented image datastore
Resize Images in a Datastore
ls *.jpg
net = alexnet;
imds = imageDatastore('*.jpg');
auds = augmentedImageDatastore([227,227], imds); %Create augmentedImageDatastore
preds = classify(net, auds)
Processing Images in a Datastore: (3/3) Color preprocessing with augmented image datastores
An augmentedImageDatastore can also perform color preprocessing, for example converting grayscale images to RGB before classification.
ls *.jpg
net = alexnet;
imds = imageDatastore('file*.jpg');
montage(imds); %Display images in imds
auds = augmentedImageDatastore([227,227], imds, 'ColorPreprocessing', 'gray2rgb') %Create augmentedImageDatastore
preds = classify(net, auds)
Create a Datastore Using Subfolders
net = alexnet;
flwrds = imageDatastore('Flowers', 'IncludeSubfolders',true);
preds = classify(net,flwrds)
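If the images in the subfolders are not already the size the network expects, the augmentedImageDatastore approach from the previous section can be combined with the subfolder datastore (a sketch, assuming AlexNet's 227-by-227 input):
auds = augmentedImageDatastore([227 227], flwrds); % resize on the fly
preds = classify(net, auds)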
IV. Transfer Learning
1. Why transfer learning
(1) An existing pretrained network may not solve your own problem effectively as-is.
(2) Training a completely new network yourself (a new architecture with random weights) requires knowledge and experience of network architectures, a large amount of training data, and a lot of computation time.
2. Components Needed for Transfer Learning: (1/2) The components of transfer learning
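The three ingredients are the layers of an existing network, labeled training images, and training algorithm options; a minimal sketch of how they fit together (the concrete versions of each step are worked through in the rest of this chapter):
net = alexnet;                  % pretrained network to start from
layers = net.Layers;            % layer array that will be modified
imds = imageDatastore('Flowers','IncludeSubfolders',true,'LabelSource','foldernames'); % labeled training images
opts = trainingOptions('sgdm'); % training algorithm options
% newnet = trainNetwork(imds, layers, opts); % training combines all three (after the layers are modified)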
3. Preparing Training Data: (1/3) Labeling images
Label Images in a Datastore
load pathToImages
flwrds = imageDatastore(pathToImages,'IncludeSubfolders',true); %This code creates a datastore of 960 flower images.
flowernames = flwrds.Labels
flwrds = imageDatastore(pathToImages,'IncludeSubfolders',true,'LabelSource','foldernames') %Create datastore with labels
flowernames = flwrds.Labels %Extract new labels
Preparing Training Data: (2/3) Split data for training and testing
Split Data for Training and Testing
This code creates a datastore of 960 flower images.
load pathToImages
flwrds = imageDatastore(pathToImages,'IncludeSubfolders',true,'LabelSource','foldernames')
Task 1
Split datastore
[flwrTrain, flwrTest] = splitEachLabel(flwrds, 0.6)
Task 2
Split datastore randomly
[flwrTrain, flwrTest] = splitEachLabel(flwrds, 0.8, 'randomized')
Task 3
Split datastore by number of images
[flwrTrain, flwrTest] = splitEachLabel(flwrds,50)
Preparing Training Data: (3/3) Augmented training data
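As with the test images earlier, the training images generally need resizing to the network's input size; a sketch using the datastores from the split tasks above (variable names flwrTrain and flwrTest assumed from those tasks):
trainds = augmentedImageDatastore([227 227], flwrTrain); % resized training images, labels are kept
testds = augmentedImageDatastore([227 227], flwrTest);   % resized test images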
4. Fine-tuning approach
(1)Recall that a feed-forward network is represented in MATLAB as an array of layers. This makes it easy to index into the layers of a network and change them.
(2)To modify a preexisting network, you create a new layer
(3)then index into the layer array that represents the network and overwrite the chosen layer with the newly created layer.
(4)As with any indexed assignment in MATLAB, you can combine these steps into one line, as in the sketch below.
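For example, creating the replacement layer and overwriting the old one can be done in a single indexed assignment (a sketch assuming AlexNet's 25-layer array and a 12-class problem, as in the task that follows):
layers(23) = fullyConnectedLayer(12); % create the new layer and overwrite layer 23 in one step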
Modifying Network Layers: (2/2) Modify layers of a pretrained network
Modify Network Layers
This code imports AlexNet and extracts its layers.
anet = alexnet;
layers = anet.Layers
Task 1
Create new layer
fc = fullyConnectedLayer(12)
Task 2
Replace 23rd layer
layers(23) = fc
Task 3
Replace last layer
layers(end) = classificationLayer
5. Setting Training Options
Set Training Options
Task 1
Set default options
opts = trainingOptions('sgdm');
Task 2
Set initial learning rate
opts = trainingOptions('sgdm','InitialLearnRate',0.001);
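trainingOptions accepts many more name-value settings than the course tasks use; the values below are purely illustrative:
opts = trainingOptions('sgdm', ...
    'InitialLearnRate', 0.001, ... % smaller learning rate for fine-tuning
    'MaxEpochs', 20, ...           % passes through the training data
    'MiniBatchSize', 64);          % images processed per iteration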
6. Training the Network: (4/4) Summary example
Transfer Learning Example Script
The code below implements transfer learning for the flower species example in this chapter. It is available as the script trainflowers.mlx in the course example files. You can download the course example files from the help menu in the top-right corner. You can find more information on this dataset at the 17 Category Flower Dataset page from the University of Oxford.
Note that this example can take some time to run if you run it on a computer that does not have a supported GPU.
Get training images
flower_ds = imageDatastore('Flowers','IncludeSubfolders',true,'LabelSource','foldernames');
[trainImgs,testImgs] = splitEachLabel(flower_ds,0.6);
numClasses = numel(categories(flower_ds.Labels));
Create a network by modifying AlexNet
net = alexnet;
layers = net.Layers;
layers(end-2) = fullyConnectedLayer(numClasses);
layers(end) = classificationLayer;
Set training algorithm options
options = trainingOptions('sgdm','InitialLearnRate', 0.001);
Perform training
[flowernet,info] = trainNetwork(trainImgs, layers, options);
Use trained network to classify test images
testpreds = classify(flowernet,testImgs);
7. Evaluating Performance: (1/3) Evaluating training and test performance
Evaluate Performance
This code loads the training information of flowernet.
load pathToImages
load trainedFlowerNetwork flowernet info
Task 1
Plot training loss
plot(info.TrainingLoss)
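The info structure returned by trainNetwork also records the training accuracy at each iteration, so it can be plotted the same way (an extra sketch, not one of the course tasks):
plot(info.TrainingAccuracy)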
This code creates a datastore of the flower images.
dsflowers = imageDatastore(pathToImages,'IncludeSubfolders',true,'LabelSource','foldernames');
[trainImgs,testImgs] = splitEachLabel(dsflowers,0.98);
Task 2
Classify images
flwrPreds = classify(flowernet,testImgs)
Evaluating Performance: (2/3) Investigating test performance
Investigate test performance
This code sets up the Workspace for this activity.
load pathToImages.mat
pathToImages
flwrds = imageDatastore(pathToImages,'IncludeSubfolders',true,'LabelSource','foldernames');
[trainImgs,testImgs] = splitEachLabel(flwrds,0.98);
load trainedFlowerNetwork flwrPreds
Task 1
Extract labels
flwrActual = testImgs.Labels
Task 2
Count correct
numCorrect = nnz(flwrPreds == flwrActual)
Task 3
Calculate fraction correct
fracCorrect = numCorrect/numel(flwrPreds)
Task 4
Display confusion matrix
confusionchart(testImgs.Labels,flwrPreds)
Evaluating Performance: (3/3) Improving performance
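The usual levers for improving performance are the ones already covered in this chapter: the training data (more images, augmented datastores) and the training options. A rough sketch of a retraining pass with adjusted, illustrative settings (assuming trainImgs and the modified layers are still in the workspace):
opts = trainingOptions('sgdm','InitialLearnRate',0.0001,'MaxEpochs',30);
[flowernet, info] = trainNetwork(trainImgs, layers, opts);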
V. Transfer Learning Summary
Transfer Learning Function Summary
Create a network
alexnet: Load the pretrained network "AlexNet"
supported networks: View the list of available pretrained networks
fullyConnectedLayer: Create a new fully connected network layer
classificationLayer: Create a new output layer for a classification network
Get training images
imageDatastore: Create a datastore referencing a collection of image files
augmentedImageDatastore: Preprocess a collection of image files
splitEachLabel: Divide a datastore into multiple datastores
Set training algorithm options
trainingOptions: Create a variable containing training algorithm options
Perform training
trainNetwork: Perform training
Use trained network to perform classifications
classify: Obtain a trained network's classifications of input images
Evaluate trained network
nnz: Count the non-zero elements in an array
confusionchart: Calculate and display a confusion matrix
heatmap: Visualize a confusion matrix as a heatmap