Notes on MatConvNet (II): vl_simplenn

# Written before
This is the second post in the Notes on MatConvNet series.

  1. Notes on MatConvNet(I) – Overview

Here I will mainly introduce the core of MatConvNet: vl_simplenn. This function plays a central role in both forward propagation and back-propagation.

PS. I have to admit that writing blogs in Chinese would bring much more traffic than writing them in English. Yet if I cared too much about such vain trifles, it would be a sad thing.
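To give a feel for what those two passes look like, here is a minimal sketch in Python, for illustration only: MatConvNet itself is MATLAB, and its real layer structs and `res` fields differ. All names here (`simplenn`, `forward`, `backward`) are my own, not MatConvNet's API. Each layer stores a forward function f and a backward rule that applies dz/dx = f'(x) * dz/dy.

```python
import numpy as np

# Two toy "layers": a ReLU and a scale-by-2, each with its own
# forward map and backward (gradient) rule. Illustrative only.
relu = {
    "forward": lambda x: np.maximum(x, 0.0),
    "backward": lambda x, dzdy: (x > 0) * dzdy,   # f'(x) is 0/1 mask
}
double = {
    "forward": lambda x: 2.0 * x,
    "backward": lambda x, dzdy: 2.0 * dzdy,       # f'(x) = 2
}
layers = [relu, double]

def simplenn(layers, x0):
    # Forward pass: res[i] is the input of layer i; res[-1] is z.
    res = [x0]
    for layer in layers:
        res.append(layer["forward"](res[-1]))
    # Backward pass: start from dz/dz = 1 and apply each layer's
    # backward rule in reverse order.
    dzdx = np.ones_like(res[-1])
    for layer, x in zip(reversed(layers), reversed(res[:-1])):
        dzdx = layer["backward"](x, dzdx)
    return res[-1], dzdx

z, dzdx0 = simplenn(layers, np.array([-1.0, 3.0]))
print(z)      # [0. 6.]
print(dzdx0)  # [0. 2.]
```

The real vl_simplenn does exactly this kind of single forward sweep followed by a single backward sweep, storing intermediate results in a `res` structure along the way.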

Something you should know beforehand

In this post I only cover back-propagation (BP). How do the derivatives propagate backward? I sincerely recommend reading BP Algorithm first; that post is a good introduction to BP. Once you have finished it, the first post of this series is also recommended, since I will repeat the main computation structure illustrated in Notes(I).

Computation Structure

(figure: the computation structure of the network)
I use the following default notation here.

  • y always denotes the output of the current layer. That is, when we are at layer i, y is the output of layer i.
  • z always denotes the output of the whole net, or rather, the output of the final layer n.
  • x denotes the input of the current layer.

To make things easier, MatConvNet treats each simple function as a “layer”. This means that when the input passes through a computation structure (whether a convolution or just a ReLU), the backward pass computes the following:
$$\frac{dz}{dx} = f'(x) \cdot \frac{dz}{dy} \qquad (1)$$
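Equation (1) is easy to check numerically for a concrete layer. The sketch below (Python, purely illustrative; the variable names `dzdy`/`dzdx` mirror the notation above, not any MatConvNet identifier) applies it to a ReLU layer, where f'(x) is 1 for positive inputs and 0 otherwise:

```python
import numpy as np

# A single ReLU "layer": y = f(x) = max(x, 0).
x = np.array([-1.0, 0.5, 2.0])
y = np.maximum(x, 0.0)            # forward pass

# dz/dy: the gradient flowing in from the layers after this one.
dzdy = np.array([0.1, 0.2, 0.3])

# Equation (1): dz/dx = f'(x) * dz/dy.
# For ReLU, f'(x) is 1 where x > 0 and 0 elsewhere.
dzdx = (x > 0).astype(float) * dzdy
print(dzdx)  # [0.  0.2 0.3]
```

Note how the gradient is simply blocked wherever the forward pass clipped the input to zero; every layer in the net applies its own version of this same rule.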
