A Supplement to the Torch7 Tutorial Series: Using the nngraph Package
http://blog.csdn.net/hungryof/article/details/72902169
faster-rcnn.torch
https://github.com/andreaskoepf/faster-rcnn.torch/blob/master/models/model_utilities.lua
https://github.com/torch/nngraph/blob/master/README.md
Let's go straight to the code.
Previously, with nn.Sequential, building a network without branches looked like this:
require 'nn'
require 'loadcaffe'
-- load the pretrained VGG-19 network (prototxt + caffemodel) as plain nn modules
local cnn = loadcaffe.load('VGG_ILSVRC_19_layers_deploy_5.prototxt','VGG_ILSVRC_19_layers_5.caffemodel','nn'):float()
net = nn.Sequential()
-- copy the first three layers into a new, branch-free container
for i = 1, 3 do
  net:add(cnn:get(i))
end
print(net:get(1).weight)
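For reference, a minimal usage sketch (assuming the usual VGG-19 layer order, so the first three layers are conv1_1, relu1_1, conv1_2, and a hypothetical 3x224x224 input):
-- hypothetical input; cast to float to match the network, which was converted with :float() above
local x = torch.randn(3, 224, 224):float()
local y = net:forward(x)
print(y:size())   -- 64x224x224 if the first three layers are conv1_1 -> relu1_1 -> conv1_2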
To build a branching structure instead, nngraph turns each module into a graph node:
require 'nn'
require 'nngraph'
require 'loadcaffe'
local cnn = loadcaffe.load('VGG_ILSVRC_19_layers_deploy_5.prototxt','VGG_ILSVRC_19_layers_5.caffemodel','nn'):float()
-- an Identity node serves as the graph input
local h1 = nn.Identity()()
-- shared trunk: the first two VGG layers
net_same = h1 - cnn:get(1) - cnn:get(2)
-- two branches off the shared trunk; the second branch clones its modules so that
-- each branch owns its own instances (reusing the same module object in two nodes
-- would overwrite its output buffer during forward)
net_1 = net_same - cnn:get(3) - cnn:get(4)
net_2 = net_same - cnn:get(3):clone() - cnn:get(4):clone()
-- one input node, two output nodes
gmod = nn.gModule({h1}, {net_1, net_2})
print(gmod:get(2))
If you want to combine the two outputs, they simply come back together as a table: out = {out[1], out[2]}.
Everything else works as usual; backward, updateGradInput and so on can be called in the normal way.
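As a sketch of how this behaves (assuming the VGG-based gmod built above and a hypothetical 3x224x224 input), forward returns a table with one tensor per output node, and backward takes the input plus a matching table of gradients:
-- hypothetical input; cast to float to match the loaded network
local x = torch.randn(3, 224, 224):float()
local out = gmod:forward(x)        -- out is a table: {out[1], out[2]}
-- backward expects one gradOutput tensor per output node
local gradInput = gmod:backward(x, {out[1]:clone():fill(1), out[2]:clone():fill(1)})
print(gradInput:size())            -- same size as the input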
If the network has two inputs, it looks like this:
require 'nngraph'
-- two input nodes, declared with the unary minus shorthand
h1 = - nn.Linear(20,20)
h2 = - nn.Linear(10,10)
-- one branch per input
hh1 = h1 - nn.Tanh() - nn.Linear(20,1)
hh2 = h2 - nn.Tanh() - nn.Linear(10,1)
-- merge the two branches by element-wise addition
madd = {hh1,hh2} - nn.CAddTable()
-- two outputs computed from the merged node
oA = madd - nn.Sigmoid()
oB = madd - nn.Tanh()
-- a gModule with two input nodes and two output nodes
gmod = nn.gModule( {h1,h2}, {oA,oB} )
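A minimal usage sketch, along the lines of the nngraph README linked above: forward takes a table with one tensor per input node, and backward takes that table plus a table of gradients, one per output node:
x1 = torch.rand(20)
x2 = torch.rand(10)
out = gmod:forward({x1, x2})       -- out = {output of oA, output of oB}, each a 1-element tensor
gmod:backward({x1, x2}, {torch.rand(1), torch.rand(1)})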