Siamese Networks (1): Types of Siamese Networks

Siamese Networks

This article is based on https://blog.csdn.net/qq_35826213/article/details/86313469

There are two kinds of Siamese network: one whose two branches do not share parameters, and one whose branches share parameters.

1. Siamese network without shared parameters

from keras.layers import Conv2D, MaxPool2D, Activation, Dense, concatenate, Flatten
from keras.layers import Input
from keras.models import Model

def FeatureNetwork():
    """Feature-extraction network."""
    inp = Input(shape=(28, 28, 1), name="FeatureNet_ImageInput")
    models = Conv2D(filters=24, kernel_size=(3, 3), strides=1, padding='same')(inp)
    models = Activation('relu')(models)
    models = MaxPool2D(pool_size=(3, 3))(models)

    models = Conv2D(filters=64, kernel_size=(3, 3), strides=1, padding='same')(models)
    models = Activation('relu')(models)

    models = Conv2D(filters=96, kernel_size=(3, 3), strides=1, padding='valid')(models)
    models = Activation('relu')(models)

    models = Conv2D(filters=96, kernel_size=(3, 3), strides=1, padding='valid')(models)
    models = Activation('relu')(models)
    models = Flatten()(models)
    models = Dense(512)(models)
    models = Activation('relu')(models)
    model = Model(inputs=inp, outputs=models)
    return model

# Siamese network WITHOUT shared parameters: two separate copies of the
# feature-extraction network are built, one per input.
def ClassiFilterNet():
    # First feature-extraction branch
    input1 = FeatureNetwork()
    # Second feature-extraction branch
    input2 = FeatureNetwork()
    # Rename the layers of the second branch to avoid duplicate layer names
    for layer in input2.layers:
        layer.name = layer.name + "_2"
    inp1 = input1.input
    inp2 = input2.input

    # Fuse the two branches by concatenating their feature vectors
    merge_layers = concatenate([input1.output, input2.output])
    fc1 = Dense(1024, activation='relu')(merge_layers)
    fc2 = Dense(1024, activation='relu')(fc1)
    fc3 = Dense(2, activation='softmax')(fc2)

    class_models = Model(inputs=[inp1, inp2], outputs=[fc3])
    return class_models

net = ClassiFilterNet()
net.summary()

The code above builds the non-shared-parameter model: each input goes through its own copy of the feature-extraction network with its own weights, and the two branches are then joined into a single network by concatenation followed by fully connected layers. The network diagram is shown below.

[Figure: architecture of the Siamese network without shared parameters]
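As a quick illustration of how such a two-input model could be used, here is a minimal sketch that compiles the network and fits it on random dummy data; the optimizer, loss, and data below are assumptions for illustration only, not part of the original post.

import numpy as np

# Dummy image pairs and one-hot "same / different" labels (illustration only)
x1 = np.random.rand(32, 28, 28, 1)
x2 = np.random.rand(32, 28, 28, 1)
y = np.eye(2)[np.random.randint(0, 2, size=32)]

net = ClassiFilterNet()
net.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
# The model expects a list of two inputs, one per branch
net.fit([x1, x2], y, batch_size=8, epochs=1)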

Parameter count of the non-shared-parameter network:

Layer (type)         Output Shape      Param #     Connected to
dense_3 (Dense)      (None, 1024)      1049600     concatenate_1[0][0]
dense_4 (Dense)      (None, 1024)      1049600     dense_3[0][0]
dense_5 (Dense)      (None, 2)         2050        dense_4[0][0]

Total params: 4,864,994
Trainable params: 4,864,994

If the final fully connected layers are removed, the remaining network (the two feature-extraction branches) has 2,763,744 parameters.
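This figure can be checked by counting the parameters of a single feature-extraction branch with Keras's count_params() and doubling it, since the two branches are independent copies (a quick sketch):

branch = FeatureNetwork()
print(branch.count_params())      # 1381872 parameters in one branch
print(2 * branch.count_params())  # 2763744 for the two independent branches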

2. Siamese network with shared parameters

The code is as follows:

#!/usr/bin/env python
# -*- coding: utf-8 -*-
from keras.layers import Conv2D, MaxPool2D, Activation, Dense, concatenate, Flatten
from keras.layers import Input
from keras.models import Model

# Siamese network WITH shared parameters: both inputs pass through the same
# feature-extraction model, so its weights are reused across the two branches.
def ClassiFilterNet():

    inp = Input(shape=(28, 28, 1), name="FeatureNet_ImageInput")
    models = Conv2D(filters=24, kernel_size=(3, 3), strides=1, padding='same')(inp)
    models = Activation('relu')(models)
    models = MaxPool2D(pool_size=(3, 3))(models)

    models = Conv2D(filters=64, kernel_size=(3, 3), strides=1, padding='same')(models)
    models = Activation('relu')(models)

    models = Conv2D(filters=96, kernel_size=(3, 3), strides=1, padding='valid')(models)
    models = Activation('relu')(models)

    models = Conv2D(filters=96, kernel_size=(3, 3), strides=1, padding='valid')(models)
    models = Activation('relu')(models)
    models = Flatten()(models)
    models = Dense(512)(models)
    models = Activation('relu')(models)
    # Single feature-extraction model, shared by both inputs
    model = Model(inputs=inp, outputs=models)

    inp1 = Input(shape=(28, 28, 1))
    inp2 = Input(shape=(28, 28, 1))
    # Calling the same model on both inputs reuses (shares) its weights
    model1 = model(inp1)
    model2 = model(inp2)
    merged = concatenate([model1, model2])

    fc1 = Dense(1024, activation='relu')(merged)
    fc2 = Dense(1024, activation='relu')(fc1)
    fc3 = Dense(2, activation='softmax')(fc2)

    class_models = Model(inputs=[inp1, inp2], outputs=[fc3])
    return class_models

net = ClassiFilterNet()
net.summary()

In the shared-weight version, the two inputs go through one and the same feature-extraction model; its two outputs are concatenated and then passed to the fully connected layers.

[Figure: architecture of the Siamese network with shared parameters]

Because the feature-extraction weights are shared, the parameter count drops considerably.

The parameter counts are as follows:

Layer (type)         Output Shape      Param #     Connected to
dense_2 (Dense)      (None, 1024)      1049600     concatenate_1[0][0]
dense_3 (Dense)      (None, 1024)      1049600     dense_2[0][0]
dense_4 (Dense)      (None, 2)         2050        dense_3[0][0]

Total params: 3,483,122
Trainable params: 3,483,122

Without the final fully connected layers, the remaining network has 1,381,872 parameters, exactly half of the 2,763,744 in the non-shared case, since the single shared feature-extraction model replaces two independent copies.

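To make the shared-parameter model concrete, here is a rough sketch of how it could be trained on MNIST digit pairs; the random pair construction, the label encoding (class 1 for "same digit", class 0 for "different digit"), and the training settings are assumptions for illustration, not part of the original post.

import numpy as np
from keras.datasets import mnist

(x_train, y_train), _ = mnist.load_data()
x_train = x_train.astype('float32')[..., np.newaxis] / 255.0  # shape (60000, 28, 28, 1)

# Build random image pairs; note that purely random pairing yields far more
# "different digit" pairs than "same digit" pairs
n_pairs = 10000
idx_a = np.random.randint(0, len(x_train), n_pairs)
idx_b = np.random.randint(0, len(x_train), n_pairs)
same = (y_train[idx_a] == y_train[idx_b]).astype('int')
labels = np.eye(2)[same]  # one-hot labels for the 2-way softmax output

net = ClassiFilterNet()
net.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
net.fit([x_train[idx_a], x_train[idx_b]], labels, batch_size=128, epochs=5)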