Keras (5): Layers Outline

Layers

Common Methods

  • layer.get_weights(): returns the weights of the layer as a list of Numpy arrays.
  • layer.set_weights(weights): sets the weights of the layer from a list of Numpy arrays (with the same shapes as the output of get_weights).
  • layer.get_config(): returns a dict containing the layer configuration.
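A minimal sketch of these three methods, assuming standalone Keras 2.x import paths; the layer size and input shape here are arbitrary examples:

```python
from keras.models import Sequential
from keras.layers import Dense

# Put the layer in a model so it is built and actually has weights.
model = Sequential([Dense(4, input_shape=(3,))])
layer = model.layers[0]

weights = layer.get_weights()   # [kernel of shape (3, 4), bias of shape (4,)]
layer.set_weights(weights)      # the list must match the shapes returned by get_weights()
config = layer.get_config()     # dict with 'units', 'activation', 'name', ...
```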

If a layer has a single node, the following attributes are available:

  • layer.input
  • layer.output
  • layer.input_shape
  • layer.output_shape

If the layer has multiple nodes (e.g., it is shared across several inputs):

  • layer.get_input_at(node_index)
  • layer.get_output_at(node_index)
  • layer.get_input_shape_at(node_index)
  • layer.get_output_shape_at(node_index)
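A sketch of the single-node vs. multi-node cases with the functional API (a layer gets one node per call; the shapes below are illustrative):

```python
from keras.layers import Input, Dense

# Single node: the layer was called once, so .input/.output are unambiguous.
x = Input(shape=(8,))
dense = Dense(4)
y = dense(x)
print(dense.input_shape)    # (None, 8)
print(dense.output_shape)   # (None, 4)

# Multiple nodes: the same (shared) layer is called on two inputs,
# so the node index must be given explicitly.
a, b = Input(shape=(8,)), Input(shape=(8,))
shared = Dense(4)
ya, yb = shared(a), shared(b)
print(shared.get_input_shape_at(0))   # (None, 8), from the first call
print(shared.get_output_shape_at(1))  # (None, 4), from the second call
```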

Core Layers

Dense——densely-connected NN layer

Activation

Dropout

Flatten——flattens the input

Input

Reshape

Permute ——permutes the dimensions of the input according to a given pattern

RepeatVector

Lambda——wraps an arbitrary expression as a Layer object

ActivityRegularization——applies an update to the cost function based on the input activity

Masking——masks a sequence so that certain timesteps are skipped

SpatialDropout1D

SpatialDropout2D

SpatialDropout3D
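A small Sequential model touching several of the core layers above (the sizes are arbitrary; Lambda simply wraps an expression as a layer, as noted above):

```python
from keras.models import Sequential
from keras.layers import Dense, Activation, Dropout, Reshape, Lambda, Flatten

model = Sequential([
    Dense(64, input_shape=(100,)),
    Activation('relu'),
    Dropout(0.5),               # randomly zeroes 50% of the activations during training
    Reshape((8, 8)),            # 64 values -> an 8x8 grid
    Lambda(lambda t: t * 2.0),  # wrap an arbitrary expression as a Layer object
    Flatten(),                  # 8x8 -> 64
    Dense(10, activation='softmax'),
])
model.summary()
```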

Convolutional Layers

Conv1D

Conv2D

SeparableConv1D (separable convolution)

SeparableConv2D

DepthwiseConv2D

Conv2DTranspose (transposed convolution, a.k.a. deconvolution)

Conv3D

Conv3DTranspose

Cropping1D (cropping layers)

Cropping2D

Cropping3D

UpSampling1D (upsampling layers)

UpSampling2D

UpSampling3D

ZeroPadding1D (zero-padding layers)

ZeroPadding2D

ZeroPadding3D
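A sketch combining padding, convolution, upsampling, transposed convolution, and cropping; the image size and channel counts are made up for illustration:

```python
from keras.models import Sequential
from keras.layers import ZeroPadding2D, Conv2D, UpSampling2D, Conv2DTranspose, Cropping2D

model = Sequential([
    ZeroPadding2D(padding=1, input_shape=(28, 28, 1)),             # 28x28 -> 30x30
    Conv2D(16, kernel_size=3, activation='relu'),                  # 30x30 -> 28x28
    UpSampling2D(size=2),                                          # 28x28 -> 56x56
    Conv2DTranspose(8, kernel_size=3, strides=2, padding='same'),  # 56x56 -> 112x112
    Cropping2D(cropping=6),                                        # 112x112 -> 100x100
])
model.summary()
```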

Pooling Layers

MaxPooling1D

MaxPooling2D

MaxPooling3D

AveragePooling1D

AveragePooling2D

AveragePooling3D

GlobalMaxPooling1D

GlobalAveragePooling1D

GlobalMaxPooling2D

GlobalAveragePooling2D

GlobalMaxPooling3D

GlobalAveragePooling3D
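The difference between local pooling and global pooling in one sketch (shapes are illustrative): a MaxPooling2D window downsamples the feature map, while GlobalAveragePooling2D collapses each feature map to a single value.

```python
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, GlobalAveragePooling2D, Dense

model = Sequential([
    Conv2D(32, kernel_size=3, activation='relu', input_shape=(32, 32, 3)),  # -> 30x30x32
    MaxPooling2D(pool_size=2),                                              # -> 15x15x32
    GlobalAveragePooling2D(),                                               # -> 32-dim vector
    Dense(10, activation='softmax'),
])
model.summary()
```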

Locally-connected Layers

LocallyConnected1D

LocallyConnected2D

Recurrent Layers

RNN

SimpleRNN

GRU (Gated Recurrent Unit)

LSTM (Long Short-Term Memory)

ConvLSTM2D

SimpleRNNCell

GRUCell

LSTMCell

CuDNNGRU——fast GRU implementation backed by cuDNN (can only be run on GPU)

CuDNNLSTM
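A small stacked recurrent model (sequence length and feature sizes are arbitrary); the CuDNN variants listed above are faster, GPU-only implementations of GRU/LSTM:

```python
from keras.models import Sequential
from keras.layers import LSTM, GRU, Dense

model = Sequential([
    # return_sequences=True keeps the whole sequence so another recurrent layer can follow
    LSTM(64, return_sequences=True, input_shape=(20, 8)),  # 20 timesteps, 8 features each
    GRU(32),                                               # returns only the last output
    Dense(1),
])
model.summary()
```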

Embedding Layers

Embedding—— Turns positive integers (indexes) into dense vectors of fixed size.
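A sketch of Embedding in front of a recurrent layer; the vocabulary size, vector size, and sequence length are illustrative:

```python
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense

model = Sequential([
    # maps integer word indices in [0, 10000) to 128-dimensional dense vectors;
    # Embedding can only be used as the first layer of a model
    Embedding(input_dim=10000, output_dim=128, input_length=50),
    LSTM(64),
    Dense(1, activation='sigmoid'),
])
model.summary()
```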

Merge Layers

Add

Subtract

Multiply

Average

Maximum

Concatenate

Dot (dot product)

add

subtract

multiply

average

maximum

concatenate

dot
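The capitalized names are layer classes and the lowercase names are functional shortcuts for them; a sketch using both forms with the functional API (shapes are arbitrary):

```python
from keras.models import Model
from keras.layers import Input, Dense, Add, concatenate

a = Input(shape=(16,))
b = Input(shape=(16,))

summed = Add()([a, b])             # class form: instantiate, then call on a list of tensors
merged = concatenate([summed, b])  # lowercase shortcut does the same thing in one step

out = Dense(1)(merged)
model = Model(inputs=[a, b], outputs=out)
model.summary()
```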

Advanced Activations Layers

LeakyReLU( Leaky version of a Rectified Linear Unit )

PReLU( Parametric Rectified Linear Unit )

ELU( Exponential Linear Unit )

ThresholdedReLU( Thresholded Rectified Linear Unit )

Softmax

ReLU( Rectified Linear Unit )
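These are used as standalone layers rather than as the activation argument of another layer; a sketch with LeakyReLU (fixed negative slope) and PReLU (learned negative slope), with arbitrary sizes:

```python
from keras.models import Sequential
from keras.layers import Dense, LeakyReLU, PReLU

model = Sequential([
    Dense(64, input_shape=(10,)),
    LeakyReLU(alpha=0.1),   # fixed slope for negative inputs
    Dense(64),
    PReLU(),                # the negative slope is learned during training
    Dense(1),
])
model.summary()
```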

Normalization Layers

BatchNormalization

Noise Layers

GaussianNoise

GaussianDropout

AlphaDropout

Layer wrappers

TimeDistributed

Bidirectional
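A sketch of both wrappers (sequence length and sizes are arbitrary): Bidirectional runs the wrapped recurrent layer forwards and backwards, and TimeDistributed applies the same layer to every timestep:

```python
from keras.models import Sequential
from keras.layers import LSTM, Dense, TimeDistributed, Bidirectional

model = Sequential([
    # forward and backward LSTM outputs are concatenated at each timestep -> (None, 20, 64)
    Bidirectional(LSTM(32, return_sequences=True), input_shape=(20, 8)),
    # the same Dense(1) is applied independently to each of the 20 timesteps -> (None, 20, 1)
    TimeDistributed(Dense(1)),
])
model.summary()
```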
