TensorFlow layers.fully_connected parameters (personal notes)

  def fully_connected(inputs,
                      num_outputs,
                      activation_fn=nn.relu,
                      normalizer_fn=None,
                      normalizer_params=None,
                      weights_initializer=initializers.xavier_initializer(),
                      weights_regularizer=None,
                      biases_initializer=init_ops.zeros_initializer(),
                      biases_regularizer=None,
                      reuse=None,
                      variables_collections=None,
                      outputs_collections=None,
                      trainable=True,
                      scope=None):
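Conceptually, this layer computes `activation_fn(inputs @ weights + biases)`. Below is a minimal NumPy sketch of that computation, not TensorFlow's actual implementation; the function and variable names are illustrative only:

```python
import numpy as np

def fully_connected_sketch(inputs, num_outputs, activation_fn=None, seed=0):
    """Toy sketch of what fully_connected computes:
    outputs = activation_fn(inputs @ weights + biases).

    inputs: array of shape [batch_size, depth] (rank >= 2, static last dim).
    """
    rng = np.random.default_rng(seed)
    depth = inputs.shape[-1]
    # Stand-ins for weights_initializer / biases_initializer:
    weights = rng.standard_normal((depth, num_outputs))
    biases = np.zeros(num_outputs)
    outputs = inputs @ weights + biases
    if activation_fn is not None:   # activation_fn=None => linear activation
        outputs = activation_fn(outputs)
    return outputs

relu = lambda x: np.maximum(x, 0.0)
x = np.ones((4, 3))                           # [batch_size=4, depth=3]
y = fully_connected_sketch(x, 8, activation_fn=relu)
print(y.shape)  # (4, 8)
```

Note how `activation_fn=None` simply skips the nonlinearity, which matches the "linear activation" behavior described for the real parameter below.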

  • inputs: A tensor of at least rank 2 with a static value for the last dimension, e.g. `[batch_size, depth]` or `[None, None, None, channels]`.
  • num_outputs: Integer or long; the number of output units in the layer.
  • activation_fn: Activation function. Defaults to ReLU. Explicitly set it to None to skip it and keep a linear activation.
  • normalizer_fn: Normalization function to use instead of `biases`. If `normalizer_fn` is provided, then `biases_initializer` and `biases_regularizer` are ignored and `biases` are neither created nor added. Defaults to None (no normalizer function).
  • normalizer_params: Parameters for the normalization function.
  • weights_initializer: An initializer for the weights.
  • weights_regularizer: Optional regularizer for the weights. (On regularization: https://www.zhihu.com/question/20924039)
  • biases_initializer: An initializer for the biases. If None, biases are skipped.
  • biases_regularizer: Optional regularizer for the biases.
  • reuse: Whether or not the layer and its variables should be reused. To be able to reuse the layer, `scope` must be given.
  • variables_collections: Optional list of collections for all the variables, or a dictionary containing a different list of collections per variable.
  • outputs_collections: Collection to add the outputs to.
  • trainable: If `True`, also add the variables to the graph collection `GraphKeys.TRAINABLE_VARIABLES` (see tf.Variable). When fine-tuning a network, you sometimes need to freeze a layer's parameters; set this to False to do so.
  • scope: Optional scope for variable_scope.
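The `normalizer_fn` contract above (biases are dropped and the normalizer runs before the activation) can be sketched as follows. This is a hedged NumPy toy, not TensorFlow code; `fc_with_normalizer` and `standardize` are illustrative names, with `standardize` standing in for something like batch normalization:

```python
import numpy as np

def fc_with_normalizer(inputs, num_outputs, activation_fn=None,
                       normalizer_fn=None, normalizer_params=None, seed=0):
    """Toy sketch: when normalizer_fn is given, no bias is created and the
    normalizer is applied to the linear output before the activation."""
    rng = np.random.default_rng(seed)
    weights = rng.standard_normal((inputs.shape[-1], num_outputs))
    outputs = inputs @ weights
    if normalizer_fn is not None:
        # biases_initializer / biases_regularizer would be ignored here
        outputs = normalizer_fn(outputs, **(normalizer_params or {}))
    else:
        outputs = outputs + np.zeros(num_outputs)  # bias path, no normalizer
    if activation_fn is not None:
        outputs = activation_fn(outputs)
    return outputs

def standardize(x, eps=1e-5):
    """Illustrative normalizer: zero-mean, unit-variance per output unit."""
    return (x - x.mean(axis=0)) / (x.std(axis=0) + eps)

x = np.random.default_rng(1).standard_normal((16, 3))
y = fc_with_normalizer(x, 8, activation_fn=lambda v: np.maximum(v, 0.0),
                       normalizer_fn=standardize)
print(y.shape)  # (16, 8)
```

In the real API you would pass e.g. `normalizer_fn=tf.contrib.layers.batch_norm` together with `normalizer_params` as its keyword arguments.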
