Hi everyone, I'm a beginner.
TensorFlow has several functions (or classes) for a dense (fully-connected) layer, such as the ones below. They all do essentially the same thing, but in TF2 only the last one is available:
tf.contrib.layers.fully_connected
tf.layers.dense
tf.keras.layers.Dense
>>> help(tf.layers.dense)
dense(inputs, units, activation=None, use_bias=True, kernel_initializer=None, bias_initializer=&lt;zeros initializer instance&gt;, kernel_regularizer=None, bias_regularizer=None, activity_regularizer=None, kernel_constraint=None, bias_constraint=None, trainable=True, name=None, reuse=None)
Functional interface for the densely-connected layer. (deprecated)
Use keras.layers.Dense instead.
This layer implements the operation:
`outputs = activation(inputs * kernel + bias)`
where `activation` is the activation function passed as the `activation`
argument (if not `None`), `kernel` is a weights matrix created by the layer,
and `bias` is a bias vector created by the layer
(only if `use_bias` is `True`).
Arguments:
inputs: Tensor input.
units: Integer or Long, dimensionality of the output space.
activation: Activation function (callable). Set it to None to maintain a
linear activation.
use_bias: Boolean, whether the layer uses a bias.
kernel_initializer: Initializer function for the weight matrix.
If `None` (default), weights are initialized using the default
initializer used by `tf.compat.v1.get_variable`.
bias_initializer: Initializer function for the bias.
kernel_regularizer: Regularizer function for the weight matrix.
bias_regularizer: Regularizer function for the bias.
activity_regularizer: Regularizer function for the output.
kernel_constraint: An optional projection function to be applied to the
kernel after being updated by an `Optimizer` (e.g. used to implement
norm constraints or value constraints for layer weights). The function
must take as input the unprojected variable and must return the
projected variable (which must have the same shape).
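The operation the docstring describes, `outputs = activation(inputs * kernel + bias)`, can be sketched in plain NumPy. This is only a minimal illustration of the formula, not TensorFlow's actual implementation; the function name and the ReLU activation here are my own choices for the example:

```python
import numpy as np

def dense(inputs, kernel, bias=None, activation=None):
    # outputs = activation(inputs @ kernel + bias)
    outputs = inputs @ kernel          # (batch, in_dim) @ (in_dim, units)
    if bias is not None:               # only applied when use_bias=True
        outputs = outputs + bias
    if activation is not None:         # None means a linear activation
        outputs = activation(outputs)
    return outputs

# Example: batch of 2, in_dim 3, units 4, ReLU activation.
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 3))
kernel = rng.standard_normal((3, 4))   # the layer's weight matrix
bias = np.zeros(4)                     # mirrors the zeros bias initializer
y = dense(x, kernel, bias, activation=lambda t: np.maximum(t, 0.0))
print(y.shape)  # (2, 4)
```

With `activation=None` and no bias, the result is just the matrix product, which matches the "linear activation" case in the docstring.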
>>> help(tf.keras.layers.Dense)
class Dense(tensorflow.python.keras.engine.base_layer.Layer)
| Dense(units, activation=None, use_bias=True, kernel_initializer='glorot_uniform', bias_initializer='zeros', kernel_regularizer=None, bias_regularizer=None, activity_regularizer=None, kernel_constraint=None, bias_constraint=None, **kwargs)
|
| Just your regular densely-connected NN layer.
So (2) can be replaced by (3). One caveat for distributed training: the dense layer's parameters must be initialized exactly once, so the layer has to be created in `__init__` (its variables are then built a single time and reused on every call), not constructed inside `call`. This is also what `reuse=tf.AUTO_REUSE` achieved in TF1.
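The "create parameters in `__init__`, only use them in `call`" rule can be illustrated without TensorFlow. A minimal NumPy sketch (the class and attribute names are my own for illustration): the weights are built once at construction time, so every call reuses the same variables instead of creating fresh ones, which is the behaviour `tf.AUTO_REUSE` provided in TF1.

```python
import numpy as np

class ToyDense:
    """Toy dense layer: weights are created once in __init__ and
    reused on every call, never re-created inside __call__."""

    def __init__(self, in_dim, units, activation=None, seed=0):
        rng = np.random.default_rng(seed)
        # Parameter creation happens exactly once, here.
        self.kernel = rng.standard_normal((in_dim, units)) * 0.05
        self.bias = np.zeros(units)
        self.activation = activation

    def __call__(self, inputs):
        # No variable creation here -- only reuse of self.kernel/self.bias.
        outputs = inputs @ self.kernel + self.bias
        if self.activation is not None:
            outputs = self.activation(outputs)
        return outputs

layer = ToyDense(3, 4)
x = np.ones((2, 3))
y1, y2 = layer(x), layer(x)
print(np.allclose(y1, y2))  # True: both calls share the same weights
```

If the weights were instead created inside `__call__`, each invocation would get a new random kernel, and in a distributed setting different replicas could end up with different, never-synchronized parameters.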