tf.nn.relu vs. tf.nn.relu_layer

1. tf.nn.relu needs little introduction as an activation function:

The argument passed to tf.nn.relu() is the pre-activation value, i.e. the input tensor convolved (or matrix-multiplied) with the weights, plus the biases; see the sketch below.
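A minimal sketch of that pattern (the shapes and values here are hypothetical, and the same idea applies whether the pre-activation comes from a convolution or a plain matrix multiply):

import tensorflow as tf

# A tiny fully connected layer followed by ReLU; shapes are made up for illustration.
x = tf.constant([[1.0, -2.0]])          # 1 sample, 2 features
weights = tf.constant([[0.5], [0.25]])  # 2 features -> 1 output unit
biases = tf.constant([0.1])

# tf.nn.relu clips the negative part of matmul(x, weights) + biases to zero.
activation = tf.nn.relu(tf.matmul(x, weights) + biases)
print(activation)  # tf.Tensor([[0.1]], shape=(1, 1), dtype=float32)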

2. tf.nn.relu_layer():

def relu_layer(x, weights, biases, name=None):
    """Computes Relu(x * weight + biases)."""

tf.nn.relu_layer() takes the arguments x, weights, and biases; the call

activation = tf.nn.relu_layer(x, weights, biases, name=None)

is equivalent to:

activation = tf.nn.relu(tf.matmul(x, weights) + biases, name=None)
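To verify the equivalence end to end, here is a small sketch (shapes are hypothetical; note that in TensorFlow 2.x the fused op is only exposed as tf.compat.v1.nn.relu_layer, while in 1.x it is tf.nn.relu_layer directly):

import numpy as np
import tensorflow as tf

# Hypothetical shapes: 4 samples, 3 features, 2 output units.
x = tf.constant(np.random.randn(4, 3), dtype=tf.float32)
weights = tf.constant(np.random.randn(3, 2), dtype=tf.float32)
biases = tf.constant(np.random.randn(2), dtype=tf.float32)

# Fused form: one call computes relu(matmul(x, weights) + biases).
fused = tf.compat.v1.nn.relu_layer(x, weights, biases)

# Unfused form: the same computation written as three separate ops.
unfused = tf.nn.relu(tf.matmul(x, weights) + biases)

print(np.allclose(fused.numpy(), unfused.numpy()))  # True

In other words, relu_layer is just a convenience wrapper; the two forms are interchangeable.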
