ImportError: No module named Keras

I have a TensorFlow project that calls Keras's LeakyReLU, documented at https://keras.io/zh/layers/advanced-activations/:

LeakyReLU

keras.layers.LeakyReLU(alpha=0.3)

Leaky version of a Rectified Linear Unit.

It still allows a small gradient when the unit is not active: f(x) = alpha * x for x < 0, f(x) = x for x >= 0.

Input shape

Arbitrary. When using this layer as the first layer in a model, specify the input_shape argument (a tuple of integers, not including the samples axis).

Output shape

Same shape as the input.

Arguments

  • alpha: float >= 0. Negative slope coefficient.
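The piecewise definition above is easy to check with a few lines of plain Python, no Keras required. The `leaky_relu` function here is just a hypothetical stand-in for the layer's elementwise math:

```python
def leaky_relu(x, alpha=0.3):
    """Elementwise LeakyReLU: alpha * x for x < 0, x for x >= 0."""
    return x if x >= 0 else alpha * x

# Negative inputs are scaled by alpha instead of being zeroed out.
print(leaky_relu(-10.0))  # -3.0
print(leaky_relu(5.0))    # 5.0
print(leaky_relu(0.0))    # 0.0
```

With the default alpha=0.3, a negative input keeps 30% of its value, which is what gives the "leak" its name.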

Then I wanted to trace into Keras's underlying code, but clicking through I found it was just layers upon layers of calls:

https://github.com/keras-team/keras/blob/master/keras/layers/advanced_activations.py#L19

# Excerpt from advanced_activations.py; the full module also imports
# `Layer` (the base layer class) and `K` (the Keras backend).
class LeakyReLU(Layer):
    """Leaky version of a Rectified Linear Unit.
    It allows a small gradient when the unit is not active:
    `f(x) = alpha * x for x < 0`,
    `f(x) = x for x >= 0`.
    # Input shape
        Arbitrary. Use the keyword argument `input_shape`
        (tuple of integers, does not include the samples axis)
        when using this layer as the first layer in a model.
    # Output shape
        Same shape as the input.
    # Arguments
        alpha: float >= 0. Negative slope coefficient.
    # References
        - [Rectifier Nonlinearities Improve Neural Network Acoustic Models](
           https://ai.stanford.edu/~amaas/papers/relu_hybrid_icml2013_final.pdf)
    """

    def __init__(self, alpha=0.3, **kwargs):
        super(LeakyReLU, self).__init__(**kwargs)
        self.supports_masking = True
        self.alpha = K.cast_to_floatx(alpha)

    def call(self, inputs):
        return K.relu(inputs, alpha=self.alpha)

    def get_config(self):
        config = {'alpha': float(self.alpha)}
        base_config = super(LeakyReLU, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))

    def compute_output_shape(self, input_shape):
        return input_shape
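There is not much magic hidden behind those layers of calls. As a minimal sketch (assumptions: NumPy stands in for the `K` backend, and the `MiniLeakyReLU` name is mine, not Keras's), the class above boils down to:

```python
import numpy as np

class MiniLeakyReLU:
    """Framework-free sketch of what keras.layers.LeakyReLU computes."""

    def __init__(self, alpha=0.3):
        self.alpha = float(alpha)

    def call(self, inputs):
        # Same math as K.relu(inputs, alpha=...): keep non-negative
        # values, scale negative values by alpha.
        inputs = np.asarray(inputs, dtype=float)
        return np.where(inputs >= 0, inputs, self.alpha * inputs)

    def get_config(self):
        return {'alpha': self.alpha}

    def compute_output_shape(self, input_shape):
        # The activation is elementwise, so the shape passes through.
        return input_shape


layer = MiniLeakyReLU(alpha=0.3)
print(layer.call([-10.0, 0.0, 5.0]))          # [-3.  0.  5.]
print(layer.get_config())                     # {'alpha': 0.3}
print(layer.compute_output_shape((None, 4)))  # (None, 4)
```

Everything else in the real class (masking support, `**kwargs`, the base-class config merge) is layer plumbing, not math.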

So I gave up on tracing it and just installed keras directly:

pip install keras

A quick check confirmed the installation succeeded.


But the project still raised the ImportError.

Then I added import Keras to the project.

It still failed. It turned out the uppercase K and lowercase k had gotten mixed up: Python module names are case-sensitive, so changing it to the lowercase import keras fixed it.
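The case-sensitivity can be reproduced without Keras installed at all. A small sketch (assumption: it simulates an installed package by registering a fake lowercase keras module in sys.modules, which is where Python's import first looks):

```python
import sys
import types

# Simulate an installed package named exactly "keras" (lowercase).
sys.modules["keras"] = types.ModuleType("keras")

try:
    import Keras  # wrong case: not found, even though "keras" exists
except ImportError as e:
    print("uppercase failed:", e)

import keras  # the correct lowercase name resolves fine
print("lowercase ok:", keras.__name__)
```

This is exactly the situation in the project: pip installed a package whose importable name is keras, and the capitalized import Keras produced "No module named Keras".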
