Andrew Ng Deep Learning course: TensorFlow version issues

1. module 'tensorflow' has no attribute 'global_variables_initializer'. Fix: add the following line right after the import:

tf.compat.v1.disable_eager_execution()

Then change the original code to:

y_hat = tf.constant(36, name='y_hat')            # Define y_hat constant. Set to 36.
y = tf.constant(39, name='y')                    # Define y. Set to 39

loss = tf.Variable((y - y_hat)**2, name='loss')  # Create a variable for the loss

init = tf.compat.v1.global_variables_initializer()   # When init is run later (session.run(init)),
                                                 # the loss variable will be initialized and ready to be computed
with tf.compat.v1.Session() as session:                    # Create a session and print the output
    session.run(init)                            # Initializes the variables
    print(session.run(loss))                     # Prints the loss

2. RuntimeError: The Session graph is empty. Add operations to the graph before calling run(). Fix: make sure eager execution has been disabled (as in item 1) so the operations actually land in the default graph, then create the session through compat.v1:

with tf.compat.v1.Session() as session:  
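A minimal self-contained sketch of the fix; the constants a, b, c are illustrative and not from the course code:

import tensorflow as tf

tf.compat.v1.disable_eager_execution()   # ops are now recorded in the v1 default graph

a = tf.constant(2, name='a')             # these operations populate the graph
b = tf.constant(3, name='b')
c = a * b

with tf.compat.v1.Session() as session:  # the session now sees a non-empty graph
    print(session.run(c))                # prints 6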

3. placeholder issue: same as above, adding compat.v1 fixes it:

# Change the value of x in the feed_dict
sess = tf.compat.v1.Session()                 # create the session before running
x = tf.compat.v1.placeholder(tf.int64, name = 'x')
print(sess.run(2 * x, feed_dict = {x: 3}))    # prints 6
sess.close()

4. tf.contrib.layers.xavier_initializer initialization issue: two fixes found online.

One is to replace the code with:

initializer=tf.keras.initializers.GlorotUniform(seed = 0)

The other is to replace it with:

initializer=tf.compat.v1.truncated_normal_initializer(stddev=1.0, seed = 0)
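Note that the second option draws from a truncated normal with a fixed stddev, so it is not an exact Xavier equivalent; the first option matches the original initializer more closely. For context, a minimal sketch of how the replacement plugs into a get_variable call in graph mode; the name W1 and the shape [25, 12288] are illustrative, borrowed from the style of the assignments rather than from this post:

import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Xavier/Glorot-uniform initialization, replacing tf.contrib.layers.xavier_initializer(seed=0)
W1 = tf.compat.v1.get_variable("W1", [25, 12288],
                               initializer=tf.keras.initializers.GlorotUniform(seed=0))

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())   # initializes W1
    print(sess.run(W1).shape)                                # (25, 12288)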

5. tf.contrib.layers.flatten() issue: replace it with:

tf.compat.v1.layers.flatten()
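A minimal sketch of the flatten replacement in graph mode; here P2 is a placeholder standing in for the pooled feature map, and its shape (None, 8, 8, 16) is only an assumption for illustration:

import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# placeholder standing in for the pooling-layer output (assumed shape)
P2 = tf.compat.v1.placeholder(tf.float32, [None, 8, 8, 16], name='P2')

# flatten to (batch, 8*8*16), replacing tf.contrib.layers.flatten(P2)
F = tf.compat.v1.layers.flatten(P2)
print(F.shape)   # (None, 1024)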

6. Fully connected (dense) layer issue in TensorFlow 2.0: replace it with:

tf.compat.v1.layers.dense(P2, 6)
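In the original assignment the fully connected layer is applied to the flattened tensor (presumably tf.contrib.layers.fully_connected(F, 6, activation_fn=None)) rather than to P2 directly. A minimal sketch of that usage, where F and its shape are assumptions matching the flatten example above:

import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# F stands in for the flattened activations from the previous step (assumed shape)
F = tf.compat.v1.placeholder(tf.float32, [None, 1024], name='F')

# replaces tf.contrib.layers.fully_connected(F, 6, activation_fn=None); activation is linear by default
Z3 = tf.compat.v1.layers.dense(F, 6)
print(Z3.shape)   # (None, 6)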
