TypeError: ValueError('Tensor Tensor("dense_2/Softmax:0", shape=(?, 2), dtype=float32) is not an element of this graph.')

Solution:


import tensorflow as tf
from keras.applications.vgg16 import VGG16

config = tf.ConfigProto()
# Optionally cap how much GPU memory this process may claim:
# config.gpu_options.per_process_gpu_memory_fraction = 0.15
tf.Session(config=config)

global model
model = VGG16(weights='imagenet', input_shape=(224, 224, 3), pooling='max', include_top=False)

# Save the graph the model was built in, so other threads can enter it later
global graph
graph = tf.get_default_graph()


I had this problem when doing inference in a different thread from the one where I loaded my model. Here's how I fixed it:

Right after loading or constructing your model, save the TensorFlow graph:

graph = tf.get_default_graph()

In the other thread (or perhaps in an asynchronous event handler), do:

global graph
with graph.as_default():
    (... do inference here ...)
