jrosebr1/simple-keras-rest-api

Error particular to a case when I tunnel over SSH to a remote server

previtus opened this issue · 10 comments

When run_keras_server.py is deployed as a web service that you tunnel to over SSH, I experienced this:
ERROR

raise ValueError("Tensor %s is not an element of this graph." % obj)
ValueError: Tensor Tensor("fc1000/Softmax:0", shape=(?, 1000), dtype=float32) is not an element of this graph.

(I tunnelled through SSH like this: ssh -N -f -L localhost:9999:localhost:5000 USERNAME@HOST)

SOLUTION
After some searching I found that many people are experiencing the same issue at keras-team/keras#2397

What helped was:
1.) adding graph = tf.get_default_graph() right after the model is initialized, like this:

import tensorflow as tf  # needed for tf.get_default_graph()

def load_model():
    global model
    model = ResNet50(weights="imagenet")
    # keep a reference to the graph the model was created in
    global graph
    graph = tf.get_default_graph()

2.) when working with the model, wrapping the calls in with graph.as_default():, like this:

			with graph.as_default():
				preds = model.predict(image)
				#... etc

I'm not exactly sure what caused this problem, but it may be related to the model being initialized in one thread and then used in another (as someone suggested in the referenced discussion).
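
For completeness, here is a minimal sketch of the two steps combined (assuming TensorFlow 1.x with the standalone Keras package; the predict_image helper is just an illustrative name, not part of the original script):

import tensorflow as tf
from keras.applications import ResNet50

model = None
graph = None

def load_model():
    # load the network once and remember the graph it was created in
    global model, graph
    model = ResNet50(weights="imagenet")
    graph = tf.get_default_graph()

def predict_image(image):
    # run inference inside the graph that owns the model's tensors,
    # even when called from a different request-handling thread
    with graph.as_default():
        return model.predict(image)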


Hope this helps someone who gets stuck on this.
Note that this ran perfectly fine on my local machine, but started misbehaving when running Flask on a remote server that I tunnelled to over SSH.

PS: Thanks for the tutorial :)

Thanks for sharing @previtus! I'm not sure what may have caused that error but I'm happy you were able to resolve it :-)

@previtus This solution really helped me a lot!! I was struggling with the same error for the last two hours without finding any solution until I found this post!!

Thanks a lot!

Thanks for sharing @previtus!

vj68 commented

You are a genius. I do not know how to thank you.

jrosebr1 - Why don't you update this code in the blog post and on GitHub? This is such a mess.

@phpmind This issue does not affect all users (or even most users) and the root cause has not been determined. Furthermore, it also seems you are using the TensorFlow backend. Using the word "mess" is also frankly incorrect. The issue has now been well documented here on GitHub just in case any other people run into the problem.

@previtus thanks for the solution!
@jrosebr1 thanks for the wonderful resources, easily got started!

This works for the development server, but it does not work if you want to use a standalone WSGI container, e.g. gunicorn, gevent, or uWSGI.

There is also another solution if you just want to get the model to work: essentially, just get rid of the load_model() function.

run_keras_server.txt
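
For reference, a minimal sketch of what removing load_model() could look like (the route and identifiers mirror the tutorial's run_keras_server.py, but the details here are illustrative): the model is created at import time, so the serving code uses the same default graph it was built in.

from flask import Flask
from keras.applications import ResNet50

app = Flask(__name__)
# load the model at import time instead of inside a load_model() function
model = ResNet50(weights="imagenet")

@app.route("/predict", methods=["POST"])
def predict():
    # ... read and preprocess the request image into `image`, then:
    # preds = model.predict(image)
    return "OK"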

The above works on the standalone WSGI containers, but not on the Flask development server???
Is there a solution that works for all cases?

This might be a good reference

@johncleveland -- refer to the link provided by @hasibzunair. It provides my instructions on how to modify the code to have it run on a WSGI server.
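
For example, assuming the Flask application object in run_keras_server.py is named app, gunicorn can typically be launched with something like:

gunicorn --bind 0.0.0.0:5000 run_keras_server:app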