Closed
Description
There appears to be a bug in _make_predict_function for the TensorFlow backend. The following error appears for me when calling model.predict(...):
    self._make_predict_function()
  File "/usr/local/lib/python3.4/dist-packages/keras/engine/training.py", line 679, in _make_predict_function
    **self._function_kwargs)
  File "/usr/local/lib/python3.4/dist-packages/keras/backend/tensorflow_backend.py", line 615, in function
    return Function(inputs, outputs, updates=updates)
  File "/usr/local/lib/python3.4/dist-packages/keras/backend/tensorflow_backend.py", line 589, in __init__
    with tf.control_dependencies(self.outputs):
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/ops.py", line 3192, in control_dependencies
    return get_default_graph().control_dependencies(control_inputs)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/ops.py", line 2993, in control_dependencies
    c = self.as_graph_element(c)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/framework/ops.py", line 2291, in as_graph_element
    raise ValueError("Tensor %s is not an element of this graph." % obj)
ValueError: Tensor Tensor("Sigmoid_2:0", shape=(?, 17), dtype=float32) is not an element of this graph.
This does not happen when using the Theano backend.
Notes: the model is loaded from JSON, and is defined as follows:
seq1 = Input(dtype='int32', shape=(400,), name='input_text')
seq2 = Input(dtype='int32', shape=(20,), name='input_titles')
embedding = Embedding(max_features, embedding_dims, dropout=0.3)
encoding_1 = embedding(seq1)
encoding_2 = embedding(seq2)

filter_lengths = [1, 3, 6]

def max_1d(X):
    return K.max(X, axis=1)

convs1 = []
convs2 = []
for fl in filter_lengths:
    conv1 = Convolution1D(nb_filter=nb_filter,
                          filter_length=fl,
                          border_mode='valid',
                          activation='relu',
                          subsample_length=1)(encoding_1)
    conv1 = Lambda(max_1d, output_shape=(nb_filter,))(conv1)
    convs1.append(conv1)
    conv2 = Convolution1D(nb_filter=nb_filter,
                          filter_length=fl,
                          border_mode='valid',
                          activation='relu',
                          subsample_length=1)(encoding_2)
    conv2 = Lambda(max_1d, output_shape=(nb_filter,))(conv2)
    convs2.append(conv2)

m = merge([*convs1, *convs2], mode='concat')
m = Highway(activation='relu')(m)
m = Highway(activation='relu')(m)
m = Dropout(0.5)(m)
hovedkategori_loss = Dense(labsHovedKat.shape[1], activation='sigmoid', name='hovedkategori')(m)
m1 = merge([hovedkategori_loss, m], mode='concat')
underkategori_loss = Dense(labsUnderKat.shape[1], activation='sigmoid', name='underkategori')(m1)
model = Model(input=[seq1, seq2], output=[hovedkategori_loss, underkategori_loss])
model.compile(optimizer='adam', loss='binary_crossentropy',
              metrics={'hovedkategori': 'accuracy', 'underkategori': 'accuracy'})
Froskekongen commented on Apr 19, 2016
I would appreciate any comments on this issue, as I want to deploy the model asap. And I need to know if I can use it or code something else.
fchollet commented on Apr 19, 2016
Do you have a code snippet to reproduce this issue? I can guarantee you that predict does in fact work, including with TensorFlow.
Froskekongen commented on Apr 20, 2016
It appears this bug had nothing to do with either keras or tensorflow, but rather how async events were handled by the webserver I am using.
jstypka commented on May 9, 2016
@Froskekongen could you describe how you fixed this in more detail? I'm having exactly the same error, however in a different program.
It seems to work when I do it manually in a REPL, however when I deploy it as a webservice it breaks.
pxlong commented on May 13, 2016
I also get the same error with the TensorFlow backend; however, it works using the Theano backend.
@jstypka @Froskekongen Have you found a solution to fix it?
jstypka commented on May 13, 2016
@pxlong it also works on Theano for me, I think it's exactly the same problem. I didn't manage to solve it though, was hoping for some hints from @Froskekongen
rkempter commented on May 27, 2016
Same here, same issue! Works fine in the REPL, fails when running behind a webservice.
rkempter commented on May 27, 2016
Running the webservice with gunicorn in sync mode solved the issue.
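For reference, rkempter's workaround corresponds to an invocation along these lines (a sketch; `app:app` is a hypothetical module:callable for the web application):

```shell
# gunicorn's default "sync" worker class handles each request directly in
# the worker process, without gevent/async monkey-patching, so the model
# is used in the same thread context in which it was loaded.
gunicorn --worker-class sync --workers 4 app:app
```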
gladuo commented on Aug 2, 2016
Hey everybody, I'm still not sure what's wrong with this combination.
But I use meinheld instead and it works even better than gevent.
Hope this helps.
AbhishekAshokDubey commented on Aug 16, 2016
Same problem (model.predict breaking) for me too, but it worked when I switched to the Theano backend from TensorFlow.
Nr90 commented on Aug 26, 2016
Same problem here.
Seems to work fine normally, but when deployed as a webservice using Flask, I get this error.
Nr90 commented on Aug 27, 2016
Works when using Theano as the backend; doesn't work with TensorFlow.
avital commented on Oct 19, 2016
I had this problem when doing inference in a different thread than the one where I loaded my model. Here's how I fixed it:
1. Right after loading or constructing your model, save a reference to the TensorFlow graph.
2. In the other thread (or perhaps in an asynchronous event handler), run inference inside that saved graph's context.
I learned about this from https://www.tensorflow.org/versions/r0.11/api_docs/python/framework.html#get_default_graph
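The pattern avital describes can be sketched as follows. This is a minimal, Keras-free illustration: the tiny x * 3.0 graph stands in for a loaded model, and the tf.compat.v1 API is used so the snippet runs under both TF 1.x and 2.x.

```python
import threading
import tensorflow as tf

tf1 = tf.compat.v1  # graph-mode API, available in TF 1.x and 2.x

# Stand-in for "load the model": build a graph and keep a reference to
# it, as avital suggests doing right after loading a Keras model.
graph = tf1.Graph()
with graph.as_default():
    x = tf1.placeholder(tf.float32, shape=())
    y = x * 3.0

def predict_in_worker(value, out):
    # The default graph stack is thread-local. Without re-entering the
    # saved graph here, the tensor lookup in this thread would fail with
    # "Tensor ... is not an element of this graph" -- the error above.
    with graph.as_default():
        with tf1.Session(graph=graph) as sess:
            out.append(sess.run(y, feed_dict={x: value}))

out = []
t = threading.Thread(target=predict_in_worker, args=(2.0, out))
t.start()
t.join()
print(out[0])  # 6.0
```

The key point is that `graph` is captured in the thread that built it and explicitly re-entered in the worker thread before any inference call.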
Walid-Ahmed commented on Nov 4, 2016
Thanks a lot.
It worked for me.