Can't declare tf.Variable in @tf.function decorated function #26812
see this and rewrite your code to … or …
Hi @zakizhou! I'm aware that defining a function that declares a variable inside creates state that has to be handled differently. As you suggested, an alternative is to wrap the function in a class and make the variable a private attribute, or to declare the variable as global and check whether it has already been created. However, as stated in the RFC:
Thus, when I invoke the function I expect the variable to be created, since this is the first call. Instead, what happens is that even though I call the function only once, and thus this is the first call, it seems like tf.function treats it as a non-first call and raises an error.
tf.function may evaluate your Python function more than once. What the RFC states instead is that you are allowed to create variables, as long as variable creation only happens the first time your Python function is evaluated.
And the reason for this is that if you write Python code which unconditionally creates variables, then I can't tell whether you mean to create a new variable every time the Python function is called (eager behavior) or to reuse the existing variables (what happens with graphs in TF 1.x), so an error felt safer.
Thank you for the explanation @alextp: now everything is clearer. The RFC just describes the first call, but there is no guarantee that tf.function won't call the function more than once while converting it to a graph; for this reason, the developer should take care of handling variable creation. Thus, whether I need to reuse a variable or not makes no difference: I have to take care of its state manually.
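The class-based pattern discussed above (wrap the function in a class and create the variable only on the first call) can be sketched as follows. The `Dense` class, its shapes, and the ones-initialization are illustrative assumptions, not code from this thread:

```python
import tensorflow as tf

class Dense:
    """Holds the variable as an attribute so the traced function
    creates it only once, during the first call."""

    def __init__(self, units):
        self.units = units
        self.w = None  # created lazily inside the first trace

    @tf.function
    def __call__(self, x):
        if self.w is None:
            # This branch runs only while tracing the first call,
            # the one place tf.function allows variable creation.
            self.w = tf.Variable(tf.ones([int(x.shape[-1]), self.units]))
        return tf.matmul(x, self.w)
```

On the second and later calls `self.w` is no longer `None`, so the traced function reuses the existing variable instead of trying to create a new one.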
Yes, exactly. With TF 2.0 in general we're trying to get TF out of your way
as much as possible, instead of pushing every possible thing you'd want to
do inside TF. We think this is simpler, more modular, and easier to
integrate with larger codebases.
I have exactly this issue, but with different optimizers: on the first epoch, I have … With this, I get a huge error trace on the … From my understanding, since the function is always called with different arguments, and there are no external dependencies, this error should not be happening.
@ericpts this is an interesting use case I hadn't thought of. As a workaround I recommend you use two instances of tf.function here. Let's open another issue to discuss this?
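The "two instances of tf.function" workaround might look like the sketch below; the loss, the variable, and the optimizer choices are illustrative assumptions. Each optimizer gets its own tf.function, so each instance has its own first trace during which slot variables may be created:

```python
import tensorflow as tf

v = tf.Variable(1.0)

def make_train_step(optimizer):
    # A fresh tf.function per optimizer: each instance creates its
    # state (e.g. Adam's slot variables) during its own first trace.
    @tf.function
    def train_step():
        with tf.GradientTape() as tape:
            loss = v * v
        grads = tape.gradient(loss, [v])
        optimizer.apply_gradients(zip(grads, [v]))
        return loss
    return train_step

step_a = make_train_step(tf.keras.optimizers.SGD(0.1))
step_b = make_train_step(tf.keras.optimizers.Adam(0.1))
step_a()
step_b()
```

With a single shared tf.function, the second optimizer's variable creation would happen on a non-first call and trigger the error from this issue.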
One alternative workaround is to make the boolean parameter a tensor; then
TF will trace both sides of the conditional once and pre-create all
variables.
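A minimal sketch of that tensor-boolean workaround (the class, values, and parameter name are illustrative assumptions, not code from this thread):

```python
import tensorflow as tf

class Model:
    def __init__(self):
        self.v = None

    @tf.function
    def f(self, use_bias):
        if self.v is None:
            self.v = tf.Variable(2.0)  # created during the first trace
        # Because `use_bias` is a tensor, AutoGraph turns this into a
        # tf.cond and traces both branches in that single first trace,
        # instead of retracing the function per Python boolean value.
        if use_bias:
            return self.v + 1.0
        return self.v

m = Model()
print(m.f(tf.constant(True)).numpy())   # 3.0
print(m.f(tf.constant(False)).numpy())  # 2.0
```

Passing a Python `True`/`False` instead would produce one trace per value, and any variable creation in the second trace would hit the non-first-call error.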
Hello, is there a workaround for a situation like this? I need to create several w, b variables using an add_layer method, but I get the variable-reuse error: `@tf.function def add_layer(input, c_kdimension, c_kstrides): …`
Hello @galeone, …
Sorry, this is wrong, do not use it. This disables graph construction and runs everything eagerly, like NumPy. This can be desirable for debugging or in certain cases, but it is surely not the workaround. @praveen-14, read again above about creating the variables as class attributes or in the global scope.
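For the `add_layer` case above, the global-scope variant could look like this sketch; the shapes and the simplified signature are assumptions, not the original code:

```python
import tensorflow as tf

w = None  # module-level state, shared across calls

@tf.function
def add_layer(x):
    global w
    if w is None:
        # Executed only while tracing the first call, so tf.function
        # does not complain about variable creation.
        w = tf.Variable(tf.zeros([4, 3]))
    return tf.matmul(x, w)
```

Each additional variable (e.g. a bias `b`) would follow the same create-once-then-reuse pattern.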
System information
Describe the current behavior
A function that works correctly in eager execution can't be decorated with `@tf.function` if it declares a `tf.Variable` in the function body. The error message, reported below, is misleading since it talks about a non-first invocation when the function is invoked only once.
Describe the expected behavior
Calling a function decorated with `@tf.function` should produce the same output as the same function without the decoration.
Code to reproduce the issue
import tensorflow as tf
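The rest of the reproduction snippet did not survive the scrape. Based on the behavior described above, a minimal example that triggers the same error could be the following; this is a reconstruction, not necessarily the original code:

```python
import tensorflow as tf

@tf.function
def f(x):
    # Unconditional variable creation: tf.function cannot tell whether
    # a new variable is wanted on every call or the first one should be
    # reused, so it raises a ValueError even on the very first call.
    v = tf.Variable(1.0)
    return v + x

try:
    f(tf.constant(1.0))
except ValueError as err:
    print("raised:", err)
```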