Can't access TensorFlow Adam optimizer namespace

抹茶落季 2021-01-07 07:24

I'm trying to learn about GANs and I'm working through the example here.

The code below, which uses the Adam optimizer, gives me the error

"ValueError …

1 Answer
  • 2021-01-07 08:06

    Your ValueError is caused by creating new variables while variable_scope.reuse == True.

    Adam creates variables when you call its minimize function: it needs them to store the momentum estimates for each trainable variable in your graph.
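
    As a rough TF 1.x sketch (the variable name w is made up here, not taken from your code), you can see the extra slot variables that minimize() adds to the graph:

    import tensorflow as tf

    w = tf.get_variable("w", shape=[2, 2])
    loss = tf.reduce_sum(tf.square(w))

    # minimize() creates new slot variables such as "w/Adam" and "w/Adam_1"
    # that hold the first and second moment estimates for w.
    train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)
    print([v.name for v in tf.global_variables()])
    # e.g. ['w:0', 'beta1_power:0', 'beta2_power:0', 'w/Adam:0', 'w/Adam_1:0']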

    Also note that passing reuse=False DOES NOT work the way you expect. Once reuse has been set to True it can never be switched back to False, and the reuse state is inherited by all sub-scopes:

    tf.get_variable_scope().reuse_variables()  # the root scope's reuse is now True
    with tf.variable_scope(tf.get_variable_scope(), reuse=False) as scope:
        # reuse=False is treated as "inherit", so reuse is still True here
        assert tf.get_variable_scope().reuse == True

    I guess you set reuse to True somewhere before the code you posted, so the default variable_scope has reuse == True. You then create a new variable_scope for Adam, but the new scope inherits the reuse state of the default scope. Adam therefore tries to create its variables while reuse == True, which raises the error.
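
    A minimal sketch of that failure mode (the variable name x is a placeholder, not your code):

    import tensorflow as tf

    x = tf.get_variable("x", shape=[1])
    tf.get_variable_scope().reuse_variables()   # the root scope's reuse is now True

    loss = tf.reduce_sum(tf.square(x))
    # Adam now has to create its slot variables (e.g. "x/Adam") under a scope
    # whose reuse flag is True, so get_variable raises a ValueError saying the
    # variable does not exist and was not created with tf.get_variable().
    train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)  # raises ValueError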

    The solution is to set reuse=True only inside a sub-scope under the graph's default scope. The default scope's reuse then stays False, and Adam.minimize will work, as in the sketch below.
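
    A sketch of that arrangement (the scope name "model", the placeholder shapes, and the tiny dense discriminator are stand-ins for your real GAN code):

    import tensorflow as tf

    real_images = tf.placeholder(tf.float32, [None, 784])
    fake_images = tf.placeholder(tf.float32, [None, 784])

    def discriminator(images):
        # builds its variables with tf.get_variable (via tf.layers)
        return tf.layers.dense(images, 1, name="d_out")

    with tf.variable_scope("model") as scope:
        d_real = discriminator(real_images)
        scope.reuse_variables()              # reuse=True only inside "model"
        d_fake = discriminator(fake_images)

    # Back at the graph's default scope, reuse is still False, so Adam can
    # create its slot variables and minimize() no longer raises a ValueError.
    d_loss = tf.reduce_mean(d_fake) - tf.reduce_mean(d_real)
    train_op = tf.train.AdamOptimizer(1e-4).minimize(d_loss)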
