[python] Tensorflow: Using Adam optimizer

I am experimenting with some simple models in TensorFlow, including one that looks very similar to the first MNIST for ML Beginners example, but with a somewhat larger dimensionality. I am able to use the gradient descent optimizer with no problems, getting good enough convergence. When I try to use the Adam optimizer, I get errors like this:

tensorflow.python.framework.errors.FailedPreconditionError: Attempting to use uninitialized value Variable_21/Adam
     [[Node: Adam_2/update_Variable_21/ApplyAdam = ApplyAdam[T=DT_FLOAT, use_locking=false, _device="/job:localhost/replica:0/task:0/cpu:0"](Variable_21, Variable_21/Adam, Variable_21/Adam_1, beta1_power_2, beta2_power_2, Adam_2/learning_rate, Adam_2/beta1, Adam_2/beta2, Adam_2/epsilon, gradients_11/add_10_grad/tuple/control_dependency_1)]]

where the specific variable that complains about being uninitialized changes depending on the run. What does this error mean? And what does it suggest is wrong? It seems to occur regardless of the learning rate I use.

Tags: python, tensorflow

Answers:


FailedPreconditionError: Attempting to use uninitialized value is one of the most frequent errors related to TensorFlow. From the official documentation, FailedPreconditionError:

This exception is most commonly raised when running an operation that reads a tf.Variable before it has been initialized.

In your case the error even tells you which variable was not initialized: Attempting to use uninitialized value Variable_21/Adam. One of the TF tutorials explains a lot about variables and their creation/initialization/saving/loading.
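
For illustration, here is a minimal sketch that reproduces the error (assuming the TF 1.x API used in the question):

import tensorflow as tf

w = tf.Variable(tf.zeros([2, 2]))

with tf.Session() as sess:
    # Reading w before running its initializer raises
    # FailedPreconditionError: Attempting to use uninitialized value Variable
    sess.run(w)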

Basically, to initialize a variable you have 3 options:

1. Initialize all global variables with tf.global_variables_initializer()
2. Initialize the variables you care about with tf.variables_initializer(list_of_variables)
3. Initialize a particular variable with var_name.initializer

I almost always use the first approach. Remember that you need to run it inside a session, so you will get something like this:

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
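
For completeness, the other two options might look like this (a sketch against the TF 1.x API; w and b are hypothetical variables):

import tensorflow as tf

w = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))

with tf.Session() as sess:
    # Option 2: initialize only the listed variables
    sess.run(tf.variables_initializer([w, b]))
    # Option 3: initialize a single variable via its own initializer op
    sess.run(b.initializer)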

If you are curious about variables, read this documentation to learn how to report_uninitialized_variables and check is_variable_initialized.
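
As a sketch of how those helpers can be used for debugging (TF 1.x API):

import tensorflow as tf

w = tf.Variable(tf.zeros([10]))

with tf.Session() as sess:
    # Names of all variables that are still uninitialized
    print(sess.run(tf.report_uninitialized_variables()))
    # Boolean check for a single variable (safe to run before init)
    print(sess.run(tf.is_variable_initialized(w)))
    sess.run(tf.global_variables_initializer())
    print(sess.run(tf.report_uninitialized_variables()))  # now empty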


I was having a similar problem (no problems training with the GradientDescent optimizer, but an error was raised when using the Adam optimizer, or any other optimizer with its own variables).

Changing to an interactive session solved this problem for me.

sess = tf.Session()

to

sess = tf.InteractiveSession()

You need to run tf.global_variables_initializer() in your session, like this:

init = tf.global_variables_initializer()
sess.run(init)
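
For instance, a minimal sketch of that pattern (hypothetical variable shapes, TF 1.x API):

import tensorflow as tf

W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))

sess = tf.InteractiveSession()
init = tf.global_variables_initializer()
sess.run(init)  # W and b are now safe to read
print(b.eval())  # InteractiveSession installs itself as the default session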

A full example is available in this great tutorial: https://www.tensorflow.org/get_started/mnist/mechanics


Define and run the init op after creating the AdamOptimizer, and do not define or run init before that point:

sess.run(tf.initialize_all_variables())

or

sess.run(tf.global_variables_initializer())
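
The ordering is the key point: tf.train.AdamOptimizer creates extra slot variables (the Variable_21/Adam, beta1_power_2, ... in the traceback above), so the init op must be built after the optimizer, or those slots are never initialized. A sketch of the correct ordering (hypothetical model, TF 1.x API):

import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 784])
y_ = tf.placeholder(tf.float32, [None, 10])
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.matmul(x, W) + b

loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=y))

# 1. Create the optimizer first: minimize() adds Adam's slot variables.
train_step = tf.train.AdamOptimizer(1e-4).minimize(loss)

# 2. Only now build the init op, so it also covers those slots.
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    # sess.run(train_step, feed_dict=...) now runs without
    # FailedPreconditionError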