
Adam Optimizer Not Working on cost function

I wanted to make my own neural network for the MNIST data set using TensorFlow. I imported the libraries and the dataset, did the one-hot encoding, assigned the weights and biases, did the forward propagation with the random values, and for back propagation and cost minimization I used a loss function. But I am unable to get the optimizer to work and I don't know why.


After running this I get an error.

After running it again, it just gives 0 for everything, does not minimize the cost, and returns the randomly assigned values for the weights and variables.

How to fix it?


Answer

You are not recording gradients, because you are calculating the prediction “beforehand”. Instead, you want to allow the optimizer to “record” the operations; in other words, you want to compute the prediction inside the “loss” lambda that you pass to the optimizer.

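Something along these lines should work. This is a minimal sketch rather than a drop-in fix: it assumes a single softmax layer, uses a small slice of the training data, and relies on a TensorFlow 2.x release whose Keras Optimizer.minimize accepts a callable loss (as in the documentation linked below); apart from y_train_onehot, the variable names are illustrative.

    import tensorflow as tf

    # Load MNIST and one-hot encode the labels, as described in the question.
    (x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
    x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
    y_train_onehot = tf.one_hot(y_train, depth=10)

    # A small slice just to keep the sketch quick to run.
    x_batch = x_train[:2048]
    y_batch = y_train_onehot[:2048]

    # Randomly initialised weights and biases for a single softmax layer.
    w = tf.Variable(tf.random.normal([784, 10], stddev=0.05))
    b = tf.Variable(tf.zeros([10]))

    optimizer = tf.keras.optimizers.Adam(learning_rate=0.001)

    # The important part: the forward pass happens *inside* the callable
    # handed to minimize, so the optimizer can record the operations and
    # compute gradients with respect to w and b.
    def loss_fn():
        logits = tf.matmul(x_batch, w) + b
        return tf.reduce_mean(
            tf.nn.softmax_cross_entropy_with_logits(labels=y_batch,
                                                     logits=logits))

    for step in range(200):
        optimizer.minimize(loss_fn, var_list=[w, b])
        if step % 50 == 0:
            print("step", step, "loss", float(loss_fn()))

With this structure the loss goes down over the steps instead of the variables staying at their initial random values.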

Also, consider that in your code you are referring to a y variable that you never defined; you probably meant y_train_onehot (which is the name used in the snippet above).

The reason why this happens is explained in the minimize documentation (https://www.tensorflow.org/api_docs/python/tf/keras/optimizers/Optimizer#minimize): when loss is a callable, minimize evaluates it under a gradient tape, so the forward-pass operations are recorded and gradients can be computed for the variables in var_list, whereas a loss value computed ahead of time, outside any tape, gives the optimizer nothing to differentiate.
