42
votes

I got this error when I tried to modify the learning rate parameter of the SGD optimizer in Keras. Did I miss something in my code, or was my Keras not installed properly?

Here is my code:

from tensorflow.python.keras.models import Sequential
from tensorflow.python.keras.layers import Dense, Flatten, GlobalAveragePooling2D, Activation
import keras
from keras.optimizers import SGD

model = Sequential()
model.add(Dense(64, kernel_initializer='uniform', input_shape=(10,)))
model.add(Activation('softmax'))
model.compile(loss='mean_squared_error', optimizer=SGD(lr=0.01), metrics=['accuracy'])

and here is the error message:

Traceback (most recent call last):
  File "C:\TensorFlow\Keras\ResNet-50\test_sgd.py", line 10, in <module>
    model.compile(loss='mean_squared_error', optimizer=SGD(lr=0.01), metrics=['accuracy'])
  File "C:\Users\nsugiant\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\models.py", line 787, in compile
    **kwargs)
  File "C:\Users\nsugiant\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\engine\training.py", line 632, in compile
    self.optimizer = optimizers.get(optimizer)
  File "C:\Users\nsugiant\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\optimizers.py", line 788, in get
    raise ValueError('Could not interpret optimizer identifier:', identifier)
ValueError: ('Could not interpret optimizer identifier:', )


15 Answers

57
votes

I recently faced a similar problem.

The reason is that you are using the tensorflow.python.keras API for the model and layers, but keras.optimizers for SGD. These are two different Keras implementations: the one bundled with TensorFlow and the standalone keras package. They cannot work together; you have to switch everything to one version, and then it should work. :)

Hope this helps.
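As a minimal sketch of that fix (assuming you standardize on the tensorflow.keras side, and using the newer learning_rate argument name), the question's code would become:

```python
# Sketch of the fix: every import comes from tensorflow.keras,
# nothing from the standalone keras package.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Activation
from tensorflow.keras.optimizers import SGD

model = Sequential()
model.add(Dense(64, kernel_initializer='uniform', input_shape=(10,)))
model.add(Activation('softmax'))

# SGD now comes from the same Keras implementation as the model,
# so optimizers.get() can interpret it.
model.compile(loss='mean_squared_error',
              optimizer=SGD(learning_rate=0.01),
              metrics=['accuracy'])
```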

18
votes

I am a bit late here, but your issue is that you have mixed the TensorFlow Keras API and the standalone Keras API in your code. The optimizer and the model must come from the same Keras package. Use the Keras API for everything, as below:

from keras.models import Sequential
from keras.layers import Dense, Dropout, LSTM, BatchNormalization
from keras.callbacks import TensorBoard
from keras.callbacks import ModelCheckpoint
from keras.optimizers import Adam

# Set Model
model = Sequential()
model.add(LSTM(128, input_shape=train_x.shape[1:], return_sequences=True))  # train_x: your training data array
model.add(Dropout(0.2))
model.add(BatchNormalization())

# Set Optimizer
opt = Adam(lr=0.001, decay=1e-6)

# Compile model
model.compile(
    loss='sparse_categorical_crossentropy',
    optimizer=opt,
    metrics=['accuracy']
)

I have used Adam in this example. Substitute the optimizer you need, following the same pattern as above.

Hope this helps.

10
votes

This problem is mainly caused by version differences: the tensorflow.keras version may not be the same as the standalone keras version, which causes the error mentioned by @Priyanka.

For me, whenever this error arises, I pass the optimizer's name as a string and let the backend resolve it. For example, instead of

tf.keras.optimizers.Adam

or

keras.optimizers.Adam

I do

model.compile(optimizer='adam', loss=keras.losses.binary_crossentropy, metrics=['accuracy'])
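For instance, a small self-contained sketch (the tiny model here is only an illustration):

```python
import tensorflow as tf

# A throwaway one-layer model, just to demonstrate compile().
model = tf.keras.Sequential([
    tf.keras.layers.Dense(1, activation='sigmoid', input_shape=(4,)),
])

# The string 'adam' is resolved by whichever Keras backend compiles the
# model, so no optimizer class from a mismatched package is ever passed in.
model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])
```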
5
votes
Since TensorFlow 2.0, the Keras API is available directly via tensorflow:

from tensorflow.keras.optimizers import SGD

This works well. The solution was verified with tensorflow==2.2.0rc2, Keras==2.2.4 (on Win10).

Please also note that this version uses learning_rate as the parameter name, no longer lr.
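A short sketch of the renamed argument (the values here are arbitrary):

```python
from tensorflow.keras.optimizers import SGD

# TF 2.x spells the argument learning_rate; the old lr spelling is
# deprecated and rejected in later releases.
opt = SGD(learning_rate=0.01, momentum=0.9)
```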

4
votes

For some libraries (e.g. keras_radam) you'll need to set an environment variable before the import:

import os
os.environ['TF_KERAS'] = '1'

import tensorflow
import your_library
3
votes

Running the Keras documentation example https://keras.io/examples/cifar10_cnn/ with the latest keras and tensorflow versions installed

(at the time of this writing, tensorflow 2.0.0a0 and Keras 2.2.4)

I had to explicitly import the optimizer the example uses. Specifically, the line near the top of the example:

opt = tensorflow.keras.optimizers.rmsprop(lr=0.0001, decay=1e-6)

was replaced by

from tensorflow.keras.optimizers import RMSprop

opt = RMSprop(lr=0.0001, decay=1e-6)

In recent versions the API "broke", and in a lot of cases keras.stuff became tensorflow.keras.stuff.

2
votes

I tried the following and it worked for me:

from keras import optimizers

sgd = optimizers.SGD(lr=0.01)

model.compile(loss='mean_squared_error', optimizer=sgd)

2
votes

In my case it was because I missed the parentheses. I am using tensorflow_addons, so my code was:

model.compile(optimizer=tfa.optimizers.LAMB, loss='binary_crossentropy',
              metrics=['binary_accuracy'])

And it gives

ValueError: ('Could not interpret optimizer identifier:', <class 'tensorflow_addons.optimizers.lamb.LAMB'>)

Then I changed my code to:

model.compile(optimizer=tfa.optimizers.LAMB(), loss='binary_crossentropy',
              metrics=['binary_accuracy'])

and it works.

2
votes

Use one style within one kernel; try not to mix

from keras.optimizers import sth

with

from tensorflow.keras.optimizers import sth

1
votes

Use

from tensorflow.keras import optimizers

instead of

from keras import optimizers

0
votes

Try changing your import lines to

from keras.models import Sequential
from keras.layers import Dense, ...

Your imports seem a little strange to me. Maybe you could elaborate more on that.

0
votes

Just pass the optimizer name as a string:

optimizer='sgd'

or

optimizer='rmsprop'
0
votes

I had a misplaced parenthesis and got this error.

Initially it was

x=Conv2D(filters[0],(3,3),use_bias=False,padding="same",kernel_regularizer=l2(reg),x))

The corrected version was

x=Conv2D(filters[0],(3,3),use_bias=False,padding="same",kernel_regularizer=l2(reg))(x)
0
votes

I got the same error message and resolved this issue, in my case, by replacing the assignment of optimizer:

optimizer=keras.optimizers.Adam

with its instance instead of the class itself:

optimizer=keras.optimizers.Adam()
0
votes

I tried everything in this thread, but nothing worked; here is what fixed it for me. Referencing the optimizer class, i.e. tensorflow.keras.optimizers.Adam, caused the error, but calling it to get an instance, i.e. tensorflow.keras.optimizers.Adam(), worked. So my code looks like:

model.compile(
    loss=tensorflow.keras.losses.categorical_crossentropy,
    optimizer=tensorflow.keras.optimizers.Adam()
)

Looking at the tensorflow GitHub issues, I am not the only one with this error for whom passing an instance rather than the class fixed it.