“Could not interpret optimizer identifier” error in Keras

Backend · Open · 14 answers · 1507 views
Asked by 别跟我提以往, 2021-02-03 19:47

I got this error when I tried to modify the learning rate parameter of the SGD optimizer in Keras. Did I miss something in my code, or was my Keras not installed properly?

14 Answers
  • 2021-02-03 19:54

    I tried the following and it worked for me:

    from keras import optimizers

    sgd = optimizers.SGD(lr=0.01)
    model.compile(loss='mean_squared_error', optimizer=sgd)

  • 2021-02-03 19:58

    Running the Keras documentation example https://keras.io/examples/cifar10_cnn/ with the latest Keras and TensorFlow versions installed

    (at the time of this writing, tensorflow 2.0.0a0 and Keras version 2.2.4),

    I had to import the optimizer the example uses explicitly. Specifically, the line near the top of the example:

    opt = tensorflow.keras.optimizers.rmsprop(lr=0.0001, decay=1e-6)
    

    was replaced by

    from tensorflow.keras.optimizers import RMSprop
    
    opt = RMSprop(lr=0.0001, decay=1e-6)
    

    In recent versions the API "broke": in a lot of cases keras.stuff became tensorflow.keras.stuff.

  • 2021-02-03 20:00

    I am a bit late here, but your issue is that you have mixed the tensorflow.keras and keras APIs in your code. The optimizer and the model must come from the same package. Use the Keras API for everything, as below:

    from keras.models import Sequential
    from keras.layers import Dense, Dropout, LSTM, BatchNormalization
    from keras.callbacks import TensorBoard
    from keras.callbacks import ModelCheckpoint
    from keras.optimizers import Adam
    
    # Set Model
    model = Sequential()
    model.add(LSTM(128, input_shape=(train_x.shape[1:]), return_sequences=True))
    model.add(Dropout(0.2))
    model.add(BatchNormalization())
    
    # Set Optimizer
    opt = Adam(lr=0.001, decay=1e-6)
    
    # Compile model
    model.compile(
        loss='sparse_categorical_crossentropy',
        optimizer=opt,
        metrics=['accuracy']
    )
    

    I have used Adam in this example. Substitute your optimizer of choice in the same way.

    Hope this helps.
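
    The reason mixing the two packages fails can be sketched in plain Python: each package recognizes only optimizer objects that inherit from its own base class. The class and function names below are invented for illustration and are not the real Keras internals:

    ```python
    # Toy sketch of why mixing the keras and tf.keras namespaces fails:
    # each side accepts only optimizers derived from its OWN base class.
    # (Illustrative names only, not the actual Keras implementation.)

    class KerasOptimizer:       # stands in for keras.optimizers.Optimizer
        pass

    class TFKerasOptimizer:     # stands in for tf.keras.optimizers.Optimizer
        pass

    class KerasAdam(KerasOptimizer):
        pass

    def tf_keras_compile(optimizer):
        """Mimics the identifier check inside a tf.keras-side compile()."""
        if isinstance(optimizer, TFKerasOptimizer):
            return "compiled"
        raise ValueError(f"Could not interpret optimizer identifier: {optimizer!r}")

    # A keras-package optimizer fails the tf.keras-side isinstance check:
    try:
        tf_keras_compile(KerasAdam())
    except ValueError as e:
        print("raised:", e)
    ```

    Keeping every import from one package, as in the answer above, makes the isinstance check and the object agree.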

  • 2021-02-03 20:00

    This problem is mainly caused by version mismatches: the tensorflow.keras version may not be the same as the standalone keras version, causing the error mentioned by @Priyanka.

    For me, whenever this error arises, I pass the name of the optimizer as a string and let the backend figure it out. For example, instead of

    tf.keras.optimizers.Adam
    

    or

    keras.optimizers.Adam
    

    I do

    model.compile(optimizer= 'adam' , loss= keras.losses.binary_crossentropy, metrics=['accuracy'])
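
    Passing a string works because the backend looks the name up in its own registry and instantiates the matching class itself, so no cross-package object is ever involved. A rough sketch of that lookup, in the spirit of keras.optimizers.get() but with an invented registry:

    ```python
    # Rough sketch of string-based optimizer resolution. The registry and
    # classes are invented for illustration; the real lookup lives in
    # keras.optimizers.get().

    class SGD:
        pass

    class Adam:
        pass

    _REGISTRY = {"sgd": SGD, "adam": Adam}

    def get_optimizer(identifier):
        if isinstance(identifier, str):
            cls = _REGISTRY.get(identifier.lower())
            if cls is None:
                raise ValueError(f"Could not interpret optimizer identifier: {identifier!r}")
            return cls()                  # the backend instantiates it itself
        if isinstance(identifier, (SGD, Adam)):
            return identifier             # already an instance: used as-is
        raise ValueError(f"Could not interpret optimizer identifier: {identifier!r}")

    print(type(get_optimizer("adam")).__name__)   # prints: Adam
    ```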
    
  • 2021-02-03 20:00

    Just pass the optimizer name as a string:

    optimizer='sgd'    # or 'rmsprop', etc.
    
  • 2021-02-03 20:01

    In my case it was because I missed the parentheses. I am using tensorflow_addons, so my code was:

    model.compile(optimizer=tfa.optimizers.LAMB, loss='binary_crossentropy',
                  metrics=['binary_accuracy'])
    

    And it gives

    ValueError: ('Could not interpret optimizer identifier:', <class 'tensorflow_addons.optimizers.lamb.LAMB'>)

    Then I changed my code into:

    model.compile(optimizer=tfa.optimizers.LAMB(), loss='binary_crossentropy',
                  metrics=['binary_accuracy'])
    

    and it works.
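
    The missing parentheses matter because tfa.optimizers.LAMB without them is the class object itself, not an optimizer instance, so the backend's type check rejects it. A minimal reproduction of the trap with a stand-in class (no TensorFlow needed; names are invented):

    ```python
    # Minimal reproduction of the class-vs-instance trap, using a stand-in
    # Optimizer class instead of tensorflow_addons.

    class Optimizer:
        pass

    class LAMB(Optimizer):
        pass

    def check_optimizer(opt):
        """Accepts optimizer INSTANCES, rejects everything else."""
        if isinstance(opt, Optimizer):
            return "ok"
        raise ValueError(f"Could not interpret optimizer identifier: {opt!r}")

    check_optimizer(LAMB())       # instance: accepted
    try:
        check_optimizer(LAMB)     # bare class: rejected, like the error above
    except ValueError as e:
        print("raised:", e)
    ```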
