Convert sklearn.svm SVC classifier to Keras implementation

难免孤独 2021-01-01 19:35

I'm trying to convert some old code from using sklearn to a Keras implementation. Since it is crucial to maintain the same way of operation, I want to understand if I'm doing it correctly.
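
For reference, the sklearn side being replaced presumably looks something like the minimal sketch below; the linear kernel, parameters, and variable names (X_train, y_train, X_test, y_test) are assumptions, not taken from the question.

    from sklearn.svm import SVC  # the classifier named in the question title

    clf = SVC(kernel='linear', C=1.0)   # linear kernel assumed, to match the linear Keras head below
    clf.fit(X_train, y_train)           # X_train: (n_samples, n_features), y_train: class labels
    print(clf.score(X_test, y_test))    # mean accuracy on held-out data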

2 Answers
  • 2021-01-01 20:33

    If you are using Keras 2.0, then you need to change the following line from anand v sing's answer:

    W_regularizer -> kernel_regularizer

    Github link

    # final layer: L2-regularized linear outputs, one score per class (SVM-style)
    model.add(Dense(nb_classes, kernel_regularizer=regularizers.l2(0.0001)))
    model.add(Activation('linear'))
    # squared hinge loss stands in for the SVM objective
    model.compile(loss='squared_hinge',
                  optimizer='adadelta', metrics=['accuracy'])
    

    Or you can use the functional API as follows:

    top_model = bottom_model.output
    top_model = Flatten()(top_model)
    top_model = Dropout(0.5)(top_model)
    top_model = Dense(64, activation='relu')(top_model)
    # SVM-style head: L2-regularized linear scores for the 2 classes
    top_model = Dense(2, kernel_regularizer=l2(0.0001))(top_model)
    top_model = Activation('linear')(top_model)

    model = Model(bottom_model.input, top_model)
    model.compile(loss='squared_hinge',
                  optimizer='adadelta', metrics=['accuracy'])
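
    One detail both snippets leave open is the target format: Keras' hinge and squared_hinge losses expect targets of -1/+1 rather than plain class indices. The sketch below is an assumption about how labels could be prepared for the two-class model above; the helper name and the variables y_train/X_train are hypothetical.

    import numpy as np

    def to_hinge_targets(y, num_classes):
        # map integer class labels to an (n_samples, num_classes) matrix of -1/+1
        targets = -np.ones((len(y), num_classes), dtype='float32')
        targets[np.arange(len(y)), y] = 1.0
        return targets

    Y_hinge = to_hinge_targets(y_train, num_classes=2)   # y_train: integer labels (hypothetical)
    model.fit(X_train, Y_hinge, epochs=10, batch_size=32)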
      
    
    
  • 2021-01-01 20:36

    If you are making a classifier, you need the squared_hinge loss together with a regularizer to get the complete SVM loss function, as can be seen here. So you will also need to split your last layer into a Dense layer carrying the regularization parameter followed by a separate activation; I have added the code below.

    These changes should give you the output you are looking for:

    from keras.regularizers import l2
    from keras.models import Sequential
    from keras.layers import Dense, Activation

    model = Sequential()
    model.add(Dense(64, activation='relu'))
    # W_regularizer is the Keras 1 name; it goes inside the Dense layer
    model.add(Dense(1, W_regularizer=l2(0.01)))
    model.add(Activation('softmax'))
    model.compile(loss='squared_hinge',
                  optimizer='adadelta',
                  metrics=['accuracy'])
    model.fit(X, Y_labels)
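
    To make the "complete SVM loss function" claim concrete, here is a rough numpy sketch (not part of the original answer) of the objective this setup minimizes: the mean squared hinge loss plus an L2 penalty on the output-layer weights. The names scores, W and lam are illustrative only.

    import numpy as np

    def svm_like_objective(y_true, scores, W, lam=0.01):
        # y_true in {-1, +1}, scores = raw linear outputs, W = output-layer weight matrix
        squared_hinge = np.mean(np.maximum(0.0, 1.0 - y_true * scores) ** 2)
        l2_penalty = lam * np.sum(W ** 2)
        return squared_hinge + l2_penalty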
    
    

    The hinge loss is also implemented in Keras for binary classification, so if you are working on a binary classification model, use the code below.

    from keras.regularizers import l2
    from keras.models import Sequential
    from keras.layers import Dense, Activation

    model = Sequential()
    model.add(Dense(64, activation='relu'))
    # single raw score with L2-regularized weights, as in a linear SVM
    model.add(Dense(1, W_regularizer=l2(0.01)))
    model.add(Activation('linear'))
    model.compile(loss='hinge',
                  optimizer='adadelta',
                  metrics=['accuracy'])
    model.fit(X, Y_labels)
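
    One caveat worth adding: the hinge loss in Keras expects -1/+1 targets, so 0/1 labels would be remapped before fitting, and the predicted class comes from the sign of the raw margin, just as with an SVM decision function. A minimal sketch, assuming Y_labels holds 0/1 values and X_new is hypothetical new data:

    import numpy as np

    y_pm1 = 2 * Y_labels - 1                  # map 0/1 labels to -1/+1 for the hinge loss
    model.fit(X, y_pm1)

    margins = model.predict(X_new)            # raw scores from the linear output layer
    predicted_class = (margins > 0).astype(int).ravel()   # back to 0/1 labels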
    
    

    If you cannot understand the article or have issues with the code, feel free to comment. I had this same issue a while back, and this GitHub thread helped me understand it; it may be worth going through as well, since some of the ideas here come directly from it: https://github.com/keras-team/keras/issues/2588
