NaN losses using "Learning Rate Step Decay" Scheduler with Adam Optimizer in Keras?


I have this very deep model:

from tensorflow import keras
from tensorflow.keras import layers

def get_model2(mask_kind):

    decay = 0.0

    inp_1 = keras.Input(shape=(64, 101, 1), name="RST_inputs")
    x = layers.Conv2D(256,  # the snippet is cut off here in the original post
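The snippet breaks off before the part the title asks about: the step-decay learning-rate schedule used together with Adam. For reference, a minimal sketch of that kind of setup in Keras follows; the schedule function, its constants, and the toy model are illustrative assumptions, not code taken from the post.

import numpy as np
from tensorflow import keras

initial_lr = 1e-3   # starting learning rate (assumed value)
drop_rate = 0.5     # multiply the LR by this factor...
epochs_drop = 10    # ...every this many epochs

def step_decay(epoch, lr):
    # Recompute the LR from the epoch index so the decay does not compound
    # across calls; `lr` (the optimizer's current value) is intentionally unused.
    return initial_lr * (drop_rate ** np.floor(epoch / epochs_drop))

lr_callback = keras.callbacks.LearningRateScheduler(step_decay, verbose=1)

# Toy stand-in for the deep model in the question.
model = keras.Sequential([
    keras.Input(shape=(10,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer=keras.optimizers.Adam(learning_rate=initial_lr), loss="mse")

x_train = np.random.rand(128, 10).astype("float32")
y_train = np.random.rand(128, 1).astype("float32")
model.fit(x_train, y_train, epochs=5, callbacks=[lr_callback], verbose=0)

With LearningRateScheduler, the schedule is evaluated at the start of every epoch and its return value overrides whatever learning rate Adam currently holds.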