Question
Update 1
I updated my lr according to the advice "you want to be 10x back from that point, regardless of slope" and set it to max_lr=slice(1e-3, 1e-2).
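For context, my understanding of max_lr=slice(lo, hi) (an assumption on my part, not verified against the fastai source) is that it spreads per-layer-group learning rates between lo for the earliest layers and hi for the head, roughly geometrically. A minimal sketch of that spread:

```python
import numpy as np

def spread_lrs(lo, hi, n_groups):
    """Spread learning rates geometrically from lo (earliest layer group)
    to hi (head), one value per layer group.

    This mimics what I believe slice(lo, hi) does in fastai; the exact
    interpolation fastai uses may differ.
    """
    return np.geomspace(lo, hi, n_groups)

lrs = spread_lrs(1e-3, 1e-2, 3)
print(lrs)  # lowest lr for the earliest layers, highest for the head
```

So with slice(1e-3, 1e-2) the head trains at 1e-2 and the pretrained body at smaller rates.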
And here is what I got
And the plots
What does this mean?
As you can see in the 2nd graph:

1. the loss was very good starting from 1e-08, but I never set my lr to 1e-08, so why do I see this?
2. the loss went up and down between 1e-07 and 1e-04, and eventually it soared to almost 0.05 when the lr came back to around 4e-05. What does this mean? Overfitting? How come the loss looked okay earlier, when the learning rate was around the same value (4e-05)?
3. from the Batches processed / Loss plot, I can see that train_loss and valid_loss tracked each other and looked really good. Does this mean the model was trained well? If it was, why the shoot-up at the end of graph 2?
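Part of what I am trying to understand is the shape of the one-cycle schedule itself: the lr warms up to max_lr and then anneals far below its starting value, which would explain why the loss-vs-lr plot revisits very small lrs late in training. Here is my own rough approximation of such a schedule (cosine interpolation; pct_start, div, and final_div are values I assumed, not fastai's exact implementation):

```python
import math

def one_cycle_lr(step, total_steps, max_lr, pct_start=0.3, div=25.0, final_div=1e4):
    """Approximate one-cycle lr schedule: warm up from max_lr/div to max_lr,
    then anneal down to max_lr/final_div, using cosine interpolation.

    This is only an illustration of the general shape; fastai v1's actual
    annealing functions may differ in detail.
    """
    warm = total_steps * pct_start
    if step < warm:                      # warm-up phase
        t = step / warm
        start, end = max_lr / div, max_lr
    else:                                # annealing phase
        t = (step - warm) / (total_steps - warm)
        start, end = max_lr, max_lr / final_div
    # cosine interpolation from start (t=0) to end (t=1)
    return start + (end - start) * (1 - math.cos(math.pi * t)) / 2

schedule = [one_cycle_lr(s, 100, 1e-2) for s in range(101)]
print(schedule[0], max(schedule), schedule[-1])
```

With max_lr=1e-2 this ends around 1e-6, so the recorder's loss-vs-lr plot would include lrs far below anything I set explicitly.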
I have followed the rule for picking the correct lr, so why doesn't it work? May I conclude that lr_find() does not work properly?
Here is my lr_find() plot
Then, according to its graph, I picked the section with the steepest slope, 1e-2 to 1e-1, as my lr range.
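For reference, my understanding of what lr_find() does (a simplified sketch of the lr-range-test idea, not fastai's actual code) is a short mock-training run with an exponentially increasing lr, recording the loss at each step:

```python
def lr_sweep_values(start_lr=1e-7, end_lr=10.0, num_steps=100):
    """Exponentially spaced learning rates for an lr-range test.

    In the real lr_find(), one mini-batch is trained at each of these
    lrs and the loss is recorded; the plot is loss vs. lr on a log axis.
    start_lr/end_lr/num_steps here are values I assumed.
    """
    ratio = end_lr / start_lr
    return [start_lr * ratio ** (i / (num_steps - 1)) for i in range(num_steps)]

lrs = lr_sweep_values()
print(lrs[0], lrs[-1])  # sweeps from the smallest to the largest lr
```

The "pick the steepest downward slope" advice then refers to the region of this plot where the loss drops fastest.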
Here is the code:
learn.fit_one_cycle(20, max_lr=slice(1e-2,1e-1))
But here is what I got during training
And here are the plots from learn.recorder:
learn.recorder.plot_lr()
learn.recorder.plot()
learn.recorder.plot_losses()
As you can see, the valid_loss gets worse cyclically. So my conclusion is that the lr_find() method doesn't work properly.
How can I verify it?
If you want to see the entire code, here it is; the only difference is that I use to_fp16():
learn = cnn_learner(data, models.resnet50, metrics=error_rate).to_fp16()
Related thread: https://forums.fast.ai/t/train-loss-and-valid-loss-look-very-good-but-predicting-really-bad/60925
Source: https://stackoverflow.com/questions/59549092/fastai-lrfind-method-doesn-t-work-properly