Why add a dropout layer leading to large val_loss and val_mae?

感情败类 2020-12-30 16:20

Recently, I have been using several dense layers for a regression task.

I use the SELU activation with the lecun_normal initializer for each dense layer.

However, when I add one AlphaDropout layer, the val_loss and val_mae become large.
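A minimal sketch of the setup being described, with assumed layer sizes and a toy dataset (none of these specifics come from the question): a SELU regression network using lecun_normal initialization, with AlphaDropout rather than standard Dropout, since plain Dropout destroys SELU's self-normalizing property. A common cause of large validation error here is an AlphaDropout rate that is too high; rates around 0.05-0.1 are typically recommended for self-normalizing networks.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

tf.random.set_seed(0)
np.random.seed(0)

# Assumed architecture for illustration: 10 input features, two
# hidden SELU layers, scalar regression output.
model = models.Sequential([
    layers.Input(shape=(10,)),
    layers.Dense(64, activation="selu", kernel_initializer="lecun_normal"),
    layers.AlphaDropout(0.05),  # keep the rate small for SELU nets
    layers.Dense(64, activation="selu", kernel_initializer="lecun_normal"),
    layers.AlphaDropout(0.05),
    layers.Dense(1),  # linear output for regression
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Toy regression data, purely for demonstration.
X = np.random.randn(256, 10).astype("float32")
y = X.sum(axis=1, keepdims=True)
history = model.fit(X, y, validation_split=0.2, epochs=2, verbose=0)
print(sorted(history.history.keys()))
```

If val_loss and val_mae stay large even at low rates, it is also worth checking that inputs are standardized (zero mean, unit variance), since SELU's self-normalization assumes roughly standardized activations.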
