Recently, I have been using several dense layers for a regression task.
I use the SELU activation with `lecun_normal` initialization for each dense layer.
However, when I add an AlphaDropout
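For reference, a minimal sketch of the setup described so far, assuming TensorFlow/Keras; the layer widths, input size, and dropout rate are illustrative assumptions, not values from the question:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical regression model: SELU + lecun_normal, as in the question.
# AlphaDropout is the dropout variant designed to preserve SELU's
# self-normalizing property (standard Dropout would break it).
model = keras.Sequential([
    keras.Input(shape=(10,)),                     # assumed input dimension
    layers.Dense(64, activation="selu",
                 kernel_initializer="lecun_normal"),
    layers.AlphaDropout(0.1),                     # assumed dropout rate
    layers.Dense(64, activation="selu",
                 kernel_initializer="lecun_normal"),
    layers.Dense(1),                              # linear output for regression
])
model.compile(optimizer="adam", loss="mse")

# Sanity check: a batch of 2 samples maps to 2 scalar predictions.
out = model(np.zeros((2, 10), dtype="float32"))
```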