What if I don't apply an activation function to some layers in a neural network? How will it affect the model? Take, for instance, the following code snippet: