What is the use of dropout layers in neural networks?


A dropout rate of 0.2 means that, at each training step, a random 20% of the units in the layer the dropout is applied to are temporarily set to zero, along with their contributions to the next layer.

So each training iteration sees a different random 80% of the units active.

We use dropout to reduce overfitting: since no single unit can rely on any particular other unit being present, the network is pushed to learn more robust, redundant features.

But at test time no units are dropped: all units and their connections are used, and the activations are scaled so that their expected magnitude matches what the next layer saw during training.
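The train/test behavior described above can be sketched in plain Python. This is an illustrative implementation of "inverted" dropout (the variant most libraries use), not any particular framework's API: survivors are scaled by 1/(1-rate) during training so that nothing needs to change at test time.

```python
import random

def dropout(activations, rate=0.2, training=True):
    """Inverted dropout on a list of activations.

    During training, each unit is zeroed with probability `rate`,
    and surviving units are scaled by 1 / (1 - rate) so the
    expected activation stays the same. At test time (training=False),
    activations pass through unchanged -- all units are active.
    """
    if not training:
        return list(activations)
    keep = 1.0 - rate
    return [a / keep if random.random() < keep else 0.0
            for a in activations]
```

For example, with `rate=0.2` roughly 20% of the outputs are zero on each call during training, and the surviving values are scaled up to 1.25x; calling with `training=False` returns the activations untouched, which is exactly the test-time behavior described above.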
