2) Reduce overfitting: Feature reduction and Dropouts

This is Part 2 of our article on how to reduce overfitting. If you missed Part 1, you can check it out here.

a. Feature Reduction:

Feature reduction, i.e., reducing the number of features, is also termed Dimensionality Reduction.

  • One of the techniques to improve the performance of a machine learning model is to select the right features.
  • The idea is to remove all features that don’t add any information. If two variables are highly correlated, for example, it is better to remove one of them.
  • If a feature has too low a variance, it carries little information about what we are studying but can still distort the results.
  • In this way, we simplify our data as much as possible, improve the performance of the model, and reduce the risk of overfitting. A sketch of these two checks follows the figure below.
Figure: Feature reduction (reducing the number of features)
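As a minimal sketch of the variance and correlation checks above (assuming pandas and scikit-learn; the threshold values and the helper name reduce_features are illustrative):

```python
import pandas as pd
from sklearn.feature_selection import VarianceThreshold

def reduce_features(df: pd.DataFrame,
                    corr_threshold: float = 0.9,
                    var_threshold: float = 0.01) -> pd.DataFrame:
    """Drop near-constant features, then one of each highly correlated pair."""
    # 1. Remove features whose variance is below the threshold.
    selector = VarianceThreshold(threshold=var_threshold)
    selector.fit(df)
    df = df.loc[:, selector.get_support()]

    # 2. Remove one feature from each pair whose absolute
    #    correlation exceeds the threshold.
    corr = df.corr().abs()
    cols = corr.columns
    to_drop = set()
    for i in range(len(cols)):
        for j in range(i + 1, len(cols)):
            if corr.iloc[i, j] > corr_threshold:
                to_drop.add(cols[j])
    return df.drop(columns=to_drop)
```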

One way to do this is to train the model several times: each time, we remove one of the features and study the impact on the model’s performance. This technique is only practical on data with a small number of features, as shown in the sketch below.
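A minimal sketch of this leave-one-feature-out loop (assuming scikit-learn and a pandas DataFrame of features; the random forest model and 5-fold cross-validation are illustrative choices):

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def leave_one_feature_out(X, y):
    """Retrain the model once per feature, dropping that feature each time.

    X is assumed to be a pandas DataFrame, y the target labels.
    """
    baseline = cross_val_score(RandomForestClassifier(), X, y, cv=5).mean()
    for col in X.columns:
        score = cross_val_score(RandomForestClassifier(),
                                X.drop(columns=[col]), y, cv=5).mean()
        # A score that stays flat (or improves) suggests the feature adds little.
        print(f"without {col}: {score:.3f} (baseline {baseline:.3f})")
```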

On datasets that have too many features, we will instead need dimensionality reduction methods.
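Principal Component Analysis (PCA) is a common choice here: it projects the data onto a smaller number of dimensions. A minimal sketch (assuming scikit-learn; the synthetic data and the 95% variance target are illustrative):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X = np.random.randn(500, 200)  # hypothetical dataset: 500 samples, 200 features

# Standardize first: PCA is sensitive to feature scale.
X_scaled = StandardScaler().fit_transform(X)

# Keep as many components as needed to explain 95% of the variance.
pca = PCA(n_components=0.95)
X_reduced = pca.fit_transform(X_scaled)
print(X.shape, "->", X_reduced.shape)
```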

b. Dropout:

Let me give you an example: suppose you are in a group of 5 people, A1, A2, A3, A4, and A5. Your instructor asks you all to prepare a speech and present it in front of the whole school.

Now you, A1, are leading the group, and A3 is your best friend. You’ll be biased and give most of the speech to A3, right? Hence, A3 will have far more influence on the outcome of the speech than A2, A4, and A5. This is overfitting!

How? If A3 is not feeling well on the day of the presentation, everyone faces the consequences, despite the fact that every other member is healthy. Do you see? One member’s performance plays an outsized role in the overall result.

This is where we use Dropout, and you should now have some intuition about how it could help. What we basically do is randomly turn off some neurons during each training pass and let the others learn and adapt.

With Dropout, the training process essentially drops out neurons in a neural network. They are temporarily removed from the network, which can be visualized as follows:

Figure: Example of Dropout and its impact on a neural network

Note that the connections, or synapses, are removed as well, and hence no data flows through these neurons anymore.
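As a minimal sketch of Dropout in practice (assuming PyTorch; the layer sizes and the 0.5 drop probability are illustrative):

```python
import torch
import torch.nn as nn

# A small fully connected network with Dropout between layers.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # randomly zeroes 50% of activations during training
    nn.Linear(256, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(64, 10),
)

model.train()  # Dropout active: each forward pass drops a different subset
x = torch.randn(32, 784)
out_train = model(x)

model.eval()   # Dropout disabled: all neurons participate at evaluation time
out_eval = model(x)
```

Note that the layer only drops neurons in train() mode; in eval() mode the full network is used, so no manual switching of the architecture is needed.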

For more insights on the above example of Dropout, please refer to this article.

In this way, we help the model generalize better and avoid overfitting.
