What is Elastic Net Regularization for Regression?

Most of us know that ML models often tend to overfit the training data for various reasons. This could be due to a lack of sufficient training data, or to the training data not being representative of the data we expect to apply the model to. Either way, the result is that we end up building an overly complex model that fits noise in the training set and generalizes poorly to unseen data.
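Elastic Net tackles this for linear regression by blending the L1 (lasso) and L2 (ridge) penalties in a single objective, so the model gets both sparsity from L1 and weight shrinkage from L2. Here is a minimal sketch using scikit-learn's ElasticNet; the synthetic data and the alpha and l1_ratio values are illustrative assumptions, not tuned recommendations:

```python
# Minimal Elastic Net sketch with scikit-learn.
# alpha and l1_ratio below are illustrative, not tuned values.
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))               # 200 samples, 10 features
true_w = np.array([3.0, -2.0] + [0.0] * 8)   # only 2 informative features
y = X @ true_w + rng.normal(scale=0.5, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# l1_ratio blends the penalties: 1.0 -> pure L1 (lasso), 0.0 -> pure L2 (ridge)
model = ElasticNet(alpha=0.1, l1_ratio=0.5)
model.fit(X_train, y_train)

print("R^2 on test:", model.score(X_test, y_test))
print("coefficients:", model.coef_)  # the L1 part tends to zero out the 8 noise features
```

In scikit-learn's formulation the penalty is alpha * l1_ratio * ||w||_1 + 0.5 * alpha * (1 - l1_ratio) * ||w||_2^2, so l1_ratio=1.0 recovers the lasso and l1_ratio=0.0 the ridge; ElasticNetCV can tune both hyperparameters by cross-validation.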

What are the different ways of preventing overfitting in a deep neural network? Explain the intuition behind each.

- L2 norm regularization: pushes the weights closer to zero, keeping the model simpler and preventing overfitting.
- L1 norm regularization: also pushes the weights toward zero, and additionally induces sparsity, driving many weights to exactly zero. This is a less common form of regularization.
- Dropout regularization: drops some of the hidden units at random during training, so the network cannot overfit by relying too heavily on any single unit and must learn more robust, redundant representations.
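A minimal PyTorch sketch of all three techniques follows; the network shape, penalty strengths, and dropout rate are illustrative assumptions:

```python
# Minimal PyTorch sketch of L2, L1, and dropout regularization.
# Model shape, penalty strengths, and dropout rate are illustrative assumptions.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # dropout: randomly zero half the hidden units each step
    nn.Linear(64, 1),
)

# L2 regularization via weight decay: the optimizer implicitly adds
# lambda * ||w||^2 pressure on the weights at every update.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)
loss_fn = nn.MSELoss()
l1_lambda = 1e-5

X = torch.randn(128, 20)  # dummy batch
y = torch.randn(128, 1)

model.train()             # enables dropout during training
optimizer.zero_grad()
loss = loss_fn(model(X), y)

# L1 regularization added explicitly to the loss: encourages exact zeros.
l1_penalty = sum(p.abs().sum() for p in model.parameters())
loss = loss + l1_lambda * l1_penalty

loss.backward()
optimizer.step()

model.eval()              # disables dropout at inference time
```

Note that weight_decay on the optimizer is the conventional way PyTorch applies the L2 penalty, while the L1 term has to be added to the loss by hand; model.train() and model.eval() toggle dropout on and off.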