
Deepika Deepika

Job Interview Skills
English
2 years ago

What is Regularization and what kind of problems does regularization solve?


Abhishek Mishra

2 years ago

Regularization is a technique that encourages the coefficients of a machine learning model towards zero in order to reduce over-fitting. The general idea is to penalize complicated models by adding an extra penalty term to the loss function, so that a more complex model incurs a larger loss. This discourages the model from memorizing too many details of the training data and keeps it more general. There are two common ways of defining the additional penalty term, giving rise to two types of regularization:

- L2 Regularization (Ridge): the penalty is the sum of the squares of the model coefficients.
- L1 Regularization (Lasso): the penalty is the sum of the absolute values of the model coefficients.
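
As a quick illustration, here is a minimal sketch that fits the same noisy regression data with an unregularized model, an L2 (Ridge) model, and an L1 (Lasso) model using scikit-learn. The dataset and the alpha values (alpha is the penalty strength) are made up for this example, not taken from the answer above.

# Sketch: comparing L2 (Ridge) and L1 (Lasso) regularization with scikit-learn.
# The data and alpha values are illustrative assumptions, not fixed recommendations.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Ridge, Lasso

# Noisy regression data where only a few of the 20 features are truly informative.
X, y = make_regression(n_samples=100, n_features=20, n_informative=5,
                       noise=10.0, random_state=0)

models = {
    "No regularization": LinearRegression(),
    "L2 (Ridge)": Ridge(alpha=1.0),   # penalty term: alpha * sum(coef_j ** 2)
    "L1 (Lasso)": Lasso(alpha=1.0),   # penalty term: alpha * sum(|coef_j|)
}

for name, model in models.items():
    model.fit(X, y)
    coefs = model.coef_
    print(f"{name}: largest |coef| = {np.abs(coefs).max():.2f}, "
          f"coefficients exactly zero = {np.sum(coefs == 0)}")

Running this, you would typically see that L2 only shrinks the coefficients towards zero, while L1 drives some of them to exactly zero, which is why Lasso is also used for feature selection.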
