Describe different regularization methods, such as L1 and L2 regularization.

1 Answer


Both L1 and L2 regularization are methods used to reduce overfitting to the training data. Ordinary least squares minimizes only the sum of the squared residuals, which can produce a model with low bias but high variance.

L2 regularization, also called ridge regression, minimizes the sum of the squared residuals plus lambda times the slope squared (for multiple features, lambda times the sum of the squared coefficients). This additional term is called the ridge regression penalty. It increases the bias of the model, making the fit slightly worse on the training data, but decreases the variance, which typically improves performance on unseen data.
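As a minimal sketch of the idea above, ridge regression has a closed-form solution: the ridge penalty simply adds lambda to the diagonal of the normal equations. The data and the `ridge_fit` helper below are hypothetical, just to illustrate the shrinkage:

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Solve min_w ||y - X w||^2 + lam * ||w||^2 in closed form."""
    n_features = X.shape[1]
    # Normal equations with the ridge penalty added to the diagonal.
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=50)

w_ols = ridge_fit(X, y, lam=0.0)     # plain least squares
w_ridge = ridge_fit(X, y, lam=10.0)  # coefficients shrunk toward zero
```

With `lam=0.0` this reduces to ordinary least squares; raising `lam` shrinks the coefficient vector toward zero, trading a little bias for lower variance.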

If you take the ridge regression penalty and replace the squared slope with the absolute value of the slope, you get lasso regression, or L1 regularization. Unlike ridge, lasso can shrink coefficients all the way to exactly zero, so it also performs a form of feature selection.
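The zeroing-out behavior of the L1 penalty can be sketched with the soft-thresholding operator, which is the exact lasso solution in the special case of an orthonormal design (the coefficient values here are made up for illustration):

```python
import numpy as np

def soft_threshold(w, lam):
    """Lasso shrinkage for an orthonormal design: pull each coefficient
    toward zero by lam, snapping anything smaller than lam to exactly 0."""
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

w_ols = np.array([2.0, -0.3, 0.05])   # hypothetical least-squares coefficients
w_lasso = soft_threshold(w_ols, lam=0.5)
# the two small coefficients become exactly zero; the large one shrinks by lam
```

Ridge, by contrast, only scales coefficients down and never sets them exactly to zero.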

L2 is less robust to outliers, but it always has a stable, unique solution. L1 is more robust, but its solution can be unstable and is not always unique.
