Describe different regularization methods, such as L1 and L2 regularization.

1 Answer


Both L1 and L2 regularization are methods used to reduce overfitting of the training data. Ordinary least squares minimizes the sum of the squared residuals, which can result in low bias but high variance.

L2 regularization, also called ridge regression, minimizes the sum of the squared residuals plus lambda times the slope squared (with multiple features, lambda times the sum of the squared coefficients). This additional term is called the ridge regression penalty. It increases the bias of the model, making the fit slightly worse on the training data, but in exchange it decreases the variance.
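As a minimal sketch of this idea, ridge regression has a closed-form solution: the least-squares normal equations with lambda added to the diagonal. The data below is a made-up toy example, not from the original answer.

```python
import numpy as np

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: w = (X^T X + lam * I)^(-1) X^T y.
    # lam = 0 recovers ordinary least squares.
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

# Toy data: y = 2x + small noise
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 1))
y = 2 * X[:, 0] + rng.normal(scale=0.1, size=50)

w_ols = ridge_fit(X, y, lam=0.0)     # plain least squares
w_ridge = ridge_fit(X, y, lam=10.0)  # penalty shrinks the slope toward 0
```

Increasing `lam` pulls the fitted slope further below the least-squares estimate, which is exactly the bias-for-variance trade described above.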

If you replace the squared slope in the ridge regression penalty with the absolute value of the slope, you get Lasso regression, or L1 regularization.

L2 is less robust to outliers, but it is computationally stable and always has exactly one solution. L1 is more robust, but its solution can be unstable and there may be multiple solutions. A further practical difference: because the L1 penalty can shrink coefficients to exactly zero, lasso performs built-in feature selection, whereas ridge only shrinks coefficients toward zero without eliminating them.
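This shrink-versus-eliminate difference is easiest to see in one dimension. Under a common normalization (a single standardized feature), the two penalized solutions have simple closed forms, sketched below; the exact constants depend on how the objective is scaled, so treat this as illustrative.

```python
def l2_shrink(b, lam):
    # One-dimensional ridge solution: proportional shrinkage,
    # never exactly zero for nonzero b.
    return b / (1 + lam)

def l1_shrink(b, lam):
    # One-dimensional lasso solution: soft-thresholding,
    # hits exactly zero once the penalty exceeds |b|.
    sign = 1 if b > 0 else -1
    return sign * max(abs(b) - lam, 0.0)

small = 0.3  # a weak least-squares coefficient
ridge_coef = l2_shrink(small, 0.5)  # 0.2 -- reduced but still in the model
lasso_coef = l1_shrink(small, 0.5)  # 0.0 -- dropped from the model entirely
```

The same lambda leaves the ridge coefficient small but nonzero while the lasso zeroes it out, matching the robustness and sparsity contrast described above.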
