What is meant by 'regularization' in XGBoost and how does it help in preventing overfitting?

1 Answer

Regularization in XGBoost is a technique for preventing overfitting by adding a penalty term to the model's objective function. The penalty discourages overly complex trees and ensures that the model does not rely solely on the most prominent features.
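Concretely, XGBoost minimizes a regularized objective of roughly the following form (per its documentation), where T is the number of leaves in a tree and w_j are the leaf weights:

```latex
\text{Obj} = \sum_{i} l\left(y_i, \hat{y}_i\right) + \sum_{k} \Omega(f_k),
\qquad
\Omega(f) = \gamma T + \tfrac{1}{2}\,\lambda \sum_{j=1}^{T} w_j^{2} + \alpha \sum_{j=1}^{T} \lvert w_j \rvert
```

Here λ and α are the L2 and L1 penalties described below, and γ (min_split_loss) charges a fixed cost for each additional leaf.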

Regularization Parameters in XGBoost

XGBoost offers several hyperparameters for controlling the regularization strength (a short usage sketch follows the list):

L1 Regularization (alpha or reg_alpha): Adds the sum of absolute values of the leaf weights to the cost function, pushing weights toward exactly zero and thereby encouraging sparsity.

L2 Regularization (lambda or reg_lambda): Analogous to weight decay, it adds the sum of squares of the leaf weights to the cost function, shrinking weights smoothly toward zero.


Max Depth (max_depth): Another form of regularization, it limits how deep a tree can grow; shallower trees are simpler and tend to generalize better.

Minimum Child Weight (min_child_weight): Sets the minimum sum of instance weights (hessian) required in a child node; a split is discarded if it would create a child below this threshold, which prevents leaves that fit only a handful of examples.
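A minimal sketch of setting these knobs through XGBoost's scikit-learn wrapper; the synthetic data and the specific parameter values are illustrative assumptions, not tuned recommendations:

```python
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data, purely for illustration
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Regularization parameters discussed above; values are hypothetical examples
model = xgb.XGBClassifier(
    n_estimators=200,
    reg_alpha=0.1,        # L1 penalty on leaf weights (alpha)
    reg_lambda=1.0,       # L2 penalty on leaf weights (lambda)
    max_depth=4,          # cap on tree depth
    min_child_weight=5,   # minimum hessian sum required in a child node
    eval_metric="logloss",
)
model.fit(X_train, y_train, eval_set=[(X_test, y_test)], verbose=False)
print(f"Test accuracy: {model.score(X_test, y_test):.3f}")
```

In practice these values are usually tuned together (e.g. via cross-validation), since a deeper max_depth can often be offset by stronger reg_lambda or a larger min_child_weight.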
...