Exercise M6.03 states:
"Both gradient boosting and random forest models improve when increasing the number of trees in the ensemble. However, the scores reach a plateau where adding new trees just makes fitting and scoring slower."
While this statement holds true for random forests, it does not apply to gradient boosting decision trees. In GBDT, adding too many estimators can lead to overfitting. This can be demonstrated by plotting a validation curve for GBDT, similar to the one built for random forests (see attached image). The result actually supports the need for hyperparameter tuning or early stopping, as shown at the end of exercise M6.03.
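For reference, here is a minimal sketch of such a validation curve over `n_estimators`. The dataset (a synthetic one from `make_regression`, not the one used in the exercise), the parameter grid, and the printed summary are all illustrative; only the shape of the train/test gap matters:

```python
# Sketch: validation curve for GBDT over n_estimators.
# Dataset and grid are illustrative, not the exercise's own.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import validation_curve

X, y = make_regression(n_samples=500, n_features=10, noise=20.0, random_state=0)

param_range = np.array([1, 5, 10, 50, 100, 300, 1000])
train_scores, test_scores = validation_curve(
    GradientBoostingRegressor(random_state=0),
    X, y,
    param_name="n_estimators",
    param_range=param_range,
    cv=5,
    n_jobs=2,
)

# Train scores keep improving with more trees while test scores eventually
# degrade: the gap that opens past the optimum is the overfitting at issue.
for n, tr, te in zip(param_range, train_scores.mean(axis=1), test_scores.mean(axis=1)):
    print(f"n_estimators={n:>4}  train R2={tr:.3f}  test R2={te:.3f}")
```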
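And a minimal sketch of the early stopping mentioned above, using scikit-learn's built-in `n_iter_no_change` / `validation_fraction` mechanism (the parameter values here are illustrative, not the ones from the exercise):

```python
# Sketch: built-in early stopping for GBDT; values are illustrative.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=500, n_features=10, noise=20.0, random_state=0)

gbdt = GradientBoostingRegressor(
    n_estimators=1000,        # generous budget; early stopping trims it
    n_iter_no_change=5,       # stop after 5 iterations without improvement
    validation_fraction=0.1,  # held-out fraction used to monitor the score
    random_state=0,
)
gbdt.fit(X, y)
print(f"Trees actually fitted: {gbdt.n_estimators_}")
```

With early stopping enabled, the ensemble stops well short of the full budget once the monitored score stalls, which avoids the overfitting regime the validation curve exposes.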