
Adaptive XGBoost

An XGBoost implementation that handles concept drift, based on Adaptive XGBoost for Evolving Data Streams¹.

Parameters

  • n_classes (int) → The num_class parameter from XGBoost.

  • learning_rate (Default: 0.3) → The eta parameter from XGBoost.

  • max_depth (Default: 6) → The max_depth parameter from XGBoost.

  • max_window_size (Default: 1000) → Maximum window size for drift detection.

  • min_window_size (Default: 0) → Minimum window size for drift detection.

  • max_buffer (Default: 5) → Number of buffers after which the ensemble stops growing and starts replacing trees.

  • pre_train (Default: 2) → Number of buffers to accumulate before the first XGBoost training.

  • detect_drift (Default: True) → If True, a drift detector (ADWIN) is used.

  • use_updater (Default: True) → If True, uses the refresh updater for XGBoost.

  • trees_per_train (Default: 1) → The number of trees added per training run.

  • percent_update_trees (Default: 1.0) → The fraction of boosting rounds to be used for updates.
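
The sketch below shows how these parameters map onto the constructor. It is only an illustration: the parameter names are taken from the list above, and the values are arbitrary examples rather than recommendations.

import turboml as tb

# Binary classifier with drift detection enabled and a bounded drift-detection window.
model = tb.AdaptiveXGBoost(
    n_classes=2,
    learning_rate=0.1,
    max_depth=4,
    max_window_size=500,
    min_window_size=100,
    max_buffer=5,
    pre_train=2,
    detect_drift=True,
    use_updater=True,
    trees_per_train=1,
    percent_update_trees=1.0,
)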

Example Usage

We can create an instance of the AdaptiveXGBoost model like this.

import turboml as tb
model = tb.AdaptiveXGBoost(n_classes=2)
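
Deploying the model then follows the same streaming workflow as other TurboML models. The snippet below continues from the instance created above and is only a sketch under assumptions: it presumes datasets named "transactions" and "transaction_labels" have already been registered with the platform, that OnlineDataset.load, get_model_inputs, get_model_labels, and deploy behave as they do in TurboML's quickstart examples, and that the field names are placeholders.

# Assumption: these datasets were registered earlier (e.g. in a quickstart-style setup).
transactions = tb.OnlineDataset.load("transactions")
labels = tb.OnlineDataset.load("transaction_labels")

# Placeholder field names; replace with the dataset's actual columns.
features = transactions.get_model_inputs(numerical_fields=["amount", "velocity"])
label = labels.get_model_labels(label_field="is_fraud")

# Deploy so the model trains and predicts continuously on the stream.
deployed_model = model.deploy(name="adaptive_xgboost_demo", input=features, labels=label)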

Footnotes

  1. J. Montiel, R. Mitchell, E. Frank, B. Pfahringer, T. Abdessalem and A. Bifet. Adaptive XGBoost for Evolving Data Streams.