Adaptive LightGBM
A LightGBM implementation that handles concept drift, based on Adaptive XGBoost for Evolving Data Streams¹.
Parameters
- `n_classes` (int) → The `num_class` parameter from XGBoost.
- `learning_rate` (Default: `0.3`) → The `eta` parameter from XGBoost.
- `max_depth` (Default: `6`) → The `max_depth` parameter from XGBoost.
- `max_window_size` (Default: `1000`) → Maximum window size for drift detection.
- `min_window_size` (Default: `0`) → Minimum window size for drift detection.
- `max_buffer` (Default: `5`) → Number of buffers after which to stop growing the ensemble and start replacing models.
- `pre_train` (Default: `2`) → Number of buffers to wait before the first XGBoost training.
- `detect_drift` (Default: `True`) → If set, uses a drift detector (ADWIN).
- `use_updater` (Default: `True`) → Uses the `refresh` updater for XGBoost.
- `trees_per_train` (Default: `1`) → The number of trees for each training run.
Example Usage
We can create an instance of the AdaptiveLGBM model like this:

```python
import turboml as tb

model = tb.AdaptiveLGBM(n_classes=2)
```
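The `detect_drift` option relies on ADWIN, which watches the stream of prediction errors and signals drift when two sub-windows of recent errors have significantly different means. The toy detector below is a deliberately simplified stand-in (it is NOT ADWIN — real ADWIN uses adaptive cut points and a Hoeffding-style bound, and the class name and threshold here are invented for illustration), but it shows the role of `max_window_size` and `min_window_size` and why the window is dropped when drift fires.

```python
from collections import deque

# Toy drift detector: flags drift when the mean error of the newer half of
# the window differs from the older half by more than a fixed threshold.
class ToyDriftDetector:
    def __init__(self, max_window_size=1000, min_window_size=30, threshold=0.3):
        self.window = deque(maxlen=max_window_size)  # bounded error window
        self.min_window_size = min_window_size       # don't test tiny windows
        self.threshold = threshold                   # invented, not ADWIN's bound

    def update(self, error):
        """Add a 0/1 prediction error; return True if drift is detected."""
        self.window.append(error)
        n = len(self.window)
        if n < self.min_window_size:
            return False
        half = n // 2
        old = sum(list(self.window)[:half]) / half
        new = sum(list(self.window)[half:]) / (n - half)
        if abs(new - old) > self.threshold:
            self.window.clear()  # on drift, discard the stale window
            return True
        return False

det = ToyDriftDetector(min_window_size=10, threshold=0.3)
stream = [0] * 50 + [1] * 50  # error rate jumps: simulated concept drift
drift_at = next(i for i, e in enumerate(stream) if det.update(e))
print(drift_at)
```

In the real model, a drift signal like this is what triggers resetting the window and updating the ensemble, rather than continuing to trust models trained on the old concept.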
Footnotes

1. J. Montiel, R. Mitchell, E. Frank, B. Pfahringer, T. Abdessalem and A. Bifet. "Adaptive XGBoost for Evolving Data Streams."