LeveragingBaggingClassifier
Leveraging Bagging is an improvement over the Oza Bagging algorithm. It leverages the bagging performance by increasing re-sampling: a Poisson distribution is used to simulate the re-sampling process, and Leveraging Bagging uses a higher value of the Poisson parameter w (the average number of events), 6 by default. This assigns a wider range of weights to the data samples and thereby increases the diversity of the input space.
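To make the effect of the larger Poisson parameter concrete, the short NumPy snippet below (illustrative only, not part of the TurboML API) draws per-sample weights from Poisson(1), as used in Oza Bagging, and from Poisson(6), as used in Leveraging Bagging; the latter spreads the weights over a much wider range.
import numpy as np

# Illustration only: per-sample re-sampling weights under the two schemes.
rng = np.random.default_rng(seed=42)
oza_weights = rng.poisson(lam=1, size=10)         # Oza Bagging: mostly 0s, 1s and 2s
leveraging_weights = rng.poisson(lam=6, size=10)  # Leveraging Bagging: values spread around 6
print("Oza Bagging weights:       ", oza_weights)
print("Leveraging Bagging weights:", leveraging_weights)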
To deal with concept drift, Leveraging Bagging uses the ADWIN algorithm to monitor the performance of each member of the ensemble. If concept drift is detected, the worst member of the ensemble (based on the error estimated by ADWIN) is replaced by a new (empty) classifier.
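As a rough sketch of this mechanism, the snippet below is an illustrative re-implementation using the open-source river library (drift.ADWIN and tree.HoeffdingTreeClassifier are assumed to be available there). It is not TurboML's internal code, and it omits the Poisson re-sampling weights for brevity.
from river import drift, tree

n_models = 10
models = [tree.HoeffdingTreeClassifier() for _ in range(n_models)]
detectors = [drift.ADWIN() for _ in range(n_models)]

def update_ensemble(x, y):
    # Feed each member's 0/1 prediction error to its ADWIN detector.
    drift_found = False
    for model, detector in zip(models, detectors):
        error = int(model.predict_one(x) != y)
        detector.update(error)
        model.learn_one(x, y)
        drift_found = drift_found or detector.drift_detected
    if drift_found:
        # Replace the member with the highest estimated error by a fresh (empty) one.
        worst = max(range(n_models), key=lambda i: detectors[i].estimation)
        models[worst] = tree.HoeffdingTreeClassifier()
        detectors[worst] = drift.ADWIN()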
Parameters
- model (Model) → The classifier to bag.
- n_models (int, Default: 10) → The number of models in the ensemble.
- w (float, Default: 6) → Indicates the average number of events. This is the lambda parameter of the Poisson distribution used to compute the re-sampling weight.
- bagging_method (str, Default: bag) → The bagging method to use. Can be one of the following:
  - bag - Leveraging Bagging using ADWIN.
  - me - Assigns weight=1 if the sample is misclassified, otherwise weight=error/(1-error).
  - half - Use resampling without replacement for half of the instances.
  - wt - Resample without taking out all instances.
  - subag - Resampling without replacement.
- seed (int | None, Default: None) → Random number generator seed for reproducibility.
Example Usage
We can create an instance of the LBC model like this.
import turboml as tb
htc_model = tb.HoeffdingTreeClassifier(n_classes=2)
lbc_model = tb.LeveragingBaggingClassifier(n_classes=2, base_model=htc_model)
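The ensemble-specific parameters documented above can also be set explicitly. The snippet below is a sketch that assumes the constructor accepts the parameter names from the list above (n_models, w, bagging_method, seed); the values shown are the documented defaults, plus a fixed seed for reproducibility.
lbc_model = tb.LeveragingBaggingClassifier(
    n_classes=2,
    base_model=htc_model,
    n_models=10,           # ensemble size (documented default)
    w=6,                   # lambda of the Poisson re-sampling distribution (documented default)
    bagging_method="bag",  # Leveraging Bagging using ADWIN (documented default)
    seed=42,               # assumed: fixes the random number generator for reproducibility
)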