After referring to this link, I was able to successfully implement incremental learning using xgboost's native xgb.train() API. I want to build a classifier and need to check the predicted probabilities, i.e. the predict_proba() method. This is not possible if I use xgb.train(). When I use XGBClassifier.fit() instead of xgb.train(), I am not able to perform incremental learning: the xgb_model parameter of XGBClassifier.fit() takes a Booster (or a saved model file), while I want to provide an XGBClassifier.
Is it possible to perform incremental learning with XGBClassifier, since I need to make use of the predict_proba() method?
Working Code:
    import xgboost as xgb

    # Initial training with the native API
    train_data = xgb.DMatrix(X, y)
    model = xgb.train(
        params=best_params,
        dtrain=train_data,
    )

    # Continue training on new data by passing the previous model
    new_train_data = xgb.DMatrix(X_new, y_new)
    retrained_model = xgb.train(
        params=best_params,
        dtrain=new_train_data,
        xgb_model=model,
    )
The above code runs perfectly, but the resulting retrained_model does not have a predict_proba() method.
Non-working Code:
    import xgboost as xgb

    # Train an initial classifier with the sklearn wrapper
    xgb_model = xgb.XGBClassifier(**best_params)
    xgb_model.fit(X, y)

    # Attempt to continue training on new data
    retrained_model = xgb.XGBClassifier(**best_params)
    retrained_model.fit(X_new, y_new, xgb_model=xgb_model)
The above code does not work, since the xgb_model argument expects a saved XGBoost model file or a Booster instance, not an XGBClassifier.
Error Trace:
    [11:27:51] WARNING: ../src/learner.cc:1061: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior.
    Traceback (most recent call last):
      File "/project/Data_Training.py", line 530, in train
        retrained_model.fit(X_new, y_new, xgb_model = xgb_model)
      File "/home/user/.local/lib/python3.6/site-packages/xgboost/core.py", line 422, in inner_f
        return f(**kwargs)
      File "/home/user/.local/lib/python3.6/site-packages/xgboost/sklearn.py", line 915, in fit
        callbacks=callbacks)
      File "/home/user/.local/lib/python3.6/site-packages/xgboost/training.py", line 236, in train
        early_stopping_rounds=early_stopping_rounds)
      File "/home/user/.local/lib/python3.6/site-packages/xgboost/training.py", line 60, in _train_internal
        model_file=xgb_model)
      File "/home/user/.local/lib/python3.6/site-packages/xgboost/core.py", line 1044, in __init__
        raise TypeError('Unknown type:', model_file)
    TypeError: ('Unknown type:', XGBClassifier(base_score=0.5, booster='gbtree', colsample_bylevel=1,
                  colsample_bynode=1, colsample_bytree=1, gamma=0, gpu_id=-1,
                  importance_type='gain', interaction_constraints='',
                  learning_rate=1, max_delta_step=0, max_depth=3,
                  min_child_weight=1, missing=nan, monotone_constraints='()',
                  n_estimators=100, n_jobs=32, num_parallel_tree=1, random_state=0,
                  reg_alpha=0, reg_lambda=1, scale_pos_weight=1, subsample=0.7,
                  tree_method='exact', validate_parameters=1, verbosity=None))
Answer
From the docs:
xgb_model – file name of stored XGBoost model or ‘Booster’ instance[.] XGBoost model to be loaded before training (allows training continuation).
So you should be able to use xgb_model.get_booster() to retrieve the underlying Booster instance and pass that.
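For example, a minimal sketch of your non-working code with that one change, assuming the same best_params, X, y, X_new, and y_new as in the question:

    import xgboost as xgb

    # Fit the first classifier with the sklearn wrapper
    xgb_model = xgb.XGBClassifier(**best_params)
    xgb_model.fit(X, y)

    # Pass the underlying Booster, not the XGBClassifier itself,
    # to continue training on the new data
    retrained_model = xgb.XGBClassifier(**best_params)
    retrained_model.fit(X_new, y_new, xgb_model=xgb_model.get_booster())

    # The sklearn wrapper keeps predict_proba()
    probabilities = retrained_model.predict_proba(X_new)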
Also, you can get predicted probabilities out of the native xgboost API; Booster.predict returns probabilities when objective='binary:logistic'.
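For instance, with the retrained_model Booster from your working code, and assuming 'objective': 'binary:logistic' is set in best_params:

    # Booster.predict returns the probability of the positive class
    # for binary:logistic; threshold it to recover class labels
    probs = retrained_model.predict(xgb.DMatrix(X_new))
    labels = (probs > 0.5).astype(int)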