
Tag: lightgbm

`sklearn` asking for eval dataset when there is one

I am working with the StackingRegressor from sklearn and I used lightgbm to train my model. My lightgbm model has an early stopping option, and I used an eval dataset and metric for it. When it is fed into the StackingRegressor, I see this error: ValueError: For early stopping, at least one dataset and eval metric is required for evaluation. Which ...
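A minimal sketch of the workaround usually suggested for this error, assuming a recent LightGBM where early stopping is passed as a callback (the dataset, estimator names, and stopping settings below are illustrative). StackingRegressor refits its base estimators internally and does not forward an eval_set, so early stopping has to happen outside the stack, and only the resulting iteration count is reused inside it:

```python
import lightgbm as lgb
from lightgbm import LGBMRegressor
from sklearn.datasets import make_regression
from sklearn.ensemble import StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=20, random_state=0)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)

# Step 1: find a good number of boosting rounds with early stopping,
# outside of the stack, where an eval_set can be supplied.
probe = LGBMRegressor(n_estimators=1000)
probe.fit(
    X_train, y_train,
    eval_set=[(X_valid, y_valid)],
    eval_metric="l2",
    callbacks=[lgb.early_stopping(stopping_rounds=50)],
)
n_rounds = probe.best_iteration_ or 1000  # fall back if stopping never triggered

# Step 2: reuse the discovered iteration count in the stack; with no early
# stopping, StackingRegressor's internal refits no longer need an eval_set.
stack = StackingRegressor(
    estimators=[
        ("lgbm", LGBMRegressor(n_estimators=n_rounds)),
        ("ridge", Ridge()),
    ]
)
stack.fit(X_train, y_train)
```

Alternatively, the early-stopping settings can simply be dropped from the stacked LGBMRegressor, since the stack never supplies validation data to it.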

LightGBM does not accept the dtypes of my data

I’m trying to use LGBMClassifier and for some reason it does not accept the dtypes of my data (none of the features is accepted; I tested it). When we look at my data we can clearly see that all dtypes are either category, float or int (pd.DataFrame.info()). When I eventually try to train my LGBMClassifier I get the following error: Has ...
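A hedged sketch of one common cause behind reports like this, with made-up column names and values: columns that are still object dtype, or pandas nullable extension dtypes on some LightGBM versions, trip the dtype check, while plain int, float, bool and category columns pass.

```python
import pandas as pd
from lightgbm import LGBMClassifier

# Tiny illustrative frame: one nullable extension column, one object column.
df = pd.DataFrame({
    "age": pd.array([25, 32, 47, 51], dtype="Int64"),   # nullable extension dtype
    "income": [40000.0, 52000.0, 61000.0, 58000.0],
    "city": ["Berlin", "Paris", "Berlin", "Rome"],       # object dtype
})
y = [0, 1, 1, 0]

# Cast object columns to pandas' category dtype; LightGBM handles these natively.
for col in df.select_dtypes(include="object").columns:
    df[col] = df[col].astype("category")

# Downcast nullable extension dtypes (Int64/Float64) to plain numpy dtypes,
# which some LightGBM versions reject when left as extension arrays.
for col in df.select_dtypes(include=["Int64", "Float64"]).columns:
    df[col] = df[col].astype("float64")

clf = LGBMClassifier()
clf.fit(df, y)
```

Printing `df.dtypes` right before `fit` is usually enough to spot the one column that slipped through as object or a nullable extension type.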

Get LightGBM/LGBM to run with GPU on Google Colaboratory

I often run LGBM on Google Colaboratory, and I just found this page saying that LGBM is set to CPU by default, so you need to set it up first: https://medium.com/@am.sharma/lgbm-on-colab-with-gpu-c1c09e83f2af So I executed the code recommended on that page, and some other code recommended on Stack Overflow, as follows: !git clone --recursive https://github.com/Microsoft/LightGBM %cd LightGBM !mkdir build %cd build !cmake ...
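A hedged sketch of the build-and-install sequence the excerpt starts to quote, written as Colab notebook cells. The exact CMake flags, Boost/OpenCL packages, and installer options vary across LightGBM versions, so treat this as an outline rather than the definitive recipe:

```python
# Colab notebook cells (IPython syntax): build LightGBM with GPU support,
# then install the Python package against the freshly built library.
!apt-get -y install libboost-dev libboost-system-dev libboost-filesystem-dev  # build deps, if missing
!git clone --recursive https://github.com/Microsoft/LightGBM
%cd LightGBM
!mkdir build
%cd build
!cmake -DUSE_GPU=1 ..
!make -j4
%cd ../python-package
!python setup.py install --precompile   # reuse the already-compiled library

# After restarting the runtime, request the GPU via the sklearn wrapper:
# import lightgbm as lgb
# model = lgb.LGBMRegressor(device="gpu")
```

The `device="gpu"` parameter (an alias of `device_type`) is what actually routes training onto the GPU once a GPU-enabled build is importable.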

LightGBMError “Check failed: num_data > 0” with Sklearn RandomizedSearchCV

I’m trying LightGBMRegressor parameter tuning with Sklearn's RandomizedSearchCV. I got an error with the message below. error: I cannot tell why, or which specific parameters caused this error. Was any of the params_dist below unsuitable for train_x.shape: (1630, 1565)? Please give me any hints or solutions. Thank you. LightGBM version: '2.0.12' function that caused this error: The stack trace is too long to include in full, ...
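A hedged debugging sketch (the param_dist values and data below are illustrative, not the poster's): sampling candidates with ParameterSampler and fitting each one directly on the training data shows exactly which parameter combination raises the error, instead of it being buried inside RandomizedSearchCV's cross-validation loop.

```python
import numpy as np
from lightgbm import LGBMRegressor
from sklearn.model_selection import ParameterSampler

# Illustrative search space; substitute the real params_dist here.
param_dist = {
    "num_leaves": [15, 31, 63],
    "min_child_samples": [5, 20, 100],
    "subsample": [0.5, 0.8, 1.0],
    "colsample_bytree": [0.5, 0.8, 1.0],
}

# Random data with the same shape as the question's train_x.
X = np.random.rand(1630, 1565)
y = np.random.rand(1630)

# Fit each sampled candidate outside of cross-validation and report failures.
for params in ParameterSampler(param_dist, n_iter=5, random_state=0):
    try:
        LGBMRegressor(**params).fit(X, y)
        print("ok:", params)
    except Exception as err:
        print("failed:", params, "->", err)
```

The `num_data > 0` check tends to fail when a sampled combination effectively leaves LightGBM with no rows to train on (for example, an extreme subsampling fraction), so once the offending combination is identified, tightening those ranges in the search space usually resolves it.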
