I want to optimize the hyper-parameters of multiple time series forecasting models on the same data. I'm using the Optuna Sweeper plugin for Hydra. The different models have different hyper-parameters and therefore different search spaces. At the moment my config file looks like this: Now, when I run the main_val.py file with --multirun, I get the optimal hyper-parameters for Ets. Great. But
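The asker's Hydra config is cut off above, but the core difficulty is a search space that depends on which model is being tuned. As a rough illustration only, here is a minimal plain-Optuna sketch of that conditional-search-space idea; the model names (ets, arima, prophet), all parameter ranges, and the placeholder score are assumptions, not the asker's actual Hydra/Optuna Sweeper setup.

```python
import optuna


def objective(trial):
    # The sampled model name decides which hyper-parameters get suggested,
    # so each model keeps its own search space inside a single study.
    model_name = trial.suggest_categorical("model", ["ets", "arima", "prophet"])

    if model_name == "ets":
        params = {"smoothing_level": trial.suggest_float("smoothing_level", 0.01, 0.99)}
    elif model_name == "arima":
        params = {
            "p": trial.suggest_int("p", 0, 5),
            "d": trial.suggest_int("d", 0, 2),
            "q": trial.suggest_int("q", 0, 5),
        }
    else:  # prophet
        params = {
            "changepoint_prior_scale": trial.suggest_float(
                "changepoint_prior_scale", 1e-3, 0.5, log=True
            )
        }

    # Placeholder score so the sketch runs as-is; a real objective would fit the
    # chosen forecaster on a training split and return a validation error.
    return sum(float(v) for v in params.values())


study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
print(study.best_params)
```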
Tag: optuna
Optuna LightGBM integration giving categorical features error
I'm creating a model using the Optuna LightGBM integration. My training set has some categorical features and I pass those features to the model using the lgb.Dataset class; here is the code I'm using (NOTE: X_train, X_val, y_train, y_val are all pandas DataFrames). Every time the lgb.train function is called, I get the following user warning. I believe that
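The asker's code and the exact UserWarning are cut off above, so the snippet below is only a sketch of the usual way to wire categorical columns into the Optuna LightGBM tuner: cast them to the pandas category dtype and declare them once, on the Dataset constructor (specifying categorical_feature both in params and on the Dataset is one common source of "overridden" warnings). The column names and toy data are invented stand-ins for the asker's DataFrames.

```python
import pandas as pd
import optuna.integration.lightgbm as lgb  # drop-in for lightgbm.train that tunes while training

# Tiny synthetic stand-ins for the asker's X_train / X_val / y_train / y_val.
X_train = pd.DataFrame({"store_id": ["a", "b", "a", "c"] * 25, "price": range(100)})
y_train = pd.Series([0, 1] * 50)
X_val = pd.DataFrame({"store_id": ["b", "c"] * 10, "price": range(20)})
y_val = pd.Series([0, 1] * 10)

categorical_cols = ["store_id"]
# LightGBM expects categorical columns to have the pandas 'category' dtype
# when they are passed by name.
X_train[categorical_cols] = X_train[categorical_cols].astype("category")
X_val[categorical_cols] = X_val[categorical_cols].astype("category")

# Declare the categorical features once, on the Dataset objects, not in params.
dtrain = lgb.Dataset(X_train, label=y_train, categorical_feature=categorical_cols)
dval = lgb.Dataset(X_val, label=y_val, categorical_feature=categorical_cols, reference=dtrain)

params = {"objective": "binary", "metric": "binary_logloss", "verbosity": -1}

# optuna.integration.lightgbm.train mirrors lightgbm.train but tunes key
# hyper-parameters stepwise during training.
model = lgb.train(params, dtrain, valid_sets=[dval], num_boost_round=100)
```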
Best parameters of an Optuna multi-objective optimization
When performing a single-objective optimization with Optuna, the best parameters of the study are accessible using: If I want to perform a multi-objective optimization, this would become, for example: This works, but the command study.best_params fails with RuntimeError: The best trial of a `study` is only supported for single-objective optimization. How can I get the best parameters for
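Since the asker's snippets are cut off, here is a self-contained toy two-objective study showing the documented alternative: study.best_params is undefined for multi-objective studies, but study.best_trials exposes the Pareto-optimal trials, each carrying its own .params and .values. The objective functions below are invented for illustration.

```python
import optuna


def objective(trial):
    # Toy objectives standing in for the asker's: minimize both x**2 and (x - 2)**2.
    x = trial.suggest_float("x", -5.0, 5.0)
    return x**2, (x - 2) ** 2


study = optuna.create_study(directions=["minimize", "minimize"])
study.optimize(objective, n_trials=50)

# The Pareto front of the study; each entry is a FrozenTrial.
for trial in study.best_trials:
    print(trial.number, trial.values, trial.params)

# There is no single "best" trial in the multi-objective case, so picking one
# needs an explicit criterion, e.g. the smallest sum of objective values.
chosen = min(study.best_trials, key=lambda t: sum(t.values))
print("chosen params:", chosen.params)
```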
Python: How to retrieve the best model from Optuna LightGBM study?
I would like to get the best model to use later in the notebook to predict using a different test batch. Reproducible example (taken from the Optuna GitHub): my understanding is that the study below will tune for accuracy. I would like to somehow retrieve the best model from the study (not just the parameters) without saving it as a
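The asker's reproducible example is cut off above, so the sketch below substitutes a small LightGBM study on scikit-learn's breast cancer data. It shows one common pattern for keeping the trained booster around: store it in the trial's user attributes inside the objective, then read it back from study.best_trial. Dataset, parameter ranges, and trial count are all illustrative assumptions.

```python
import lightgbm as lgb
import optuna
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

data = load_breast_cancer()
X_train, X_val, y_train, y_val = train_test_split(data.data, data.target, random_state=0)


def objective(trial):
    params = {
        "objective": "binary",
        "verbosity": -1,
        "num_leaves": trial.suggest_int("num_leaves", 8, 128),
        "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
    }
    dtrain = lgb.Dataset(X_train, label=y_train)
    booster = lgb.train(params, dtrain, num_boost_round=100)

    # Keep a reference to the trained booster so it can be recovered later.
    # This works with the default in-memory storage; RDB storages require
    # JSON-serializable attributes, in which case the fallback is to retrain
    # once with study.best_params after the study finishes.
    trial.set_user_attr("booster", booster)

    preds = (booster.predict(X_val) > 0.5).astype(int)
    return accuracy_score(y_val, preds)


study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)

best_model = study.best_trial.user_attrs["booster"]  # ready to predict on a new test batch
```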