
Tag: hyperparameters

Hyperparameter tuning with wandb – CommError: "Sweep user not valid" when trying to initialize the sweep

I’m trying to use wandb for hyperparameter tuning as described in this notebook (but using my own dataframe and a random forest regressor instead). When I try to initialize the sweep, I get the error: 400 response executing GraphQL. {"errors":[{"message":"Sweep user not valid","path":["upsertSweep"]}],"data":{"upsertSweep":null}} wandb: ERROR Error while calling W&B API: Sweep user not valid (<Response [400]>) CommError:
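This error is commonly resolved by passing an explicit entity (your W&B username or team name) and project when creating the sweep. A minimal sketch of a sweep definition for a random forest regressor follows; the entity, project, metric, and parameter names are placeholders, not values from the question:

```python
# Hypothetical sweep configuration for a random forest regressor.
# "Sweep user not valid" is often fixed by giving wandb.sweep an explicit
# entity and project, as in the commented-out call below.
sweep_config = {
    "method": "bayes",
    "metric": {"name": "val_rmse", "goal": "minimize"},
    "parameters": {
        "n_estimators": {"values": [100, 300, 1000]},
        "max_depth": {"values": [5, 10, 20]},
    },
}

# sweep_id = wandb.sweep(sweep_config, entity="my-username", project="rf-tuning")
# wandb.agent(sweep_id, function=train)
```

Running `wandb login` first (or setting `WANDB_API_KEY`) is also worth checking, since an invalid or missing login produces similar API errors.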

How to hypertune input shape using keras tuner?

I am trying to tune the input shape of an LSTM model over different values of timesteps. However, I am facing an issue: while initializing the model, the default value of timesteps (which is 2) is chosen, and accordingly scaled_train in build_model is created with shape (4096, 2, 64). Thus the value of input_shape during initialization is (2, 64).
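For this to work, each trial has to re-window the training data for the timesteps value the tuner picks (e.g. `timesteps = hp.Choice("timesteps", [2, 4, 8])` inside `build_model`, with `input_shape=(timesteps, 64)`); otherwise every trial sees data built for the default. A minimal pure-Python sketch of the re-windowing step (names and sizes are illustrative, not from the question):

```python
def window_series(series, timesteps):
    """Re-window a list of per-timestep feature vectors into overlapping
    windows of length `timesteps`, one window per training sample."""
    return [series[i:i + timesteps] for i in range(len(series) - timesteps + 1)]

# 10 timesteps of 64 features each (dummy data)
series = [[0.0] * 64 for _ in range(10)]
windows = window_series(series, 4)  # 7 windows, each 4 timesteps x 64 features
```

In a KerasTuner `build_model(hp)`, the equivalent would be to read `timesteps` from `hp` and regenerate the windowed array before constructing the LSTM layer.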

How to specify Search Space in Auto-Sklearn

I know how to specify feature-selection methods and the list of algorithms used in Auto-Sklearn 2.0, and I know that Auto-Sklearn uses Bayesian optimisation (SMAC), but I would like to specify the hyperparameters in Auto-Sklearn. For example, I want to specify random_forest with Estimator = 1000 only, or MLP with HiddenLayerSize = 100 only. How can I do that?
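Auto-Sklearn's `include` argument can restrict which estimator and preprocessor families are searched; pinning an individual hyperparameter to a single value is not exposed the same way and generally means registering a custom component. A sketch of the restriction part is below; the component names are an assumption and depend on your auto-sklearn version, and the `AutoSklearnRegressor` call is left commented out:

```python
# Hypothetical sketch: restrict Auto-Sklearn's search to one estimator family.
# Component names ("random_forest", "no_preprocessing") are assumptions that
# depend on the auto-sklearn version installed.
restriction = {
    "regressor": ["random_forest"],
    "feature_preprocessor": ["no_preprocessing"],
}

# automl = autosklearn.regression.AutoSklearnRegressor(
#     include=restriction,
#     time_left_for_this_task=300,
# )
# automl.fit(X_train, y_train)
```

Within that restricted space, SMAC still searches the remaining hyperparameters; fixing one to a constant requires a custom component whose search space declares it as a `Constant`.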

Tuning the hyperparameter with gridsearch results in overfitting

Tuning the hyperparameters with grid search results in overfitting. The train error is definitely low, but the test error is high. Can’t I adjust the hyperparameters to lower the test error? Before tuning: train_error = 0.386055, test_error = 0.674069. After tuning: train_error = 0.070645, test_error = 0.708254. Answer: It all depends on the data you are training on. If the data you are using for training
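The numbers in the question already show the problem: the tuned model wins on train error but loses on test error, so selecting by train error alone overfits. A toy sketch of selecting by held-out error instead (the error figures are copied from the question; the dictionary structure is illustrative):

```python
# Select the hyperparameter setting with the lowest held-out error,
# not the lowest train error (figures copied from the question above).
candidates = {
    "before_tuning": {"train_error": 0.386055, "test_error": 0.674069},
    "after_tuning":  {"train_error": 0.070645, "test_error": 0.708254},
}

best_by_train = min(candidates, key=lambda k: candidates[k]["train_error"])
best_by_test = min(candidates, key=lambda k: candidates[k]["test_error"])
# best_by_train picks "after_tuning", but best_by_test picks "before_tuning":
# optimizing train error alone chose the overfit model.
```

In practice this means scoring each grid point with cross-validation (as `GridSearchCV` does with its `cv` and `scoring` arguments) and, if the gap persists, widening the grid toward more regularized settings.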
