The class is composed of a set of attributes and functions, including: Attributes: df: a pandas DataFrame; numerical_feature_names: df columns with numeric values; label_column_names: df string columns to be grouped. Functions: mean(nums): takes a list of numbers as input and returns the mean; fill_na(df, numerical_feature_names, label_columns): takes class attributes as inputs and returns a transformed df. And here’s
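The full class definition is cut off in the excerpt, but a minimal sketch built from the attribute and function names listed above might look like the following; the class name, the toy usage, and the group-wise mean-fill logic are assumptions, not shown in the original.

```python
import pandas as pd

class FeatureFiller:  # hypothetical name; the class name is not given in the excerpt
    def __init__(self, df, numerical_feature_names, label_column_names):
        self.df = df                                            # a pandas DataFrame
        self.numerical_feature_names = numerical_feature_names  # numeric columns
        self.label_column_names = label_column_names            # string columns to group by

    @staticmethod
    def mean(nums):
        """Take a list of numbers and return the mean."""
        return sum(nums) / len(nums)

    def fill_na(self, df, numerical_feature_names, label_columns):
        """Return a transformed df; the group-wise mean fill here is an assumed implementation."""
        out = df.copy()
        for col in numerical_feature_names:
            out[col] = out.groupby(label_columns)[col].transform(lambda s: s.fillna(s.mean()))
        return out

# illustrative usage with a toy frame
df = pd.DataFrame({"group": ["a", "a", "b"], "value": [1.0, None, 3.0]})
filler = FeatureFiller(df, ["value"], ["group"])
print(filler.fill_na(filler.df, filler.numerical_feature_names, filler.label_column_names))
```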
Tag: machine-learning
NoSuchElementException: Failed to find a default value for layers in MultiLayerPerceptronClassifier
I am having a problem running a prediction using a saved MultiLayerPerceptronClassifier model. It throws the error: The original mlpc in the pipeline had layers defined: My attempts to solve it: if I run the pipeline model and do predictions without first saving the model, it works with no error. But saving and re-using the model throws this error. Any help
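The question's pipeline code is not shown, but for reference, a minimal sketch of the save-and-reload pattern for a fitted PySpark pipeline containing a multilayer perceptron looks roughly like this; the column names, layer sizes, toy data, and path are made up. Persisting the fitted PipelineModel and reloading it with PipelineModel.load (rather than Pipeline.load) is the first thing worth checking.

```python
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline, PipelineModel
from pyspark.ml.classification import MultilayerPerceptronClassifier
from pyspark.ml.feature import VectorAssembler

spark = SparkSession.builder.getOrCreate()

# tiny made-up training frame: three numeric features and a binary label
train_df = spark.createDataFrame(
    [(0.0, 1.0, 2.0, 0.0), (1.0, 0.0, 3.0, 1.0), (2.0, 2.0, 0.0, 0.0), (3.0, 1.0, 1.0, 1.0)],
    ["f1", "f2", "f3", "label"])

assembler = VectorAssembler(inputCols=["f1", "f2", "f3"], outputCol="features")
mlpc = MultilayerPerceptronClassifier(layers=[3, 5, 2], seed=42,
                                      featuresCol="features", labelCol="label")

model = Pipeline(stages=[assembler, mlpc]).fit(train_df)

# persist the *fitted* PipelineModel, then reload it with PipelineModel.load
model.write().overwrite().save("/tmp/mlpc_pipeline_model")
reloaded = PipelineModel.load("/tmp/mlpc_pipeline_model")
reloaded.transform(train_df).select("prediction").show()
```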
TypeError: multiple values for argument ‘weight_decay’
I am using an AdamW optimizer with cosine decay and a warmup learning-rate scheduler. I have written the custom scheduler from scratch and am using the AdamW optimizer provided by the TensorFlow Addons library. I get the following error, which says that weight_decay received multiple values. What is causing this problem and how do I resolve it? Answer The
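Without the original code it is hard to be certain, but one common cause of this exact TypeError is that tfa.optimizers.AdamW takes weight_decay as its first positional parameter, so passing the learning-rate schedule positionally while also passing weight_decay as a keyword makes the two collide. A small sketch; the built-in CosineDecay schedule stands in for the custom warmup scheduler in the question.

```python
import tensorflow as tf
import tensorflow_addons as tfa

# stand-in for the custom warmup + cosine-decay schedule from the question
lr_schedule = tf.keras.optimizers.schedules.CosineDecay(
    initial_learning_rate=1e-3, decay_steps=10_000)

# tfa.optimizers.AdamW(weight_decay, learning_rate=..., ...) takes weight_decay first,
# so AdamW(lr_schedule, weight_decay=1e-4) binds lr_schedule to weight_decay and the
# keyword then collides with it -> "multiple values for argument 'weight_decay'".
# Passing both by keyword avoids the clash:
optimizer = tfa.optimizers.AdamW(weight_decay=1e-4, learning_rate=lr_schedule)
```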
How to build a custom scaler based on StandardScaler?
I am trying to build a custom scaler that scales only the continuous variables in a dataset (the US Adult Income dataset: https://www.kaggle.com/uciml/adult-census-income), using StandardScaler as a base. Here is the Python code I used: However, when I tried to run the scaler, I ran into this problem: So what is the error in how I am building the scaler? And
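The scaler code from the question is not included here, but one way such a scaler is commonly written is sketched below, under the assumption that the goal is a transformer that standardizes only a chosen list of continuous columns and passes the rest through unchanged; the class name and column names are illustrative.

```python
import pandas as pd
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.preprocessing import StandardScaler

class ContinuousOnlyScaler(BaseEstimator, TransformerMixin):
    """Sketch: scale only the listed continuous columns, leave the rest untouched."""

    def __init__(self, continuous_cols):
        self.continuous_cols = continuous_cols

    def fit(self, X, y=None):
        # fit StandardScaler only on the continuous columns
        self.scaler_ = StandardScaler().fit(X[self.continuous_cols])
        return self

    def transform(self, X):
        X = X.copy()
        X[self.continuous_cols] = self.scaler_.transform(X[self.continuous_cols])
        return X

# illustrative usage on the Adult Income data (assumed column names from that CSV):
# df = pd.read_csv("adult.csv")
# scaler = ContinuousOnlyScaler(["age", "hours.per.week", "capital.gain"])
# scaled = scaler.fit_transform(df)
```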
TypeError: fit() missing 1 required positional argument: ‘y’,
I want to try out all regressors within the library. Since I know that some of the regressors require more input, I built a try/except block. This returns the following snippet many times: In my opinion there are two problems here. First, except never gets called. Second, the y input is not recognized. I am grateful
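The snippet in question is not shown, but a sketch of how looping over every scikit-learn regressor is usually done follows, using sklearn.utils.all_estimators and a broad except so estimators that need extra arguments are skipped rather than stopping the loop; the toy data is made up.

```python
from sklearn.utils import all_estimators
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=200, n_features=5, random_state=0)

# all_estimators returns (name, class) pairs; the class must be instantiated
# before calling fit, and fit needs *both* X and y for supervised regressors.
for name, RegressorClass in all_estimators(type_filter="regressor"):
    try:
        model = RegressorClass()          # estimators needing extra args raise here
        model.fit(X, y)
        print(f"{name}: R^2 = {model.score(X, y):.3f}")
    except Exception as exc:              # broad catch so one failure doesn't stop the loop
        print(f"{name} skipped: {exc}")
```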
I would like to change the input channels of a Keras pretrained model
I have an Xception model. I have combined the model to change the input channels to 3; however, I got an error. Answer You simply have to embed Xception in the correct way in your new model: we create a new Input layer, then we apply upsampling, and in the end we pass everything to Xception. Here is the running
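The running example is cut off above, but the pattern the answer describes (new Input layer, upsampling, then pass everything to Xception) might look roughly like the sketch below; the input shape, the 1x1 convolution used to produce 3 channels, and the pooling head are all assumptions, not the original code.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import Xception

# hypothetical single-channel input; adapt spatially and channel-wise before Xception
inputs = layers.Input(shape=(32, 32, 1))
x = layers.UpSampling2D(size=(3, 3))(inputs)   # grow to 96x96 to satisfy Xception's minimum size
x = layers.Conv2D(3, kernel_size=1)(x)         # map 1 channel to the 3 channels Xception expects

backbone = Xception(include_top=False, weights=None, input_shape=(96, 96, 3))
x = backbone(x)
outputs = layers.GlobalAveragePooling2D()(x)

model = Model(inputs, outputs)
model.summary()
```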
Cross-validation with time series data in sklearn
I have a question regarding cross-validation of time series data in general. The problem is macro forecasting, e.g. forecasting the 1-month-ahead price of the S&P 500 using different monthly macro variables. Now I read about the following approach: one should/could use a rolling cross-validation approach, i.e. always drop an old monthly value and add a new one (=
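In scikit-learn this rolling scheme maps onto TimeSeriesSplit; a small sketch with made-up monthly data, where max_train_size caps the training window so each fold drops the oldest months as new ones are added.

```python
import numpy as np
from sklearn.model_selection import TimeSeriesSplit

# 120 hypothetical monthly observations: macro features and next-month returns
X = np.random.rand(120, 4)
y = np.random.rand(120)

# max_train_size turns the expanding window into a rolling one:
# each fold trains on the most recent 36 months and tests on the months after them
tscv = TimeSeriesSplit(n_splits=5, max_train_size=36)
for fold, (train_idx, test_idx) in enumerate(tscv.split(X)):
    print(f"fold {fold}: train {train_idx[0]}-{train_idx[-1]}, test {test_idx[0]}-{test_idx[-1]}")
```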
Keras model fits on data with the wrong shape
I’ve created the following model: and the following dummy data: with shapes (4, None, 2) and (4, 3). Looking at the model structure, one can see that the model has 3 outputs of shape (None, 1). I was wondering how come the fit works, when I expected them to be of shape (4, 3, 1) and not (4,
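The model and dummy data themselves are not reproduced here, but part of why fit can appear to succeed is that Keras losses broadcast mismatched trailing dimensions rather than raising; a tiny illustration with made-up arrays.

```python
import numpy as np
import tensorflow as tf

# Keras losses broadcast compatible trailing dimensions instead of raising,
# which is one reason fit() can seem to "work" on oddly shaped targets.
y_true = np.ones((4, 3), dtype="float32")   # labels shaped (4, 3)
y_pred = np.zeros((4, 1), dtype="float32")  # an output shaped (batch, 1)

mse = tf.keras.losses.MeanSquaredError()
print(mse(y_true, y_pred).numpy())          # computes a value instead of erroring
```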
__init__() got an unexpected keyword argument ‘handle_unknown’
I’m trying to ordinal-encode my categorical features using sklearn, but I get the error __init__() got an unexpected keyword argument ‘handle_unknown’ when I run the code below: Sample data to reproduce the error: Could someone please tell me what’s wrong with my code? Answer You are most likely not using an appropriate version of scikit-learn. handle_unknown and unknown_value
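For context, handle_unknown='use_encoded_value' and unknown_value were only added to OrdinalEncoder in scikit-learn 0.24, so checking the installed version is the first step; a minimal sketch with toy data.

```python
import numpy as np
import sklearn
from sklearn.preprocessing import OrdinalEncoder

print(sklearn.__version__)   # needs to be >= 0.24 for handle_unknown / unknown_value

X = np.array([["red"], ["green"], ["blue"]])
enc = OrdinalEncoder(handle_unknown="use_encoded_value", unknown_value=-1)
enc.fit(X)
print(enc.transform([["green"], ["purple"]]))   # unseen "purple" is encoded as -1
```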
How do I save the data that has been randomly undersampled?
I am trying to balance a data frame by using random undersampling of the majority class. It has been successful; however, I also want to save the data that has been removed from the data frame (undersampled) to a new data frame. How do I accomplish this? This is the code I am using to undersample the data frame
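The undersampling code is cut off above, but assuming imbalanced-learn's RandomUnderSampler (or something like it), the dropped rows can be recovered from sample_indices_, which records the row positions that were kept; a small sketch with a toy frame.

```python
import pandas as pd
from imblearn.under_sampling import RandomUnderSampler

# hypothetical imbalanced frame with a binary "label" column
df = pd.DataFrame({"feature": range(10),
                   "label":   [0] * 8 + [1] * 2})

rus = RandomUnderSampler(random_state=0)
X_res, y_res = rus.fit_resample(df.drop(columns="label"), df["label"])

# sample_indices_ holds the positions that were *kept*, so the removed rows
# are simply everything outside that set
kept = df.iloc[rus.sample_indices_]
removed = df.drop(index=df.index[rus.sample_indices_])
print(removed)
```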