I’m building a CNN using Keras, with the following Conv1D as my first layer:
from keras.layers import Conv1D
from keras import regularizers

cnn.add(Conv1D(
    filters=512,
    kernel_size=3,
    strides=2,
    activation=hyperparameters["activation_fn"],
    kernel_regularizer=getattr(regularizers, hyperparameters["regularization"])(hyperparameters["regularization_rate"]),
    input_shape=(1000, 1),
))
I’m training the model with:
cnn.fit(
    x=train_df["payload"].tolist(),
    y=train_df["label"].tolist(),
    batch_size=hyperparameters["batch_size"],
    epochs=hyperparameters["epochs"],
)
Here, train_df is a pandas DataFrame with two columns where, for each row, label is an int (0 or 1) and payload is an ndarray of floats zero-padded or truncated to a length of 1000. train_df contains 15641 training examples in total.
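(For reference, a minimal sketch of that zero-pad/truncate step; the pad_or_truncate helper name is made up for illustration, not from my actual code:)

import numpy as np

def pad_or_truncate(payload, length=1000):
    # Zero-pad short payloads and truncate long ones to a fixed length.
    out = np.zeros(length, dtype=np.float32)
    n = min(len(payload), length)
    out[:n] = payload[:n]
    return out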
The model compiles, but during training, I get this error:
ValueError: Error when checking model input: the list of Numpy arrays that you are passing to your model is not the size the model expected. Expected to see 1 array(s), but instead got the following list of 15641 arrays: [array([[0.09019608], [0.01176471], [0.01176471], [0. ], [0.30196078], [0. ], [0. ], [0. ], [0. ], [0....
I looked at this post and tried changing my input to an ndarray of 1000-float-long lists, but ended up with another error:
ValueError: Error when checking input: expected conv1d_1_input to have 3 dimensions, but got array with shape (15641, 1000)
Any ideas?
Answer
So I kept input_shape set to (1000, 1), since each sample has 1000 steps and 1 channel.

I also converted the input fed to fit() into a single ndarray of shape (n, 1000, 1), where n is the total count of samples: during preprocessing, each 1000-float payload vector is reshaped to (1000, 1), and the n samples are stacked along the first axis (I did this after reading this explanation on inputs & input shape).

The final shape of my input data was (15641, 1000, 1).
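A minimal sketch of that conversion, assuming the same train_df and hyperparameters as in the question:

import numpy as np

# Stack the padded payload vectors into one array of shape (15641, 1000)...
X = np.asarray(train_df["payload"].tolist(), dtype=np.float32)
# ...then add the channel axis so each sample is (1000, 1).
X = X.reshape(-1, 1000, 1)  # equivalently: np.expand_dims(X, axis=-1)
y = np.asarray(train_df["label"].tolist())

cnn.fit(
    x=X,
    y=y,
    batch_size=hyperparameters["batch_size"],
    epochs=hyperparameters["epochs"],
)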
All of this applies to validation data too (if specified), as in the sketch below.
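For example, reusing X and y from the previous sketch and assuming a hypothetical val_df structured like train_df:

X_val = np.asarray(val_df["payload"].tolist(), dtype=np.float32).reshape(-1, 1000, 1)
y_val = np.asarray(val_df["label"].tolist())

cnn.fit(
    x=X,
    y=y,
    batch_size=hyperparameters["batch_size"],
    epochs=hyperparameters["epochs"],
    validation_data=(X_val, y_val),
)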
This fixed my issue.