
TensorFlow CNN Incompatible Shapes: 4D input shape

I have sample data in the form: Data[n][31][31][5][2] with:

  • “[n]” being the sample
  • “[31][31]” being the array of data points
  • “[5]” being the number of bits within that data point
  • and “[2]” being the one-hot encoding of the bits (e.g. a bit of 1 would be [1, 0] and a bit of 0 would be [0, 1]); a small numpy sketch of this layout follows below
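
For concreteness, here is a rough sketch of what that layout could look like in numpy (made-up values, purely to illustrate the shapes):

import numpy as np

n = 4                                  # number of samples (arbitrary here)
data = np.zeros((n, 31, 31, 5, 2))     # Data[n][31][31][5][2]
# one data point: 5 bits, each bit one-hot encoded
data[0, 0, 0] = [[1, 0],               # bit = 1
                 [0, 1],               # bit = 0
                 [1, 0],
                 [0, 1],
                 [0, 1]]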

The output is intended to either be a [5][2] or a [10] array of values which is validated against another [5][2] or [10] array. When trying to build the model, I get the following error:

 "ValueError: Shapes (None, 5, 2) and (None, 10) are incompatible"

The model code looks like this (with train_m[n][31][31][5][2] and tr_m[5][2] being the training data and expected output, and check_m[n][31][31][5][2] and cr_m[5][2] being the validation data and expected output):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, Flatten, Dense

model = Sequential([
    Conv2D(num_filters, filter_size, input_shape=(31, 31, 5, 2)),
    Flatten(),
    Dense(10, activation='relu'),
])


model.compile(
  'adam',
  loss='categorical_crossentropy',
  metrics=['accuracy'],
)

model.summary()
model.fit(
    train_m,
    tr_m,
    epochs=100,
    validation_data=(check_m, cr_m),
    verbose=0
)

As the [5][2] outputs are one-hot encoded, I’m uncertain whether they can be flattened into a [10] array while still being interpreted correctly. Also, is there any way to make the Dense layer output a [5][2] shape?
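
For reference, flattening a single (5, 2) one-hot label into a length-10 array would just concatenate the five pairs in order (a rough numpy sketch of what I mean, not part of the original code):

import numpy as np

label = np.array([[1, 0],   # bit = 1
                  [0, 1],   # bit = 0
                  [1, 0],
                  [0, 1],
                  [0, 1]])
print(label.reshape(10))    # [1 0 0 1 1 0 0 1 0 1] -- pairs concatenated in order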

The full error can be seen here; I felt it would be awfully long to include as raw text in the post.

If there’s anything more that’s needed, please let me know – I’m still very new to working with TensorFlow.


Answer

Your labels have shape (5, 2) but the network’s output has shape (10,), which is why the two are incompatible; the output shape and the label shape need to match. Add

tf.keras.layers.Reshape((5, 2))

after the Dense layer and the shapes will line up.
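
As a rough sketch of where the Reshape layer would go (the input size and activation below are placeholders, not taken from the question):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Reshape

model = Sequential([
    Dense(10, activation='relu', input_shape=(64,)),  # placeholder input size
    Reshape((5, 2)),   # (None, 10) -> (None, 5, 2), matching the label shape
])
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
print(model.output_shape)   # (None, 5, 2)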
