Why is Keras complaining about incompatible input shape in this case?

I have trained a Keras-based autoencoder model with the following input layer:

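A minimal sketch of such an input layer, assuming the Keras functional API and a placeholder name input_img:

from tensorflow.keras.layers import Input

# 100x100 grayscale images, i.e. one channel -> shape (100, 100, 1)
input_img = Input(shape=(100, 100, 1))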

The width and height of my training images were 100 pixels, and they are grayscale, so the depth is 1. Now I want to load my trained model in another script, load an image there, resize it, and send it to the Keras model:

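A sketch of what that script could look like; the file names, the use of OpenCV for loading and resizing, and the variable names are assumptions:

import cv2
from tensorflow.keras.models import load_model

# Hypothetical file names, used here only for illustration
autoencoder = load_model("autoencoder.h5")

# Load as grayscale, resize to 100x100, and add the channel dimension
image = cv2.imread("test.png", cv2.IMREAD_GRAYSCALE)
image = cv2.resize(image, (100, 100))
image = image.reshape((100, 100, 1))   # shape is now (100, 100, 1)

result = autoencoder.predict(image)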

However, the call to autoencoder.predict(image) fails with an error complaining about an incompatible input shape.

I don’t understand this, as the shape of the image when calling predict() is (100, 100, 1), which looks fine to me. Why is Keras complaining about an incompatible input shape of (None, 100, 1, 1)?

Answer

These simple lines of code generate the error:

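A sketch of the failing call, reusing the names from the question:

# image.shape is (100, 100, 1); there is no leading batch dimension
result = autoencoder.predict(image)   # raises the incompatible-shape error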

This is because your Keras model expects data in the format (n_samples, 100, 100, 1); the leading dimension is the batch (sample) dimension, which Keras reports as None because it can be of any size.

A simple reshape when you predict on a single image does the trick:

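A sketch of that reshape, reusing the image and autoencoder from the question; numpy's expand_dims would be an equivalent alternative:

# Add the missing batch dimension: (100, 100, 1) -> (1, 100, 100, 1)
image = image.reshape((1, 100, 100, 1))

result = autoencoder.predict(image)   # now matches (n_samples, 100, 100, 1)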
User contributions licensed under: CC BY-SA