ValueError: Input 0 of layer conv2d_10 is incompatible with the layer: expected ndim=4, found ndim=3. Full shape received: [None, 100, 100]

So I have been following a machine learning tutorial and have come to this point in the code:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense,Dropout,Activation, Flatten, Conv2D, MaxPooling2D
import pickle
import numpy as np

pickle_in = open("X.pickle","rb")
X = pickle.load(pickle_in)

pickle_in = open("y.pickle","rb")
y = pickle.load(pickle_in)

X = np.array(X/255.0)
y = np.array(y)

model = Sequential()
model.add(Conv2D(64, (3,3), input_shape = X.shape[1:]))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size=(2,2)))

model.add(Conv2D(64, (3,3)))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size=(2,2)))

model.add(Flatten())
model.add(Dense(64))

model.add(Dense(1))
model.add(Activation("sigmoid"))

model.compile(loss="binary_crossentropy",
             optimizer="adam",
             metrics=["accuracy"])
model.fit(X,y, batch_size=32, validation_split=0.1)

When I execute this code, it gives me the following error:

ValueError: Input 0 of layer conv2d_10 is incompatible with the layer: expected ndim=4, found ndim=3. Full shape received: [None, 100, 100]

I have seen multiple posts about this error and none have really helped me. Can anyone help? Thanks in advance! :)


Answer

Add a Reshape layer: a Conv2D layer expects input of shape (batch, height, width, channels) (ndim=4), but you are only providing (batch, height, width) (ndim=3). Reshape the input to (batch, height, width, 1).

The error reads Full shape received: [None, 100, 100]; what the layer expects is a 4D shape such as [None, 100, 100, 1].
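
You can confirm this by checking the array the question loads before calling fit (a quick check using the X defined above):

print(X.shape)   # (num_samples, 100, 100) -- 3 dimensions, no channel axis
print(X.ndim)    # 3, but Conv2D needs 4: (batch, height, width, channels)

With a Reshape layer added as the first layer, the missing channel axis is created inside the network and the rest of the architecture stays the same: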

from tensorflow.keras.layers import Reshape

model = Sequential()
# Add the missing channel axis inside the model: (100, 100) -> (100, 100, 1)
model.add(Reshape((100,100,1), input_shape=X.shape[1:]))
model.add(Conv2D(64, (3,3)))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size=(2,2)))

model.add(Conv2D(64, (3,3)))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size=(2,2)))

model.add(Flatten())
model.add(Dense(64))

model.add(Dense(1))
model.add(Activation("sigmoid"))

model.compile(loss="binary_crossentropy",
             optimizer="adam",
             metrics=["accuracy"])


model.summary()
Model: "sequential_5"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
reshape_5 (Reshape)          (None, 100, 100, 1)       0         
_________________________________________________________________
conv2d_6 (Conv2D)            (None, 98, 98, 64)        640       
_________________________________________________________________
activation_9 (Activation)    (None, 98, 98, 64)        0         
_________________________________________________________________
max_pooling2d_6 (MaxPooling2 (None, 49, 49, 64)        0         
_________________________________________________________________
conv2d_7 (Conv2D)            (None, 47, 47, 64)        36928     
_________________________________________________________________
activation_10 (Activation)   (None, 47, 47, 64)        0         
_________________________________________________________________
max_pooling2d_7 (MaxPooling2 (None, 23, 23, 64)        0         
_________________________________________________________________
flatten_3 (Flatten)          (None, 33856)             0         
_________________________________________________________________
dense_6 (Dense)              (None, 64)                2166848   
_________________________________________________________________
dense_7 (Dense)              (None, 1)                 65        
_________________________________________________________________
activation_11 (Activation)   (None, 1)                 0         
=================================================================
Total params: 2,204,481
Trainable params: 2,204,481
Non-trainable params: 0
_________________________________________________________________
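
Alternatively, you can add the channel axis to the data itself and keep the question's original model unchanged. A minimal sketch, assuming X is a NumPy array of shape (num_samples, 100, 100) as the error message indicates:

import numpy as np

X = np.array(X) / 255.0
X = np.expand_dims(X, axis=-1)   # (num_samples, 100, 100) -> (num_samples, 100, 100, 1)

# X.shape[1:] is now (100, 100, 1), so input_shape=X.shape[1:] on the first
# Conv2D layer gives the expected 4D input and the ValueError goes away:
model.fit(X, y, batch_size=32, validation_split=0.1)

Either way, the network ends up seeing single-channel (grayscale) images of shape (100, 100, 1).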
User contributions licensed under: CC BY-SA