
PyTorch Linear layer input dimension mismatch

I'm getting this error when passing the input data to the Linear (fully connected) layer in PyTorch:

matrices expected, got 4D, 2D tensors

I fully understand the problem, since the input data has shape (N, C, H, W) (coming from a Convolutional + MaxPool layer), where:

  • N: Data Samples
  • C: Channels of the data
  • H, W: Height and Width

Nevertheless, I was expecting PyTorch to do the "reshaping" of the data from:

  • [N, D1, ..., Dn] -> [N, D], where D = D1*D2*...*Dn (see the sketch below)
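
For example, with made-up sizes, this is the flattening I mean:

import torch

# Made-up sizes: N=8 samples, C=16 channels, 5x5 feature maps
x = torch.randn(8, 16, 5, 5)   # (N, C, H, W), e.g. the output of Conv + MaxPool
flat = x.view(8, -1)           # (N, D) with D = C*H*W = 400
print(flat.shape)              # torch.Size([8, 400])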

I tried to reshape the Variable's .data, but I've read that this approach is not recommended, since the gradients will keep the previous shape and, in general, you should not mutate the shape of Variable.data.

I am pretty sure there is a simple solution that goes along with the framework, but I haven't found it.

Is there a good solution for this?

PS: The fully connected layer has C * H * W as its input size.
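
For illustration, a fully connected layer sized this way would look like the following (the concrete numbers are placeholders, not my actual model):

import torch.nn as nn

# Hypothetical sizes: C=16, H=W=5 after the Conv + MaxPool stack, 10 output classes.
# in_features of the fully connected layer must equal C * H * W.
fc = nn.Linear(16 * 5 * 5, 10)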


Answer

After reading some examples I found the solution. Here is how you do it without messing up the forward/backward pass flow:

(_, C, H, W) = x.data.size()
x = x.view(-1, C * H * W)
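
Alternatively, on newer PyTorch versions (where Variable and Tensor are merged, so x.size() works directly), the same flattening can be done with torch.flatten or an nn.Flatten module; a minimal sketch with made-up sizes:

import torch
import torch.nn as nn

x = torch.randn(8, 16, 5, 5)           # pretend Conv + MaxPool output: (N, C, H, W)
flat = torch.flatten(x, start_dim=1)   # flattens everything except the batch dim -> (8, 400)

flatten = nn.Flatten()                 # same operation as a module (defaults: start_dim=1, end_dim=-1)
assert torch.equal(flat, flatten(x))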