In this PyTorch ResNet code example, downsample is assigned as a plain variable on line 44 (self.downsample = downsample) and then called as a function on line 58 (identity = self.downsample(x)). How does this downsample work here, both from a CNN point of view and from a Python code point of view?
Code example: PyTorch ResNet
I searched to see whether downsample is a built-in PyTorch function, but it is not.
import torch.nn as nn


def conv3x3(in_planes, out_planes, stride=1, groups=1):
    """3x3 convolution with padding (helper defined earlier in torchvision's resnet.py)."""
    return nn.Conv2d(in_planes, out_planes, kernel_size=3, stride=stride,
                     padding=1, groups=groups, bias=False)


class BasicBlock(nn.Module):
    expansion = 1

    def __init__(self, inplanes, planes, stride=1, downsample=None, groups=1, norm_layer=None):
        super(BasicBlock, self).__init__()
        if norm_layer is None:
            norm_layer = nn.BatchNorm2d
        if groups != 1:
            raise ValueError('BasicBlock only supports groups=1')
        # Both self.conv1 and self.downsample layers downsample the input when stride != 1
        self.conv1 = conv3x3(inplanes, planes, stride)
        self.bn1 = norm_layer(planes)
        self.relu = nn.ReLU(inplace=True)
        self.conv2 = conv3x3(planes, planes)
        self.bn2 = norm_layer(planes)
        self.downsample = downsample
        self.stride = stride

    def forward(self, x):
        identity = x

        out = self.conv1(x)
        out = self.bn1(out)
        out = self.relu(out)

        out = self.conv2(out)
        out = self.bn2(out)

        if self.downsample is not None:
            identity = self.downsample(x)

        out += identity
        out = self.relu(out)

        return out
Answer
In this ResNet example, downsample is passed to the BasicBlock class as a constructor parameter:
def __init__(self, inplanes, planes, stride=1, downsample=None, groups=1, norm_layer=None):
If we pass nothing for this argument, downsample defaults to None, and as a result the identity is not changed.
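For example (a minimal sketch, assuming the BasicBlock class and conv3x3 helper from the snippet above, with illustrative sizes): when inplanes == planes and stride=1, the output of the two convolutions already has the same shape as the input, so the shortcut needs no extra layer.

    import torch

    # Default case: downsample=None, stride=1, inplanes == planes,
    # so 'identity' keeps the same shape as 'out' and is added as-is.
    block = BasicBlock(inplanes=64, planes=64)

    x = torch.randn(1, 64, 56, 56)   # (batch, channels, height, width)
    out = block(x)
    print(out.shape)                 # torch.Size([1, 64, 56, 56])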
When we pass downsample = "some convolution layer" as a constructor argument, the block downsamples the identity through that convolution layer so that it has the same shape as out and the addition can be performed successfully. This happens in the following lines of forward:

if self.downsample is not None:
    identity = self.downsample(x)
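From the CNN point of view, this shortcut layer is what torchvision's ResNet builds in _make_layer whenever a block changes the spatial resolution (stride != 1) or the number of channels: typically a 1x1 convolution with the same stride followed by batch norm. A rough sketch of how it is wired up (again assuming the BasicBlock class above; the sizes 64 -> 128 and stride=2 are just an illustration):

    import torch
    import torch.nn as nn

    inplanes, planes, stride = 64, 128, 2   # illustrative sizes

    # Shortcut branch: 1x1 conv (stride 2) + BatchNorm, so the identity ends up
    # with the same channel count and spatial size as the main branch.
    downsample = nn.Sequential(
        nn.Conv2d(inplanes, planes, kernel_size=1, stride=stride, bias=False),
        nn.BatchNorm2d(planes),
    )

    block = BasicBlock(inplanes, planes, stride=stride, downsample=downsample)

    x = torch.randn(1, 64, 56, 56)
    out = block(x)
    print(out.shape)   # torch.Size([1, 128, 28, 28]) -- identity was reshaped by downsample to match

From the Python point of view, downsample is simply a callable nn.Module stored as an attribute of the block, so self.downsample(x) runs its forward pass like any other layer.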