Below is a convolutional neural network without pooling layers; the first convolutional layer uses a 2x2 kernel:

import torch.nn as nn

class ConvNet(nn.Module):
    def __init__(self):
        super(ConvNet, self).__init__()
        # First convolution uses a 2x2 kernel; the remaining convolutions use 3x3 kernels.
        self.conv1 = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=2, stride=1, padding=1)
        self.conv2 = nn.Conv2d(in_channels=16, out_channels=32, kernel_size=3, stride=1, padding=1)
        self.conv3 = nn.Conv2d(in_channels=32, out_channels=64, kernel_size=3, stride=1, padding=1)
        # Assuming a 32x32 input (e.g. CIFAR-10): the 2x2 kernel with stride 1 and
        # padding 1 enlarges the feature map to 33x33, and the two 3x3 convolutions
        # with padding 1 keep it at 33x33, so the flattened size is 64*33*33.
        self.fc1 = nn.Linear(64*33*33, 256)
        self.fc2 = nn.Linear(256, 10)
        self.relu = nn.ReLU()

    def forward(self, x):
        x = self.conv1(x)
        x = self.relu(x)
        x = self.conv2(x)
        x = self.relu(x)
        x = self.conv3(x)
        x = self.relu(x)
        x = x.view(x.size(0), -1)  # flatten to (batch, 64*33*33)
        x = self.fc1(x)
        x = self.relu(x)
        x = self.fc2(x)
        return x

The network has three convolutional layers and two fully connected layers. The first convolutional layer uses a 2x2 kernel, and the other convolutional layers use 3x3 kernels; ReLU is used as the activation function. The output layer has 10 nodes, corresponding to 10 classes. Note that a 2x2 kernel with stride 1 and padding 1 adds one pixel in each spatial dimension, so for a 32x32 input the feature maps after the convolutions are 33x33, which is why the first fully connected layer expects 64*33*33 inputs.
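
As a quick sanity check, here is a minimal sketch that runs one dummy batch through the network; it assumes a CIFAR-10-like input of shape 3x32x32 (the input resolution is not stated above, so this is an assumption):

import torch

# Hypothetical smoke test with a random batch of four 32x32 RGB images.
model = ConvNet()
dummy = torch.randn(4, 3, 32, 32)
logits = model(dummy)
print(logits.shape)  # expected: torch.Size([4, 10])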
