Woo et al. proposed CBAM [18], a convolutional attention module that adaptively guides a neural network to focus on relevant information. Extensive research has been conducted on the depth and width of deep neural networks to improve their performance; attention mechanisms such as CBAM offer a complementary route to stronger feature representations. As shown in Figure 2, CBAM consists of two sequential sub-modules, the channel attention module (CAM) and the spatial attention module (SAM), which adaptively filter the input features along the channel and spatial dimensions, respectively. The CAM first receives the input features. To compute channel attention efficiently, global max pooling and global average pooling are applied over the spatial dimensions, yielding two channel descriptors. Both descriptors are passed through a shared multi-layer perceptron to learn optimized weights, and the two outputs are summed element-wise. A Sigmoid activation then compresses the result into the range (0, 1), and the resulting channel weights are multiplied with the input features, as shown in Equations (1) and (2).
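The channel attention path described above can be sketched in PyTorch as follows. This is a minimal illustration of the pooling, shared-MLP, and Sigmoid reweighting steps, not the authors' reference implementation; the class name, layer sizes, and the reduction ratio of 16 are assumptions for the example.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Sketch of CBAM-style channel attention (illustrative, not the official code)."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        # Shared MLP applied to both pooled channel descriptors
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pooling -> shared MLP
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pooling -> same MLP
        # Sum the two outputs, squash to (0, 1), and reweight the input channels
        w = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * w
```

In a full CBAM block, the output of this module would then be passed to the spatial attention module (SAM), which performs the analogous filtering along the spatial dimensions.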
