Backbone Complexity Analysis: Comparing ResNet50, ResNeSt50, and Our Model
This section analyzes backbone complexity using FLOPs (floating-point operations) and the number of parameters as indicators of neural network architecture complexity. To compare ResNet50, ResNeSt50, and our model, we report their respective FLOPs and parameter counts in Table 2. The results show that ResNeSt50 has approximately 1.8 million more parameters than ResNet50 and, at an input size of 512×512, requires 40.7 GFLOPs more than ResNet50. Our model, in turn, has roughly 1 million more parameters and 3.5 GFLOPs more than ResNeSt50.
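The FLOPs figures above depend on the input resolution, because each convolution's cost scales with the spatial size of its output feature map. As a minimal illustrative sketch (not the tool used to produce Table 2), the parameter and FLOP counts of a single convolution layer can be estimated analytically; the layer dimensions below, taken from ResNet50's stem convolution at a 512×512 input, are for illustration:

```python
def conv_params(c_in: int, c_out: int, k: int) -> int:
    """Parameter count of a k x k convolution without bias."""
    return c_out * c_in * k * k


def conv_flops(h_out: int, w_out: int, c_in: int, c_out: int, k: int) -> int:
    """FLOPs of a k x k convolution, counting each multiply-accumulate
    as 2 floating-point operations (one multiply, one add)."""
    macs = h_out * w_out * c_in * c_out * k * k
    return 2 * macs


# ResNet50 stem: 7x7 conv, 3 -> 64 channels, stride 2,
# so a 512x512 input yields a 256x256 output map.
params = conv_params(c_in=3, c_out=64, k=7)
flops = conv_flops(h_out=256, w_out=256, c_in=3, c_out=64, k=7)
print(params)            # 9408 weights
print(flops / 1e9)       # ~1.23 GFLOPs for this single layer
```

Summing such per-layer estimates over every convolution and fully connected layer (plus comparatively negligible terms for batch normalization and activations) reproduces the whole-network totals; some profilers report MACs instead of FLOPs, which differ by the factor of 2 used here.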