The Development of Backbone Networks and Training Strategies (Assignment 2)

It has been nearly ten years since AlexNet, the convolutional neural network, made its debut on ImageNet in 2012. Neural networks have gone from academic jargon to household names, and artificial intelligence has exploded. For a newcomer to the field, the most striking impression is that new papers appear endlessly; one cannot read them as fast as they are written.

However, most CV work based on deep learning depends on extracting image features, and by 2021 most feature-extraction pipelines relied on a handful of mature, reliable, and effective feature-extraction networks, which we call backbone networks. On top of a backbone, different functional branches are attached according to the application scenario, such as classification, detection, and segmentation, to meet different task requirements.

Among these backbones, ResNet and EfficientNet are undoubtedly at the front in both performance and reputation, as the ImageNet classification results show: most of the time, the top methods are dominated by ResNet and EfficientNet or their various optimized, upgraded, ensembled, and modified versions. As it happens, the past half year has seen several retrospective articles on ResNet and EfficientNet that carry a strong sense of summary, reflection, and improvement. Starting from the birth of ResNet, this article interleaves some of these improvements and reviews in roughly chronological order, which also reveals, from the side, how researchers' focus in network design has shifted.

Recall that when we first discussed ResNet, we noted that it resembles a numerical optimization method. Researchers from Waseda University have explored this further: they view the existing ResNet architecture as a numerical method for solving ordinary differential equations. By contrast, if a network architecture corresponding to a higher-order numerical method is adopted, model performance may improve while the parameter count stays the same.
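To make the ODE analogy concrete, here is a minimal NumPy sketch (not the Waseda authors' actual architecture; `f`, `euler_block`, and `midpoint_block` are illustrative names I introduce here). A standard residual block computes x + f(x), which is exactly one forward-Euler step of dx/dt = f(x); a midpoint (second-order Runge-Kutta) step reuses the same weights, so the parameter count is unchanged while the integration order increases.

```python
import numpy as np

def f(x, W):
    # Toy residual branch: a linear map followed by ReLU.
    return np.maximum(W @ x, 0.0)

def euler_block(x, W):
    # Standard residual block: x_next = x + f(x),
    # i.e. one forward-Euler step of dx/dt = f(x).
    return x + f(x, W)

def midpoint_block(x, W):
    # Midpoint (RK2) step with the SAME weights W, so the
    # higher-order method adds no parameters: it simply
    # evaluates f twice instead of once.
    k1 = f(x, W)
    k2 = f(x + 0.5 * k1, W)
    return x + k2

# With W = identity and positive x, f(x) = x, so:
x = np.array([1.0, 2.0])
W = np.eye(2)
print(euler_block(x, W))     # [2. 4.]
print(midpoint_block(x, W))  # [2.5 5. ]
```

The point of the sketch is that a "higher-order" block changes only the wiring (how many times f is applied per step and how the results are combined), not the number of learnable weights.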
