Attention Mechanisms

Neural Networks 2020 | Harbin Institute of Technology and Peking University propose attention-guided image denoising

mp.weixin.qq.com/s?__biz=MzIxOTczOTM4NA==&mid=2247490281&idx=2&sn=9d6eac5c230c98e37e56e933b293eee5&chksm=97d7ff7ea0a076680fe4a6de2a5830dd5b6925f7502ebd20f97b6fb28144098883ad00aca2f7&mpshare=1&scene=1&srcid=0823GwfSRxwENSV6Faf39aIE&sharer_sharetime=1598191845827&sharer_shareid=928b7467b20ce13509ea4ca43699acf7&key=790d61d00982c950179c35e26bbd578012a3710b72d17f4772dc2ff75a5f9966b2c9aee1194fd771055fedf021fa40a5a7a9ac0e794f24af0df9083d2a40598557389fd770fbc5765f5db585c68c33c284566b4401ecf134d79baa9b718748327a2be55a1c9ba6e2d04c842eaa0d5608f06a8c5d3a8b3d4dfbce5accc335aabe&ascene=1&uin=MTQzNDUwNDQxMg%3D%3D&devicetype=Windows+10+x64&version=62090529&lang=zh_CN&exportkey=A1MNAVrZo%2F%2BiQT2hPo4%2FIh0%3D&pass_ticket=UG6txnElpne32%2FJWG9oeV3wEqENPriPUgWRaaY1Dn%2Bz6%2BB7GYUPXXDEp%2BqZ0Tdev

YOLOv3 with lightweight backbones (ShuffleNetV2, GhostNet), attention mechanisms (SE Block, CBAM Block, ECA Block), plus pruning, quantization, and distillation on GhostNet.
https://github.com/HaloTrouvaille/YOLO-Multi-Backbones-Attention
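To make the SE Block mentioned above concrete, here is a minimal PyTorch sketch of a Squeeze-and-Excitation block (channel attention): globally average-pool each channel, pass the channel vector through a small bottleneck MLP with a sigmoid, and rescale the feature maps by the resulting per-channel weights. This is an illustrative assumption of the general technique, not code from the repo; the class and parameter names are mine.

```python
# Minimal SE (Squeeze-and-Excitation) block sketch; illustrative only,
# not the repo's actual implementation.
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)              # squeeze: global spatial average per channel
        self.fc = nn.Sequential(                         # excitation: bottleneck MLP + sigmoid gate
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.pool(x).view(b, c)                      # (B, C) channel descriptor
        w = self.fc(w).view(b, c, 1, 1)                  # per-channel weights in (0, 1)
        return x * w                                     # reweight the feature maps channel-wise

if __name__ == "__main__":
    x = torch.randn(2, 64, 32, 32)
    print(SEBlock(64)(x).shape)                          # torch.Size([2, 64, 32, 32])
```

CBAM extends this idea with a spatial attention branch, and ECA replaces the bottleneck MLP with a lightweight 1D convolution over channels.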

A walkthrough of the latest survey on attention mechanisms: https://blog.csdn.net/shenziheng1/article/details/89323074

Attention mechanisms in computer vision

https://zr9558.com/2019/01/23/attentioninimagerecognition/

What are the current mainstream attention methods?

https://www.zhihu.com/question/68482809/answer/264632289

The basic idea and implementation principles of attention mechanisms (very detailed)

https://blog.csdn.net/hpulfc/article/details/80448570
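The core idea behind most attention mechanisms is the same: score how relevant each key is to a query, normalize the scores with a softmax, and return the correspondingly weighted sum of the values. Below is a minimal PyTorch sketch of scaled dot-product attention; the function name and tensor shapes are illustrative assumptions, not code from the linked article.

```python
# Minimal scaled dot-product attention sketch; shapes and names are illustrative.
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    # q: (B, Tq, d), k: (B, Tk, d), v: (B, Tk, dv)
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d ** 0.5          # (B, Tq, Tk) query-key similarity
    weights = F.softmax(scores, dim=-1)                  # attention weights sum to 1 over the keys
    return weights @ v, weights                          # weighted sum of values, plus the weights

if __name__ == "__main__":
    q = torch.randn(2, 5, 32)
    k = torch.randn(2, 7, 32)
    v = torch.randn(2, 7, 64)
    out, w = scaled_dot_product_attention(q, k, v)
    print(out.shape, w.shape)                            # (2, 5, 64) and (2, 5, 7)
```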

The old-school recipe: convolution + attention mechanism + recurrent neural network => a combined model.
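As a rough illustration of that recipe, here is a hedged PyTorch sketch in the spirit of show-attend-and-tell-style models: a small CNN encodes the image into a grid of features, and at each RNN step an additive attention module pools that grid conditioned on the current hidden state. All module names, layer sizes, and the number of steps below are illustrative assumptions, not a reference implementation.

```python
# CNN encoder + attention pooling + GRU decoder sketch; illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CNNAttnRNN(nn.Module):
    def __init__(self, feat_dim=128, hid_dim=256, steps=4):
        super().__init__()
        self.cnn = nn.Sequential(                          # tiny CNN encoder -> (B, feat_dim, H', W')
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.attn = nn.Linear(feat_dim + hid_dim, 1)       # additive attention score per location
        self.rnn = nn.GRUCell(feat_dim, hid_dim)
        self.steps = steps
        self.hid_dim = hid_dim

    def forward(self, img):
        feats = self.cnn(img)                              # (B, C, H', W')
        b, c, h, w = feats.shape
        feats = feats.flatten(2).transpose(1, 2)           # (B, H'*W', C) sequence of locations
        hidden = feats.new_zeros(b, self.hid_dim)
        outputs = []
        for _ in range(self.steps):
            # score each spatial location against the current hidden state
            expanded = hidden.unsqueeze(1).expand(-1, feats.size(1), -1)
            scores = self.attn(torch.cat([feats, expanded], dim=-1)).squeeze(-1)
            weights = F.softmax(scores, dim=-1)            # (B, H'*W') attention over locations
            context = (weights.unsqueeze(-1) * feats).sum(dim=1)  # attention-pooled feature
            hidden = self.rnn(context, hidden)             # recurrent update with the pooled context
            outputs.append(hidden)
        return torch.stack(outputs, dim=1)                 # (B, steps, hid_dim)

if __name__ == "__main__":
    print(CNNAttnRNN()(torch.randn(2, 3, 64, 64)).shape)   # torch.Size([2, 4, 256])
```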