[Commentary] Your Local GAN: Designing Two Dimensional Local Attention Mechanisms for Generative Models
Paper: https://arxiv.org/abs/1911.12287v1
Code: https://github.com/giannisdaras/ylg
Commentary (Chinese): https://www.leiphone.com/news/201912/FBZsLSCZSgyD5fIq.html

Related Work

SAGAN [26] is a GAN augmented with self-attention. The attention layers in SAGAN are dense, which brings some drawbacks. First, the computational complexity is very high
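To make the cost concrete, here is a minimal NumPy sketch of a dense self-attention layer over a 2D feature map (illustrative only, not SAGAN's actual implementation; the function and projection matrices are hypothetical). Flattening an H×W feature map into N = H·W tokens produces an N×N attention matrix, so memory and compute grow quadratically with the number of spatial positions:

```python
import numpy as np

def dense_self_attention(x, wq, wk, wv):
    """Dense (all-pairs) self-attention over a 2D feature map.

    x: (H, W, C) feature map; wq/wk/wv: (C, D) projection matrices.
    Returns the attended feature map and the N x N attention matrix.
    """
    h, w, c = x.shape
    tokens = x.reshape(h * w, c)                   # N x C, with N = H*W
    q, k, v = tokens @ wq, tokens @ wk, tokens @ wv
    scores = q @ k.T / np.sqrt(q.shape[-1])        # N x N: quadratic in N
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)       # softmax over all positions
    out = attn @ v                                 # N x D
    return out.reshape(h, w, -1), attn

rng = np.random.default_rng(0)
H, W, C, D = 8, 8, 16, 16
x = rng.standard_normal((H, W, C))
wq, wk, wv = [rng.standard_normal((C, D)) for _ in range(3)]
out, attn = dense_self_attention(x, wq, wk, wv)
print(attn.shape)  # (64, 64): every pixel attends to every other pixel
```

Even for a modest 64×64 feature map, N = 4096 and the attention matrix has roughly 16.8 million entries, which is why dense attention quickly becomes a bottleneck at generative-model resolutions.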