AAAI 2024 | New lightweight-Transformer research from ETH: fully replacing attention modules with shallow MLPs while improving performance
Paper title: Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers
Paper link: https://arxiv.org/abs/2311.10642
Code repository: GitHub - vulus98/Rethinking-attention