Interpreting the AAAI 2021 Best Paper — Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting
Table of Contents

1. Abstract
2. Limitations of the Traditional Transformer
3. Contributions of This Paper
4. Overall Model Architecture
5. Self-attention Mechanism
6. Measuring Query Sparsity
7. ProbSparse Self-attention
8. Encoder: Self-attention Distilling
9. Decoder
10. Experiments (Ablation Study, Computation Efficiency)
11. References

1. Abstract