《Relational Attention: Generalizing Transformers for Graph-Structured Tasks》【ICLR 2023 Spotlight】
Contents
- Motivation
- Relational Transformer
  - 1) Relational Attention
  - 2) Edge Update
- Prior Work

Code: https://github.com/CameronDiao/relational-transformer

Motivation

The standard Transformer lacks the relational inductive biases that are common in GNNs. The Transformer's own inductive biases are very weak, almost nonexistent, which