NIPS 2019 | NeurIPS 2019 Latest Papers (Continuously Updated) | Baidu Cloud Download

Paper download (Baidu Cloud): https://pan.baidu.com/s/100OAXTIOTPoMjbi-dwOcxA
Extraction code: follow the 计算机视觉联盟 WeChat official account and reply: NIPS2019

Updated through September 4, 2019.


NeurIPS, a top international conference in artificial intelligence and machine learning, has announced its accepted papers for 2019: of 6,743 valid submissions, 1,428 were accepted, a 21.1% acceptance rate, including 36 orals and 164 spotlights.


NeurIPS is a top international conference in artificial intelligence and machine learning, operated by the NIPS Foundation. Its full name is the Conference and Workshop on Neural Information Processing Systems (NIPS); since 1987, it has been held every December, bringing together AI and ML researchers and practitioners from around the world. Under pressure over the ambiguity of its name (in part because the acronym carried suggestive, sexist connotations), the conference was renamed NeurIPS in 2018.

NeurIPS 2019 will be held starting December 8 at the Vancouver Convention Centre in Vancouver, Canada.


Recommended NeurIPS 2019 Accepted Papers

Understanding the Representation Power of Graph Neural Networks in Learning Graph Topology

https://arxiv.org/abs/1907.05008


Visualizing the PHATE of Neural Networks

https://arxiv.org/abs/1908.02831

Toward Multimodal Model-Agnostic Meta-Learning

https://arxiv.org/pdf/1812.07172.pdf

A Graph Theoretic Framework of Recomputation Algorithms for Memory-Efficient Backpropagation

https://arxiv.org/abs/1905.11722

RUBi: Reducing Unimodal Biases in Visual Question Answering

http://arxiv.org/abs/1906.10169

Code: http://github.com/cdancette/rubi.bootstrap.pytorch

Understanding Attention and Generalization in Graph Neural Networks

https://arxiv.org/pdf/1905.02850.pdf

Facebook's cross-lingual pretrained language model XLM: Cross-lingual Language Model Pretraining

https://arxiv.org/pdf/1901.07291.pdf

HyperGCN: A New Method For Training Graph Convolutional Networks on Hypergraphs

https://arxiv.org/abs/1809.02589

Quaternion Knowledge Graph Embeddings

https://arxiv.org/pdf/1904.10281.pdf

Transfusion: Understanding Transfer Learning for Medical Imaging

https://arxiv.org/pdf/1902.07208.pdf
