GraphSAGE: Inductive Representation Learning on Large Graphs

I. Brief Summary

1. GraphSAGE targets unseen nodes and even entirely new graphs: instead of learning a fixed embedding per node, it mainly trains aggregate functions.
2. The paper closes by discussing possible future directions: subgraph embedding, neighbor sampling strategies, and multimodal graphs.
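The aggregate-function idea above can be sketched minimally. Below is a hypothetical NumPy version of a GraphSAGE-mean step (the names `W`, `b` and all dimensions are illustrative, not from the paper):

```python
import numpy as np

def mean_aggregate(h_self, h_neighbors, W, b):
    """One GraphSAGE-mean step: average the sampled neighbors' vectors,
    concatenate with the node's own vector, apply a linear map + ReLU,
    then L2-normalize (a minimal sketch of the paper's mean aggregator)."""
    agg = h_neighbors.mean(axis=0)                   # mean over sampled neighbors
    concat = np.concatenate([h_self, agg])           # CONCAT(h_v, h_N(v))
    h_new = np.maximum(W @ concat + b, 0.0)          # sigma(W . concat + b)
    return h_new / (np.linalg.norm(h_new) + 1e-12)   # L2 normalization

rng = np.random.default_rng(0)
h_self = rng.normal(size=4)
h_neighbors = rng.normal(size=(5, 4))   # 5 sampled neighbors, feature dim 4
W = rng.normal(size=(8, 8))             # maps concat (dim 8) -> dim 8
b = np.zeros(8)
out = mean_aggregate(h_self, h_neighbors, W, b)
print(out.shape)  # (8,)
```

Because the aggregator's parameters are shared across all nodes, the same function can be applied to a node never seen during training, which is exactly what makes the method inductive.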

[Figure 1]

II. Datasets

1. Citation data: undirected citation graph, 6 labels, 302,424 nodes; X = node degree + sentence embedding.
2. Reddit data: undirected graph of Reddit posts (not a citation graph), 50 labels, 232,965 nodes; X = [post title embedding; embedding of the post's comments; post's score; number of comments].
3. PPI data: protein-protein interaction graphs (not a citation graph), 121 labels, 24 graphs (20 train + 2 validation + 2 test) of roughly 2,373 nodes each; X = [gene sets].

III. Results

1. Node classification tasks.
2. Runtime efficiency and parameter analysis (effect of the neighborhood sample size).
[Figure 2]

IV. Related Work

1. Factorization-based embedding approaches
These use random walks or matrix factorization to train node embeddings directly. This is transductive learning, so extra training is needed to make predictions on new nodes.
2. Supervised learning over graphs
Graph-kernel-based methods, which mainly classify entire graphs rather than produce representations of individual nodes.
3. Graph convolutional networks (GCN)
These require the structure of the whole graph (the full graph Laplacian). GCN was not originally designed for inductive learning and is not a local operation, unlike GAT and GraphSAGE.
[Figure 3]

V. Embedding Generation

1. Embedding generation refers to the forward propagation algorithm.
2. In this paper the neighborhood is fixed-size: a fixed-size set of sampled neighbors is used instead of the full neighborhood set.
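The forward pass with fixed-size neighbor sampling can be sketched as follows (a hypothetical minimal version with mean aggregation; `adj`, `Ws`, and the dimensions are illustrative assumptions, and the paper's minibatch logic is omitted):

```python
import random
import numpy as np

def sample_neighbors(adj, v, k, rng):
    """Sample a fixed-size set of k neighbors (with replacement,
    so nodes with fewer than k neighbors still yield k samples)."""
    nbrs = adj[v]
    return [rng.choice(nbrs) for _ in range(k)] if nbrs else [v]

def graphsage_forward(adj, X, Ws, sample_sizes, seed=0):
    """Minimal K-layer GraphSAGE forward pass with mean aggregation.
    adj: dict node -> list of neighbors; X: (n, d) feature matrix;
    Ws: K weight matrices, each mapping concat(2 * d_in) -> d_out."""
    rng = random.Random(seed)
    H = {v: X[v] for v in adj}
    for k, W in enumerate(Ws):
        H_next = {}
        for v in adj:
            nbr = sample_neighbors(adj, v, sample_sizes[k], rng)
            agg = np.mean([H[u] for u in nbr], axis=0)        # mean aggregator
            h = np.maximum(W @ np.concatenate([H[v], agg]), 0.0)
            H_next[v] = h / (np.linalg.norm(h) + 1e-12)       # L2 normalize
        H = H_next
    return H

# Toy graph: a 4-cycle with 3-dim one-hot-ish features
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
X = np.eye(4, 3)
Ws = [np.random.default_rng(1).normal(size=(5, 6)),   # layer 1: 2*3 -> 5
      np.random.default_rng(2).normal(size=(5, 10))]  # layer 2: 2*5 -> 5
H = graphsage_forward(adj, X, Ws, sample_sizes=[2, 2])
```

Because each layer touches at most `sample_sizes[k]` neighbors per node, the per-node cost is bounded regardless of how large the full neighborhood is.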
[Figure 4]

VI. WL Test

[Figure 5]
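The paper relates GraphSAGE's neighborhood aggregation to the 1-dimensional Weisfeiler-Lehman isomorphism test: both iteratively combine a node's label with the multiset of its neighbors' labels. A minimal sketch of 1-WL color refinement (my own illustrative implementation, not code from the paper):

```python
from collections import Counter

def wl_colors(adj, rounds=3):
    """1-WL color refinement: each round, a node's new color is determined
    by its own color plus the sorted multiset of its neighbors' colors."""
    colors = {v: 0 for v in adj}   # uniform initial labels
    for _ in range(rounds):
        sig = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
               for v in adj}
        # compress each distinct signature to a small integer color
        relabel = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        colors = {v: relabel[sig[v]] for v in adj}
    return Counter(colors.values())   # color histogram of the graph

# Two non-isomorphic graphs: a triangle vs. a 3-node path
tri  = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
print(wl_colors(tri) == wl_colors(path))  # False
```

GraphSAGE can be seen as a continuous, trainable relaxation of this procedure: the hash of neighbor labels is replaced by a learned aggregation over neighbor feature vectors.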
