Author: 梦家
Personal site: dreamhomes.top
Original post: https://dreamhomes.github.io/posts/202005191115.html
WeChat Official Account ID: DreamHub
Convolution on graphs consists of two main parts: message passing between nodes and message aggregation. Let $\mathbf{x}_i^{(k-1)} \in \mathbb{R}^{F}$ denote the features of node $i$ at layer $k-1$, and let $\mathbf{e}_{j,i} \in \mathbb{R}^{D}$ denote the features of the edge from node $j$ to node $i$. A message-passing graph neural network can then be written as:
$$\mathbf{x}_{i}^{(k)}=\gamma^{(k)}\left(\mathbf{x}_{i}^{(k-1)},\ \square_{j \in \mathcal{N}(i)}\, \phi^{(k)}\left(\mathbf{x}_{i}^{(k-1)}, \mathbf{x}_{j}^{(k-1)}, \mathbf{e}_{j, i}\right)\right)$$
where $\square$ denotes a differentiable, permutation-invariant function, e.g. sum, mean, or max, and $\phi$ and $\gamma$ denote differentiable functions such as MLPs.
In PyG, `torch_geometric.nn.MessagePassing` provides a base class that handles the message-propagation process automatically.
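Concretely, `message()` plays the role of $\phi$, the `aggr` argument selects $\square$, and `update()` plays the role of $\gamma$. As a minimal sketch of this correspondence (the class `SumNeighbors` is my own illustrative example, not part of the original post), a layer that merely sums neighbor features, i.e. $\phi(\mathbf{x}_j)=\mathbf{x}_j$, $\square=\mathrm{sum}$, $\gamma=\mathrm{identity}$, looks like this:

```python
from torch_geometric.nn import MessagePassing

class SumNeighbors(MessagePassing):
    """Toy layer: x_i^(k) = sum_{j in N(i)} x_j^(k-1)."""
    def __init__(self):
        super(SumNeighbors, self).__init__(aggr='add')  # □ = sum

    def forward(self, x, edge_index):
        # x: [N, F], edge_index: [2, E]
        return self.propagate(edge_index, x=x)

    def message(self, x_j):
        # φ: pass the neighbor feature through unchanged.
        # PyG builds x_j ([E, F]) by indexing x with the source nodes of edge_index.
        return x_j

    def update(self, aggr_out):
        # γ: identity.
        return aggr_out
```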
As an example, let us build the classic GCN layer proposed by Kipf & Welling. The GCN layer is defined mathematically as:
$$\mathbf{x}_{i}^{(k)}=\sum_{j \in \mathcal{N}(i) \cup\{i\}} \frac{1}{\sqrt{\deg(i)} \cdot \sqrt{\deg(j)}} \cdot\left(\boldsymbol{\Theta} \cdot \mathbf{x}_{j}^{(k-1)}\right)$$
From this formula, node features are first transformed by $\Theta$, then normalized by node degrees, and finally summed. In terms of the general framework above, $\phi$ is the degree-normalized linear transform, $\square$ is the sum, and $\gamma$ is the identity. The computation can be split into the following steps:

1. Add self-loops to the adjacency matrix.
2. Linearly transform the node feature matrix.
3. Compute the normalization coefficients.
4. Normalize the node features in `message()`.
5. Sum up neighboring node features (`add` aggregation).
6. Return the new node embeddings in `update()`.

The PyG implementation of this procedure is as follows:
```python
import torch
from torch_geometric.nn import MessagePassing
from torch_geometric.utils import add_self_loops, degree


class GCNConv(MessagePassing):
    def __init__(self, in_channels, out_channels):
        super(GCNConv, self).__init__(aggr='add')  # "Add" aggregation.
        self.lin = torch.nn.Linear(in_channels, out_channels)

    def forward(self, x, edge_index):
        # x has shape [N, in_channels]
        # edge_index has shape [2, E]

        # Step 1: Add self-loops to the adjacency matrix.
        edge_index, _ = add_self_loops(edge_index, num_nodes=x.size(0))

        # Step 2: Linearly transform node feature matrix.
        x = self.lin(x)

        # Step 3: Compute normalization.
        row, col = edge_index
        deg = degree(row, x.size(0), dtype=x.dtype)
        deg_inv_sqrt = deg.pow(-0.5)
        norm = deg_inv_sqrt[row] * deg_inv_sqrt[col]  # norm has shape [E]

        # Step 4-6: Start propagating messages.
        return self.propagate(edge_index, size=(x.size(0), x.size(0)), x=x,
                              norm=norm)

    def message(self, x_j, norm):
        # x_j has shape [E, out_channels]

        # Step 4: Normalize node features.
        return norm.view(-1, 1) * x_j

    def update(self, aggr_out):
        # aggr_out has shape [N, out_channels]

        # Step 6: Return new node embeddings.
        return aggr_out
```
Once the convolutional layer is defined, it can be called and stacked like any other layer:
```python
conv = GCNConv(16, 32)
x = conv(x, edge_index)
```
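For completeness, here is a hypothetical end-to-end run on a toy graph (the node count and edges are made up purely for illustration):

```python
import torch

# Toy graph: 3 nodes, two undirected edges stored as four directed ones.
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]], dtype=torch.long)
x = torch.randn(3, 16)  # 3 nodes with 16 features each

conv = GCNConv(16, 32)
out = conv(x, edge_index)
print(out.shape)  # torch.Size([3, 32])
```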
I personally use the next approach less often, so I only record it briefly. For point cloud data, the edge convolution (EdgeConv) is defined as:
$$\mathbf{x}_{i}^{(k)}=\max_{j \in \mathcal{N}(i)} h_{\Theta}\left(\mathbf{x}_{i}^{(k-1)},\ \mathbf{x}_{j}^{(k-1)}-\mathbf{x}_{i}^{(k-1)}\right)$$
The PyG implementation is as follows:
```python
import torch
from torch.nn import Sequential as Seq, Linear, ReLU
from torch_geometric.nn import MessagePassing


class EdgeConv(MessagePassing):
    def __init__(self, in_channels, out_channels):
        super(EdgeConv, self).__init__(aggr='max')  # "Max" aggregation.
        self.mlp = Seq(Linear(2 * in_channels, out_channels),
                       ReLU(),
                       Linear(out_channels, out_channels))

    def forward(self, x, edge_index):
        # x has shape [N, in_channels]
        # edge_index has shape [2, E]
        return self.propagate(edge_index, size=(x.size(0), x.size(0)), x=x)

    def message(self, x_i, x_j):
        # x_i has shape [E, in_channels]
        # x_j has shape [E, in_channels]
        tmp = torch.cat([x_i, x_j - x_i], dim=1)  # tmp has shape [E, 2 * in_channels]
        return self.mlp(tmp)

    def update(self, aggr_out):
        # aggr_out has shape [N, out_channels]
        return aggr_out
```
EdgeConv is in fact a dynamic convolution: the graph is recomputed in feature space at every layer via k-nearest neighbors, for which PyG provides `torch_geometric.nn.knn_graph`:

```python
from torch_geometric.nn import knn_graph


class DynamicEdgeConv(EdgeConv):
    def __init__(self, in_channels, out_channels, k=6):
        super(DynamicEdgeConv, self).__init__(in_channels, out_channels)
        self.k = k

    def forward(self, x, batch=None):
        # Rebuild the k-NN graph in feature space on every forward pass.
        edge_index = knn_graph(x, self.k, batch, loop=False, flow=self.flow)
        return super(DynamicEdgeConv, self).forward(x, edge_index)
```
```python
conv = DynamicEdgeConv(3, 128, k=6)
x = conv(pos, batch)
```
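As a quick sanity check on made-up data (purely illustrative), the dynamic layer can be run on a batch of point clouds, where `pos` holds 3D coordinates and `batch` assigns each point to a graph; note that `knn_graph` requires the `torch-cluster` package to be installed:

```python
import torch

# Random data: two point clouds of 100 points each, 3D coordinates.
pos = torch.randn(200, 3)
batch = torch.cat([torch.zeros(100, dtype=torch.long),
                   torch.ones(100, dtype=torch.long)])

conv = DynamicEdgeConv(3, 128, k=6)
out = conv(pos, batch)
print(out.shape)  # torch.Size([200, 128])
```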
With the above, you know everything needed to define a custom GNN computation. For more details, see the official tutorial: https://pytorch-geometric.readthedocs.io/en/latest/index.html