1. Overview
1.1 Requirements
(1) Python 3.5
(2) TensorFlow 1.3
(3) docopt
Install command: conda install docopt
1.2 Brief Introduction
Four implementations of Graph Neural Networks are provided:
(1) Gated Graph Neural Networks (one implementation using dense adjacency matrices and a sparse variant). The dense version is faster for small or dense graphs, including the molecules dataset (though the difference is small there). In contrast, the sparse version is faster for large, sparse graphs, especially in cases where a dense representation of the adjacency matrix would require prohibitively large amounts of memory.
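The dense/sparse trade-off described above can be illustrated with a toy example (not taken from the repository): a dense adjacency matrix always costs O(N^2) memory regardless of edge count, while a sparse edge list costs O(E).

```python
# Sketch (illustrative only, not the repo's data structures): two ways
# to store a graph's adjacency, mirroring the dense vs. sparse variants.

def dense_adjacency(num_nodes, edges):
    """N x N matrix: O(N^2) memory regardless of the number of edges."""
    adj = [[0] * num_nodes for _ in range(num_nodes)]
    for src, dst in edges:
        adj[src][dst] = 1
    return adj

def sparse_adjacency(edges):
    """Edge list: O(E) memory, far cheaper for large sparse graphs."""
    return list(edges)

edges = [(0, 1), (1, 2), (2, 0)]
dense = dense_adjacency(1000, edges)   # 1,000,000 entries for 3 edges
sparse = sparse_adjacency(edges)       # just 3 entries
```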
(2) Asynchronous Gated Graph Neural Networks. Asynchronous GNNs do not propagate information from all nodes to all neighbouring nodes at each timestep; instead, they follow an update schedule in which messages are propagated in sequence. This implementation is far less efficient (due to the small number of updates at each step), but a single propagation round (i.e., performing each propagation step along a few edges once) can suffice to propagate messages across a large graph.
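The scheduling idea can be sketched in a few lines of plain Python (a hypothetical illustration, not the repo's implementation): one ordered pass over the edges can carry a signal across a whole path graph, whereas one synchronous step only moves it to immediate neighbours.

```python
# Sketch (hypothetical): asynchronous vs. synchronous propagation of a
# scalar signal; "propagation" here is just taking the max over sources.

def async_round(values, schedule):
    """Update nodes one edge at a time, in schedule order."""
    for src, dst in schedule:
        values[dst] = max(values[dst], values[src])
    return values

def sync_step(values, edges):
    """Update all nodes simultaneously from the previous state."""
    old = list(values)
    for src, dst in edges:
        values[dst] = max(values[dst], old[src])
    return values

# Path graph 0 -> 1 -> 2 -> 3; a signal starts at node 0.
edges = [(0, 1), (1, 2), (2, 3)]
print(async_round([1, 0, 0, 0], edges))  # [1, 1, 1, 1]: one round spans the path
print(sync_step([1, 0, 0, 0], edges))    # [1, 1, 0, 0]: one step reaches node 1
```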
(3) Graph Convolutional Networks (sparse)
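For reference, the GCN propagation rule from Kipf & Welling (2016) is H' = ReLU(A_hat · H · W) with A_hat = D^{-1/2}(A + I)D^{-1/2}. A minimal plain-Python sketch of one such layer (not taken from chem_tensorflow_gcn.py, which uses TensorFlow ops):

```python
# Sketch (assumed): one GCN layer H' = ReLU(A_hat @ H @ W),
# with A_hat the symmetrically normalized adjacency plus self-loops.
import math

def matmul(a, b):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def normalized_adjacency(adj):
    n = len(adj)
    a_hat = [[adj[i][j] + (1 if i == j else 0) for j in range(n)]
             for i in range(n)]          # add self-loops: A + I
    deg = [sum(row) for row in a_hat]    # node degrees
    return [[a_hat[i][j] / math.sqrt(deg[i] * deg[j]) for j in range(n)]
            for i in range(n)]           # D^{-1/2} (A + I) D^{-1/2}

def gcn_layer(adj, h, w):
    out = matmul(matmul(normalized_adjacency(adj), h), w)
    return [[max(0.0, v) for v in row] for row in out]  # ReLU

adj = [[0, 1], [1, 0]]                   # two connected nodes
h = [[1.0, 0.0], [0.0, 1.0]]             # node features
w = [[0.5, 0.0], [0.0, 0.5]]             # layer weights
print(gcn_layer(adj, h, w))              # [[0.25, 0.25], [0.25, 0.25]]
```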
2. Usage
2.1 Downloading the Data
Run get_data.py. This requires the rdkit Python package:
conda install -c rdkit rdkit
To run conda commands, open a command window for the corresponding environment from the Anaconda application.
If import rdkit raises "ImportError: DLL load failed: The specified module could not be found":
This error can appear when using the Anaconda interpreter from within PyCharm. Fix:
Make sure the Anaconda environment uses Python 3.6!
Data download link: https://ndownloader.figshare.com/files/3195389/data/dsgdb9nsd.xyz.tar.bz2
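The downloaded molecule files are in an XYZ-like text format. A minimal parser for a standard XYZ block might look like this (illustrative only; get_data.py does its own preprocessing with rdkit):

```python
# Sketch (assumed): parse a standard XYZ block into a comment line and
# a list of (element, (x, y, z)) atoms.

def parse_xyz(text):
    lines = text.strip().splitlines()
    num_atoms = int(lines[0])            # line 1: atom count
    comment = lines[1]                   # line 2: comment / properties
    atoms = []
    for line in lines[2:2 + num_atoms]:  # one "element x y z" per atom
        parts = line.split()
        atoms.append((parts[0], tuple(float(v) for v in parts[1:4])))
    return comment, atoms

sample = """3
water
O 0.000 0.000 0.117
H 0.000 0.757 -0.471
H 0.000 -0.757 -0.471"""
comment, atoms = parse_xyz(sample)
print(len(atoms))  # 3
```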
2.2 Running
(1) To run the dense Gated Graph Neural Network, use
python3 ./chem_tensorflow_dense.py
(2) To run the sparse Gated Graph Neural Network, use
python3 ./chem_tensorflow_sparse.py
(3) To run the sparse Graph Convolutional Network (as in Kipf et al. 2016), use
python3 ./chem_tensorflow_gcn.py
(4) Finally, it turns out that the extension of GCNs to different edge types is a variant of GGNNs, and you can run this GCN variant (as in Schlichtkrull et al. 2017) by calling
python3 ./chem_tensorflow_sparse.py --config '{"use_edge_bias": false, "use_edge_msg_avg_aggregation": true, "residual_connections": {}, "layer_timesteps": [1,1,1,1,1,1,1,1], "graph_rnn_cell": "RNN", "graph_rnn_activation": "ReLU"}'
(5) To run the asynchronous Gated Graph Neural Network, use
python3 ./chem_tensorflow_async.py
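The --config flag used in call (4) above accepts a JSON string whose keys override default hyperparameters. A minimal sketch of that merge (assumed; the default values below are illustrative, not the repo's actual defaults):

```python
# Sketch (assumed, not the repo's code): merging a --config JSON string
# into default hyperparameters, as in the chem_tensorflow_*.py calls above.
import json

DEFAULTS = {"use_edge_bias": True,          # illustrative defaults only
            "graph_rnn_cell": "GRU",
            "layer_timesteps": [2, 2, 1, 2, 1]}

def apply_config(defaults, config_str):
    params = dict(defaults)
    params.update(json.loads(config_str))   # values from --config win
    return params

params = apply_config(DEFAULTS,
                      '{"graph_rnn_cell": "RNN", "use_edge_bias": false}')
print(params["graph_rnn_cell"])  # RNN
```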
2.3 Restoring a Model
Suppose you have trained a model; for example, the following trains for a single epoch:
python3 ./chem_tensorflow_dense.py --config '{"num_epochs": 1}'
== Epoch 1
Train: loss: 0.52315 | acc: 0:0.64241 | error_ratio: 0:9.65831 | instances/sec: 6758.04
Valid: loss: 0.26930 | acc: 0:0.55949 | error_ratio: 0:8.41163 | instances/sec: 9902.71
(Best epoch so far, cum. val. acc decreased to 0.55949 from inf. Saving to './2018-02-01-11-30-05_16306_model_best.pickle')
Note that a checkpoint was stored to './2018-02-01-11-30-05_16306_model_best.pickle'. To restore this model and continue training, use:
python3 ./chem_tensorflow_dense.py --restore ./2018-02-01-11-30-05_16306_model_best.pickle
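The checkpoint/restore cycle can be sketched as follows (a hypothetical illustration; the repo's actual pickle contents are more involved than a bare parameter dict):

```python
# Sketch (hypothetical): saving model state to a pickle file and
# restoring it to continue training, mirroring the --restore flow above.
import os
import pickle
import tempfile

def save_checkpoint(path, state):
    with open(path, "wb") as f:
        pickle.dump(state, f)

def restore_checkpoint(path):
    with open(path, "rb") as f:
        return pickle.load(f)

path = os.path.join(tempfile.mkdtemp(), "model_best.pickle")
save_checkpoint(path, {"epoch": 1, "weights": [0.1, 0.2]})
restored = restore_checkpoint(path)
print(restored["epoch"])  # 1
```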
References
[1] Graph-to-Sequence Learning using Gated Graph Neural Networks
[2] Paper notes: GGNN (Gated Graph Neural Networks)
[3] Reading notes on "Gated Graph Sequence Neural Networks", combined with the code
[4] Notes: Gated Graph Sequence Neural Networks
Code
[1] microsoft/gated-graph-neural-network-samples
[2] yujiali/ggnn
[3] calebmah/ggnn.pytorch
[4] JamesChuanggg/ggnn.pytorch
Papers
[1] Gated graph sequence neural networks