Introduction
A few days ago, someone in our chat group asked what to do about GNNs on large graphs: at 100,000 nodes, the 100,000 × 100,000 adjacency matrix alone blows up the machine...

Mini-batch training is indeed one solution: for each node, sample a fixed number of local neighbors and aggregate their features to learn the node's representation. Since you never have to materialize the full adjacency matrix A at once, GNN training becomes feasible on graphs of any size.
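As a concrete illustration, here is a minimal sketch of mini-batch training with PyTorch Geometric's NeighborLoader. The Cora dataset and the hyperparameters (two layers with a fan-out of 10 neighbors each, batch size 128, hidden size 64) are illustrative assumptions, not recommendations.

```python
import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.loader import NeighborLoader
from torch_geometric.nn import SAGEConv

dataset = Planetoid(root="data", name="Cora")
data = dataset[0]

# Sample at most 10 neighbors per node for each of the 2 GNN layers,
# so every batch touches only a small local subgraph, never the full A.
loader = NeighborLoader(
    data,
    num_neighbors=[10, 10],
    batch_size=128,
    input_nodes=data.train_mask,
    shuffle=True,
)

class SAGE(torch.nn.Module):
    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.conv1 = SAGEConv(in_dim, hid_dim)
        self.conv2 = SAGEConv(hid_dim, out_dim)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)

model = SAGE(dataset.num_features, 64, dataset.num_classes)
opt = torch.optim.Adam(model.parameters(), lr=0.01)

model.train()
for batch in loader:
    opt.zero_grad()
    out = model(batch.x, batch.edge_index)
    # Only the first `batch_size` nodes are the seed nodes of this batch;
    # the rest are sampled neighbors that exist only to provide messages.
    loss = F.cross_entropy(out[:batch.batch_size], batch.y[:batch.batch_size])
    loss.backward()
    opt.step()
```

Each batch is a small sampled subgraph containing the 128 seed nodes plus their sampled 1-hop and 2-hop neighbors, so peak memory depends on the fan-outs rather than on the total number of nodes in the graph.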
Neighbor sampling is only the simplest approach, though. In fact, scaling GNNs to large graphs is a dedicated line of research in its own right.
Let me recommend the Awesome GNNs on Large-scale Graphs repository on GitHub, which collects and organizes papers on GNNs for large-scale graphs, especially the various sampling-based acceleration algorithms: node-wise, layer-wise, and subgraph sampling.
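To make that taxonomy concrete before the paper list, here is a toy, pure-Python sketch of the three sampling flavors. It uses uniform sampling everywhere, whereas the papers below design careful (importance) sampling distributions, so treat this as the shape of each approach rather than any paper's actual algorithm; the adjacency list `adj` and all function names are made up for illustration.

```python
import random

# Toy graph as an adjacency list (made up for illustration).
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1, 4], 4: [3]}

def node_wise_sample(targets, fanout, num_layers):
    """Node-wise (GraphSAGE-style): starting from the target batch,
    recursively sample `fanout` neighbors per node, one hop per GNN layer."""
    layers = [set(targets)]
    for _ in range(num_layers):
        frontier = set()
        for v in layers[-1]:
            nbrs = adj[v]
            frontier.update(random.sample(nbrs, min(fanout, len(nbrs))))
        layers.append(frontier)
    return layers  # the nodes needed at each hop

def layer_wise_sample(targets, nodes_per_layer, num_layers):
    """Layer-wise (FastGCN-style): sample a fixed-size node set for each
    layer independently, so the receptive field cannot grow exponentially.
    (FastGCN samples by importance; uniform is a stand-in here.)"""
    all_nodes = list(adj)
    return [set(targets)] + [
        set(random.sample(all_nodes, min(nodes_per_layer, len(all_nodes))))
        for _ in range(num_layers)
    ]

def subgraph_sample(num_nodes):
    """Subgraph sampling (GraphSAINT/Cluster-GCN-style): draw a node set
    once and run a full GNN on the induced subgraph."""
    nodes = set(random.sample(list(adj), min(num_nodes, len(adj))))
    return {v: [u for u in adj[v] if u in nodes] for v in nodes}

print(node_wise_sample([0], fanout=2, num_layers=2))
print(layer_wise_sample([0], nodes_per_layer=3, num_layers=2))
print(subgraph_sample(3))
```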
2021
[ICML 2021] GNNAutoScale: Scalable and Expressive Graph Neural Networks via Historical Embeddings.
2020
[ICLR 2020] GraphSAINT: Graph Sampling Based Inductive Learning Method.
[KDD 2020] Minimal Variance Sampling with Provable Guarantees for Fast Training of Graph Neural Networks.
[ICML Workshop 2020] SIGN: Scalable Inception Graph Neural Networks.
[ICML 2020] Simple and Deep Graph Convolutional Networks.
[NeurIPS 2020] Scalable Graph Neural Networks via Bidirectional Propagation.
2019
[KDD 2019] Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks.
[NeurIPS 2019] Layer-Dependent Importance Sampling for Training Deep and Large Graph Convolutional Networks.
[ICML 2019] Simplifying Graph Convolutional Networks.
2018
[ICLR 2018] FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling.
[KDD 2018] Large-Scale Learnable Graph Convolutional Networks.
[ICML 2018] Stochastic Training of Graph Convolutional Networks with Variance Reduction.
[NeurIPS 2018] Adaptive Sampling Towards Fast Graph Representation Learning.
2017
[NIPS 2017] Inductive Representation Learning on Large Graphs.
GitHub link: https://github.com/Oceanusity/awesome-gnns-on-large-scale-graphs