Paper Collection | How to Train GNNs on Large-Scale Graphs?

Introduction

A few days ago, someone in the discussion group asked: what should we do about GNNs on large graphs? A 100,000 × 100,000 adjacency matrix already blows up the machine's memory...

Mini-batch training is indeed one solution: sample each node's local neighborhood and aggregate the sampled neighbors' information to learn node representations. This avoids materializing the full adjacency matrix A at once, so GNN training becomes feasible on graphs of any size.
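To make this concrete, here is a minimal NumPy sketch of GraphSAGE-style mini-batch training: for each target node we uniformly sample a few neighbors, mean-aggregate their features, and combine with the node's own features through two linear maps. The toy graph, feature dimensions, and weight matrices are all made-up illustrative values, not any particular library's API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph as an adjacency list: node -> list of neighbors.
# A real large graph would be stored in CSR form; a dict suffices here.
adj = {
    0: [1, 2, 3],
    1: [0, 2],
    2: [0, 1, 3],
    3: [0, 2],
}

# Random node features (4 nodes, 8-dim); placeholder values.
features = rng.normal(size=(4, 8))

def sample_neighbors(node, fanout):
    """Uniformly sample up to `fanout` neighbors of `node` (node-wise sampling)."""
    neighbors = adj[node]
    k = min(fanout, len(neighbors))
    return rng.choice(neighbors, size=k, replace=False)

def sage_mean_layer(batch, fanout, W_self, W_neigh):
    """One GraphSAGE-style layer: mean-aggregate sampled neighbor features,
    combine with the node's own features, then apply ReLU."""
    out = []
    for v in batch:
        nbrs = sample_neighbors(v, fanout)
        h_neigh = features[nbrs].mean(axis=0)         # mean aggregation
        h = features[v] @ W_self + h_neigh @ W_neigh  # combine self + neighborhood
        out.append(np.maximum(h, 0.0))                # ReLU
    return np.stack(out)

# Hypothetical weights for an 8 -> 4 dimensional layer.
W_self = rng.normal(size=(8, 4))
W_neigh = rng.normal(size=(8, 4))

# A mini-batch of two target nodes; only their sampled neighborhoods
# are touched, never the full adjacency matrix.
h_batch = sage_mean_layer([0, 3], fanout=2, W_self=W_self, W_neigh=W_neigh)
print(h_batch.shape)  # (2, 4)
```

Stacking several such layers recovers multi-hop neighborhoods, which is exactly where the sampling strategies surveyed below start to differ.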

The above is only the simplest approach. In fact, GNN training on large-scale graph data is a dedicated line of research.

I recommend the Awesome GNNs on Large-scale Graphs repository on GitHub, which collects and organizes papers on GNNs for large-scale graphs, especially various accelerated sampling algorithms such as node-wise, layer-wise, and subgraph sampling.
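The motivation behind these sampling families can be seen from a back-of-the-envelope cost count. Node-wise sampling (GraphSAGE-style) expands each target's neighborhood recursively, so the number of sampled nodes grows roughly as fanout^L per target over L layers, while layer-wise sampling (FastGCN-style) draws a fixed per-layer budget shared by the whole batch. The formulas below are simplified upper bounds for illustration (real neighborhoods overlap, so actual costs are lower):

```python
def node_wise_cost(batch_size, fanout, num_layers):
    # Each target node spawns 1 + fanout + fanout^2 + ... sampled nodes.
    return batch_size * sum(fanout**l for l in range(num_layers + 1))

def layer_wise_cost(batch_size, budget, num_layers):
    # Each layer samples a fixed budget of nodes, shared across the batch.
    return batch_size + budget * num_layers

print(node_wise_cost(1024, 10, 2))    # 1024 * (1 + 10 + 100) = 113664
print(layer_wise_cost(1024, 4096, 2)) # 1024 + 4096 * 2 = 9216
```

Subgraph sampling (e.g. Cluster-GCN, GraphSAINT) sidesteps the expansion entirely by training on small induced subgraphs, so the per-batch cost is simply the subgraph size.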

2021


  • [ICML 2021] GNNAutoScale: Scalable and Expressive Graph Neural Networks via Historical Embeddings.

2020


  • [ICLR 2020] GraphSAINT: Graph Sampling Based Inductive Learning Method.

  • [KDD 2020] Minimal Variance Sampling with Provable Guarantees for Fast Training of Graph Neural Networks.

  • [ICML Workshop 2020] SIGN: Scalable Inception Graph Networks.

  • [ICML 2020] Simple and Deep Graph Convolutional Networks.

  • [NeurIPS 2020] Scalable Graph Neural Networks via Bidirectional Propagation.

2019


  • [KDD 2019] Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks.

  • [NeurIPS 2019] Layer-Dependent Importance Sampling for Training Deep and Large Graph Convolutional Networks.

  • [ICML 2019] Simplifying Graph Convolutional Networks.

2018


  • [ICLR 2018] FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling.

  • [KDD 2018] Large-Scale Learnable Graph Convolutional Networks.

  • [ICML 2018] Stochastic Training of Graph Convolutional Networks with Variance Reduction.

  • [NeurIPS 2018] Adaptive Sampling Towards Fast Graph Representation Learning.

2017


  • [NIPS 2017] Inductive Representation Learning on Large Graphs.

GitHub link: https://github.com/Oceanusity/awesome-gnns-on-large-scale-graphs
