A Collection of Papers on Large-Scale Graph Training

This post compiles recent research on large-scale graph training, covering sampling optimizations such as neighbor sampling and subgraph sampling, as well as adapting training to distributed environments. The papers include FastGCN, Cluster-GCN, GraphSAINT, and others, and discuss how techniques such as historical embeddings and PageRank improve the scalability and expressiveness of graph neural networks.


Large-scale Training: this line of work studies how to train graph neural networks on large graphs, such as the classic large-scale neighbor-sampling approaches of GraphSAGE and PinSAGE. Note that the same ideas also appear in the major frameworks, e.g., DGL, PyG, and Euler. On one hand, the training procedure itself can be improved, for example with neighbor sampling or subgraph sampling (see the sketch below); on the other hand, the training environment can be made distributed. In a distributed setting the original large graph is partitioned into subgraphs spread across different machines, and inter-subgraph communication and cross-partition graph convolution become challenging problems in their own right.
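To make the neighbor-sampling idea concrete, here is a minimal sketch of GraphSAGE-style mini-batch training using PyTorch Geometric's NeighborLoader. The dataset (Cora), fan-outs, hidden size, and learning rate are illustrative assumptions for this sketch, not settings taken from any of the papers listed below.

```python
# A minimal sketch of GraphSAGE-style neighbor sampling with PyTorch Geometric.
# Assumes torch and torch_geometric (>= 2.0) are installed; the dataset and
# hyperparameters are illustrative only.
import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.loader import NeighborLoader
from torch_geometric.nn import SAGEConv

data = Planetoid(root="data/Cora", name="Cora")[0]

# Sample up to 10 first-hop and 5 second-hop neighbors per seed node, so each
# mini-batch is a small sampled subgraph instead of the full graph.
loader = NeighborLoader(
    data,
    num_neighbors=[10, 5],
    batch_size=128,
    input_nodes=data.train_mask,
    shuffle=True,
)

class SAGE(torch.nn.Module):
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.conv1 = SAGEConv(in_dim, hidden_dim)
        self.conv2 = SAGEConv(hidden_dim, out_dim)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)

model = SAGE(data.num_node_features, 64, int(data.y.max()) + 1)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

model.train()
for batch in loader:
    optimizer.zero_grad()
    out = model(batch.x, batch.edge_index)
    # Only the first `batch_size` nodes in each batch are seed nodes; the rest
    # are sampled neighbors that only provide messages.
    loss = F.cross_entropy(out[:batch.batch_size], batch.y[:batch.batch_size])
    loss.backward()
    optimizer.step()
```

Because every mini-batch is a bounded sampled subgraph around a set of seed nodes, memory stays bounded even when the full graph does not fit in GPU memory; DGL exposes an analogous sampling-based dataloading API.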
We have put together a list of recent papers on large-scale graph training:
1. Paper: Inductive Representation Learning on Large Graphs.
Link: https://www.aminer.cn/pub/599c7988601a182cd2648a09
2. Paper: FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling.
Link: https://www.aminer.cn/pub/5a9cb66717c44a376ffb8667
3. Paper: Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks.
Link: https://www.aminer.cn/pub/5cf48a3eda56291d582a1174
4. Paper: GraphSAINT: Graph Sampling Based Inductive Learning Method
Link: https://www.aminer.cn/pub/5e5e18a493d709897ce22b32
5. Paper: GNNAutoScale: Scalable and Expressive Graph Neural Networks via Historical Embeddings
Link: https://www.aminer.cn/pub/60bdde338585e32c38af510f
6. Paper: Scaling Graph Neural Networks with Approximate PageRank
Link: https://www.aminer.cn/pub/5f02f17c91e011ee5e0258c8
7. Paper: Stochastic Training of Graph Convolutional Networks with Variance Reduction.
Link: https://www.aminer.cn/pub/5c8d4bf34895d9cbc64e3332
8. Paper: Adaptive Sampling Towards Fast Graph Representation Learning.
Link: https://www.aminer.cn/pub/5bdc31b817c44a1f58a0c039
9. Paper: SIGN: Scalable Inception Graph Neural Networks
Link: https://www.aminer.cn/pub/5ea2b8bf91e01167f5a89d89
10. Paper: Simplifying Graph Convolutional Networks.
Link: https://www.aminer.cn/pub/5cede109da562983788e9c8b
11. Paper: Deep Graph Neural Networks with Shallow Subgraph Samplers
Link: https://www.aminer.cn/pub/5fc88628dfae549b1c499be1
12. Paper: Scalable Graph Neural Networks via Bidirectional Propagation
Link: https://www.aminer.cn/pub/5f7fdd328de39f0828397afd
13. Paper: A Unified Lottery Ticket Hypothesis for Graph Neural Networks
Link: https://www.aminer.cn/pub/602b8eb491e0113d72356b4f
14. Paper: Scalable and Adaptive Graph Neural Networks with Self-Label-Enhanced training
Link: https://www.aminer.cn/pub/60801e3391e011772654f9bf
15. Paper: GraphZoom: A Multi-level Spectral Approach for Accurate and Scalable Graph Embedding
Link: https://www.aminer.cn/pub/5e5e18b393d709897ce28ad3
16. Paper: Global Neighbor Sampling for Mixed CPU-GPU Training on Giant Graphs.
Link: https://www.aminer.cn/pub/60c7fea791e0110a2be238c4
More high-quality papers are available on AMiner. Add keywords on your homepage and the system will recommend the latest papers for you.
AMiner platform link:
https://www.aminer.cn/?f=cs
