Self-supervised Learning and Pre-training: research on self-supervised learning and pre-training for graphs. Pre-training has been remarkably successful in NLP, and researchers hope similar techniques can be applied to graphs so that pre-trained models better support downstream tasks. Classic self-supervised approaches include contrastive learning (a minimal sketch of the contrastive objective follows the paper list below).
1. Paper: Strategies for Pre-training Graph Neural Networks
Link: https://www.aminer.cn/pub/5e5e18eb93d709897ce3ce41
2. Paper: CommDGI: Community Detection Oriented Deep Graph Infomax
Link: https://www.aminer.cn/pub/5f8ebbb99fced0a24b4e1994
3. Paper: Inductive Representation Learning on Large Graphs
Link: https://www.aminer.cn/pub/599c7988601a182cd2648a09
4. Paper: InfoGraph: Unsupervised and Semi-supervised Graph-Level Representation Learning via Mutual Information Maximization
Link: https://www.aminer.cn/pub/5e5e189a93d709897ce1e760
5. Paper: GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training
Link: https://www.aminer.cn/pub/5eede0b091e0116a23aafbd3
6. Paper: Contrastive Multi-View Representation Learning on Graphs
Link: https://www.aminer.cn/pub/5ede0553e06a4c1b26a8419c
7. Paper: Graph Contrastive Learning with Augmentations
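Several of the papers above (e.g., GCC and Graph Contrastive Learning with Augmentations) build on a contrastive objective. As a rough illustration only, not any single paper's implementation, the PyTorch sketch below shows the common InfoNCE/NT-Xent form of that objective: embeddings of two augmented views of the same graph are treated as a positive pair, while the other graphs in the batch serve as negatives. The GNN encoder is left abstract; z1 and z2 stand in for the [batch, dim] embeddings it produces for the two views.

```python
# Minimal sketch of an InfoNCE/NT-Xent contrastive loss for graph
# representations (illustrative; not tied to any one paper's code).

import torch
import torch.nn.functional as F

def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor,
                  temperature: float = 0.5) -> torch.Tensor:
    """Contrastive loss between two views of the same batch of graphs.

    z1, z2: [batch, dim] embeddings of view 1 and view 2, where row i of
    both tensors comes from the same underlying graph.
    """
    # Unit-normalize so that dot products are cosine similarities.
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    # [batch, batch] similarity matrix between all view-1/view-2 pairs.
    logits = z1 @ z2.t() / temperature
    # Positives sit on the diagonal: row i should match column i.
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)

# Toy usage: random embeddings standing in for GNN outputs on two views.
if __name__ == "__main__":
    z1, z2 = torch.randn(8, 64), torch.randn(8, 64)
    print(info_nce_loss(z1, z2))
```

In practice the two views come from graph augmentations (node dropping, edge perturbation, subgraph sampling, attribute masking, etc.), and the same encoder is applied to both before computing this loss.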
