Notes on "Self-Supervised Hypergraph Transformer for Recommender Systems"

This paper proposes SHT, a novel self-supervised hypergraph Transformer framework that addresses data noise and skewed distributions in recommender systems by exploring global collaborative relationships among users. SHT maintains global collaborative effects between users and items through a hypergraph neural network, and performs data augmentation via cross-view generative self-supervised learning to improve the robustness of the recommender system.


1 Title

        Self-Supervised Hypergraph Transformer for Recommender Systems (Lianghao Xia, Chao Huang, Chuxu Zhang) [KDD 2022]

2 Conclusion

        User behavior data in many practical recommendation scenarios is often noisy and exhibits a skewed distribution, which may result in suboptimal representation performance in GNN-based models. In this paper, we propose SHT, a novel Self-Supervised Hypergraph Transformer framework which augments user representations by exploring global collaborative relationships in an explicit way. Specifically, we first empower the graph neural CF paradigm to maintain global collaborative effects among users and items with a hypergraph transformer network. With the distilled global context, a cross-view generative self-supervised learning component is proposed for data augmentation over the user-item interaction graph, so as to enhance the robustness of recommender systems.
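The hypergraph message passing behind this idea can be sketched as a two-step attentive layer: nodes (users/items) send messages to a small set of learnable hyperedges, which then propagate a distilled global context back to every node. This is a minimal illustration only; the function names, plain dot-product attention, and single-head setup are assumptions for clarity, not the paper's exact architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def hypergraph_layer(node_emb, hyperedge_emb):
    """One node -> hyperedge -> node propagation step (sketch).

    node_emb:      (N, d) user or item embeddings
    hyperedge_emb: (K, d) learnable hyperedge embeddings, K << N,
                   acting as global "collaborative hubs"
    """
    # Step 1: each hyperedge attends over all nodes and aggregates
    # them into a compact global summary.
    att_ne = softmax(hyperedge_emb @ node_emb.T, axis=1)   # (K, N)
    new_edges = att_ne @ node_emb                          # (K, d)
    # Step 2: each node attends over the updated hyperedges,
    # injecting the distilled global context back into its embedding.
    att_en = softmax(node_emb @ new_edges.T, axis=1)       # (N, K)
    new_nodes = att_en @ new_edges                         # (N, d)
    return new_nodes, new_edges
```

Because every node exchanges information with every hyperedge, two nodes can influence each other even without a shared interaction, which is how global collaborative relations are captured beyond local graph neighborhoods.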

3 Good Sentence
