Paper Reading | Complex Embeddings for Simple Link Prediction

This post introduces ComplEx, a complex-valued embedding method for link prediction that aims to better handle symmetric and antisymmetric relations in knowledge bases. By using complex-valued representations, it resolves the inability of real-valued vector representations to effectively distinguish antisymmetric relations. Experiments show that ComplEx performs well on link prediction for both symmetric and antisymmetric relations, and outperforms other models on the WN18 and FB15K datasets.


Théo Trouillon, Johannes Welbl, Sebastian Riedel, Éric Gaussier, Guillaume Bouchard. Complex Embeddings for Simple Link Prediction. In Proceedings of the 33rd International Conference on Machine Learning (ICML 2016), pages 2071–2080.

论文链接:http://proceedings.mlr.press/v48/trouillon16.pdf

In statistical relational learning, link prediction is central to the automated understanding of large-scale knowledge bases. To better capture both symmetric and antisymmetric binary relations in knowledge bases, this paper proposes ComplEx, a representation method based on complex-valued embeddings.

Several lines of work cast link prediction as a 3D binary tensor completion problem, where each slice of the tensor is the adjacency matrix of one relation in the knowledge base.
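As a minimal sketch of this view (the entities, relations, and triples below are hypothetical examples, not from the paper), a knowledge base can be encoded as a 3D binary tensor indexed by (subject, relation, object); link prediction then amounts to scoring the tensor's unobserved cells:

```python
import numpy as np

# Hypothetical toy knowledge base for illustration.
entities = ["alice", "bob", "carol"]
relations = ["knows", "manages"]
triples = [("alice", "knows", "bob"),
           ("bob", "manages", "carol")]

ent_idx = {e: i for i, e in enumerate(entities)}
rel_idx = {r: i for i, r in enumerate(relations)}

# X[i, k, j] = 1 iff (entity_i, relation_k, entity_j) is observed.
X = np.zeros((len(entities), len(relations), len(entities)), dtype=np.int8)
for s, r, o in triples:
    X[ent_idx[s], rel_idx[r], ent_idx[o]] = 1

# Link prediction = filling in the unobserved cells of X.
print(X[ent_idx["alice"], rel_idx["knows"], ent_idx["bob"]])  # 1 (observed)
```

Each frontal slice `X[:, k, :]` is the adjacency matrix of relation `k`; a completion model assigns scores to the zero cells to rank candidate missing facts.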

### Knowledge Graph Link Prediction Frameworks for Retrieval and Reading

Knowledge graph link prediction involves predicting missing links or relationships between entities within a knowledge graph. Several frameworks have been developed for this purpose, enhancing both retrieval and reading capabilities.

#### 1. TransE Model

TransE is one of the foundational knowledge graph embedding models. It represents each entity and relation as a vector in a low-dimensional space, and scores how well a triplet $(h,r,t)$ holds with $f_r(h,t)=\|\mathbf{e}_h+\mathbf{r}-\mathbf{e}_t\|_2$[^1]. The model assumes that a relation translates the head entity to the tail entity through vector addition.

#### 2. Convolutional Knowledge Graph Embeddings (ConvE)

ConvE extends translation-based embeddings such as TransE by reshaping the subject and relation embeddings into two-dimensional matrices, applying convolution layers over the stacked pair, and feeding the result through fully connected layers. The scoring mechanism combines the transformed representation with the object embedding via element-wise multiplication and summation across all dimensions[^2].

```python
import torch
import torch.nn.functional as F
from torch import nn

class ConvE(nn.Module):
    def __init__(self, num_entities, num_relations, embedding_dim=200):
        super().__init__()
        self.entity_embeddings = nn.Embedding(num_entities, embedding_dim)
        self.relation_embeddings = nn.Embedding(num_relations, embedding_dim)
        # 2D convolution applied over the stacked subject/relation "image".
        self.conv = nn.Conv2d(1, 32, kernel_size=3)

    def forward(self, sub, rel):
        # Reshape the 200-d embeddings into 10x20 matrices (example reshape).
        e_s = self.entity_embeddings(sub).view(-1, 1, 10, 20)
        r = self.relation_embeddings(rel).view(-1, 1, 10, 20)
        # Stack subject and relation along the height dimension (-> 20x20).
        stacked_inputs = torch.cat([e_s, r], dim=2)
        x = F.relu(self.conv(stacked_inputs))
        return x
```

#### 3. ComplEx Model

ComplEx addresses limitations of earlier approaches, such as their inability to model symmetric and antisymmetric relations simultaneously.
By introducing complex-valued embeddings in place of purely real ones, it models asymmetric interactions more accurately while keeping computational cost comparable to simpler models[^3]. For applications requiring advanced reasoning beyond simple triplets, architectures combining neural networks with symbolic logic offer promising directions not covered here, but worth exploring depending on specific requirements.
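The asymmetry property can be seen directly in the ComplEx scoring function, Re(⟨w_r, e_s, ē_o⟩), the real part of a trilinear product in which the object embedding is conjugated. Below is a minimal NumPy sketch (with random embeddings, purely for illustration) showing that the score is asymmetric in subject and object whenever the relation embedding has a nonzero imaginary part, and symmetric when it is purely real:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4

# Hypothetical random complex embeddings, for illustration only.
e_s = rng.normal(size=dim) + 1j * rng.normal(size=dim)   # subject
e_o = rng.normal(size=dim) + 1j * rng.normal(size=dim)   # object
w_r = rng.normal(size=dim) + 1j * rng.normal(size=dim)   # relation

def complex_score(w_r, e_s, e_o):
    """ComplEx score: Re(sum_k w_r[k] * e_s[k] * conj(e_o[k]))."""
    return float(np.real(np.sum(w_r * e_s * np.conj(e_o))))

# Swapping subject and object generally changes the score
# when w_r has a nonzero imaginary part (antisymmetric capacity).
print(complex_score(w_r, e_s, e_o))
print(complex_score(w_r, e_o, e_s))

# A purely real relation embedding makes the score symmetric.
w_sym = np.real(w_r) + 0j
assert np.isclose(complex_score(w_sym, e_s, e_o),
                  complex_score(w_sym, e_o, e_s))
```

This is the key point of the paper: the real and imaginary parts of each relation embedding give the model the capacity to represent symmetric and antisymmetric relations within a single scoring function, at the same O(d) cost per triple as a simple dot product.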