Relation Extraction

A relation extraction task requires the detection and classification of semantic relationship mentions within a set of artifacts, typically from text or XML documents. The task was introduced at the Message Understanding Conference in 1998, initially restricted to binary relations.
Approaches:

  1. Text-based relation extraction: relies on pretrained relationship structures learned from text
  2. Use of domain ontologies
  3. Visual detection of meaningful relationships in parametric values of objects listed in a data table, whose positions shift as the table is permuted
    (http://en.wikipedia.org/wiki/Relationship_extraction)
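As a concrete illustration of what the task produces, a minimal, hypothetical extractor (not from the sources above) maps a sentence to (head entity, relation, tail entity) triples using a single hand-written surface pattern:

```python
import re

# Hypothetical single-pattern extractor, for illustration only:
# capitalized spans around the cue phrase "was born in" become entities.
BORN_IN = re.compile(
    r"(?P<head>[A-Z][a-z]+(?: [A-Z][a-z]+)*) was born in "
    r"(?P<tail>[A-Z][a-z]+(?: [A-Z][a-z]+)*)"
)

def extract(sentence: str):
    """Return (head, relation, tail) triples found in the sentence."""
    return [(m.group("head"), "born_in", m.group("tail"))
            for m in BORN_IN.finditer(sentence)]

print(extract("Marie Curie was born in Warsaw."))
# → [('Marie Curie', 'born_in', 'Warsaw')]
```

Real systems replace the naive capitalization heuristic with a named entity recognizer, but the output shape (a relation triple) is the same.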

Methods:
There are five different methods of doing Relation Extraction:

  1. Rule-based RE - extract relations through hand-crafted patterns
  • Pros:
    • Humans can create patterns which tend to have high precision
    • Can be tailored to specific domains
  • Cons:
    • A lot of manual work to create all possible rules
    • Have to create rules for every relation type
  2. Weakly Supervised RE - start out with a set of hand-crafted rules and automatically find new ones from the unlabeled text data, through an iterative process (bootstrapping)
  3. Supervised RE - train a classifier on sentences annotated with entity pairs and relation labels
  4. Distantly Supervised RE - automatically generate (noisy) training data by aligning an existing knowledge base with text
  5. Unsupervised RE - extract relations without any labeled data, e.g. by clustering the phrases that link entity pairs (open information extraction)
    (https://medium.com/@andreasherman/different-ways-of-doing-relation-extraction-from-text-7362b4c3169e)
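The bootstrapping loop behind method 2 can be sketched as follows; the seed pattern, toy corpus, and helper functions are illustrative assumptions, not taken from the cited article:

```python
import re

# Toy corpus and one seed pattern for a capital_of relation.
corpus = [
    "Paris is the capital of France.",
    "Berlin is the capital of Germany.",
    "Berlin, the capital city of Germany, hosts the Bundestag.",
    "Madrid, the capital city of Spain, hosts the Cortes.",
]
seed_patterns = [re.compile(r"(\w+) is the capital of (\w+)")]

def apply_patterns(patterns, sentences):
    """Extract (city, country) pairs matched by any pattern."""
    pairs = set()
    for p in patterns:
        for s in sentences:
            for m in p.finditer(s):
                pairs.add((m.group(1), m.group(2)))
    return pairs

def induce_patterns(pairs, sentences):
    """Turn the context between each known pair into a new pattern."""
    new = []
    for city, country in pairs:
        for s in sentences:
            i, j = s.find(city), s.find(country)
            if i != -1 and j != -1 and i < j:
                middle = s[i + len(city):j]
                new.append(re.compile(r"(\w+)" + re.escape(middle) + r"(\w+)"))
    return new

# One bootstrapping iteration:
# seed patterns -> known pairs -> induced patterns -> newly found pairs.
pairs = apply_patterns(seed_patterns, corpus)
patterns = seed_patterns + induce_patterns(pairs, corpus)
expanded = apply_patterns(patterns, corpus)
print(expanded)  # now includes ('Madrid', 'Spain'), unreachable by the seed
```

The seed pattern only matches the first two sentences, but the pattern induced from the third sentence ("X, the capital city of Y") recovers the Madrid/Spain pair; real systems add confidence scoring to keep noisy induced patterns from drifting (semantic drift).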

Models: (https://nlpprogress.com/english/relationship_extraction.html)

End-to-end models use external lexical resources such as WordNet, part-of-speech tags, dependency tags, and named entity tags.

| Model | F1 | Paper / Source | Code |
| --- | --- | --- | --- |
| **BERT-based Models** | | | |
| A-GCN (Tian et al., 2021) | 89.85 | Dependency-driven Relation Extraction with Attentive Graph Convolutional Networks | Official |
| Matching-the-Blanks (Baldini Soares et al., 2019) | 89.5 | Matching the Blanks: Distributional Similarity for Relation Learning | |
| R-BERT (Wu et al., 2019) | 89.25 | Enriching Pre-trained Language Model with Entity Information for Relation Classification | mickeystroller's Reimplementation |
| **CNN-based Models** | | | |
| Multi-Attention CNN (Wang et al., 2016) | 88.0 | Relation Classification via Multi-Level Attention CNNs | lawlietAi's Reimplementation |
| Attention CNN (Huang and Shen, 2016) | 84.3 / 85.9* | Attention-Based Convolutional Neural Network for Semantic Relation Extraction | |
| CR-CNN (dos Santos et al., 2015) | 84.1 | Classifying Relations by Ranking with Convolutional Neural Networks | pratapbhanu's Reimplementation |
| CNN (Zeng et al., 2014) | 82.7 | Relation Classification via Convolutional Deep Neural Network | roomylee's Reimplementation |
| **RNN-based Models** | | | |
| Entity Attention Bi-LSTM (Lee et al., 2019) | 85.2 | Semantic Relation Classification via Bidirectional LSTM Networks with Entity-aware Attention using Latent Entity Typing | Official |
| Hierarchical Attention Bi-LSTM (Xiao and Liu, 2016) | 84.3 | Semantic Relation Classification via Hierarchical Recurrent Neural Network with Attention | |
| Attention Bi-LSTM (Zhou et al., 2016) | 84.0 | Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification | SeoSangwoo's Reimplementation |
| Bi-LSTM (Zhang et al., 2015) | 82.7 / 84.3* | Bidirectional long short-term memory networks for relation classification | |
Dependency Models

| Model | F1 | Paper / Source | Code |
| --- | --- | --- | --- |
| BRCNN (Cai et al., 2016) | 86.3 | Bidirectional Recurrent Convolutional Neural Network for Relation Classification | |
| DRNNs (Xu et al., 2016) | 86.1 | Improved Relation Classification by Deep Recurrent Neural Networks with Data Augmentation | |
| depLCNN + NS (Xu et al., 2015a) | 85.6 | Semantic Relation Classification via Convolutional Neural Networks with Simple Negative Sampling | |
| SDP-LSTM (Xu et al., 2015b) | 83.7 | Classifying Relations via Long Short Term Memory Networks along Shortest Dependency Path | Sshanu's Reimplementation |
| DepNN (Liu et al., 2015) | 83.6 | A Dependency-Based Neural Network for Relation Classification | |
| FCN (Yu et al., 2014) | 83.0 | Factor-based compositional embedding models | |
| MVRNN (Socher et al., 2012) | 82.4 | Semantic Compositionality through Recursive Matrix-Vector Spaces | pratapbhanu's Reimplementation |
New York Times Corpus

| Model | P@10% | P@30% | Paper / Source | Code |
| --- | --- | --- | --- | --- |
| KGPool (Nadgeri et al., 2021) | 92.3 | 86.7 | KGPool: Dynamic Knowledge Graph Context Selection for Relation Extraction | KGPool |
| RECON (Bastos et al., 2021) | 87.5 | 74.1 | RECON: Relation Extraction using Knowledge Graph Context in a Graph Neural Network | RECON |
| HRERE (Xu et al., 2019) | 84.9 | 72.8 | Connecting Language and Knowledge with Heterogeneous Representations for Neural Relation Extraction | HRERE |
| PCNN+noise_convert+cond_opt (Wu et al., 2019) | 81.7 | 61.8 | Improving Distantly Supervised Relation Extraction with Neural Noise Converter and Conditional Optimal Selector | |
| Intra- and Inter-Bag (Ye and Ling, 2019) | 78.9 | 62.4 | Distant Supervision Relation Extraction with Intra-Bag and Inter-Bag Attentions | Code |
| RESIDE (Vashishth et al., 2018) | 73.6 | 59.5 | RESIDE: Improving Distantly-Supervised Neural Relation Extraction using Side Information | RESIDE |
| PCNN+ATT (Lin et al., 2016) | 69.4 | 51.8 | Neural Relation Extraction with Selective Attention over Instances | OpenNRE |
| MIML-RE (Surdeanu et al., 2012) | 60.7+ | - | Multi-instance Multi-label Learning for Relation Extraction | Mimlre |
| MultiR (Hoffmann et al., 2011) | 60.9+ | - | Knowledge-Based Weak Supervision for Information Extraction of Overlapping Relations | MultiR |
| Mintz (Mintz et al., 2009) | 39.9+ | - | Distant supervision for relation extraction without labeled data | |
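The P@N columns in the table above report precision over the top N% of predictions when ranked by model confidence. A minimal sketch of that computation (the prediction data is made up for illustration):

```python
def precision_at(predictions, percent):
    """predictions: list of (confidence, is_correct) pairs, unsorted.
    Rank by confidence, keep the top `percent` of the list, and
    return the fraction of kept predictions that are correct."""
    ranked = sorted(predictions, key=lambda p: p[0], reverse=True)
    k = max(1, int(len(ranked) * percent / 100))
    top = ranked[:k]
    return sum(1 for _, correct in top if correct) / k

# Ten hypothetical predictions with descending confidence scores.
preds = [(0.9, True), (0.8, True), (0.7, False), (0.6, True), (0.5, False),
         (0.4, False), (0.3, True), (0.2, False), (0.1, False), (0.05, False)]
print(precision_at(preds, 30))  # top 30% = 3 predictions, 2 correct
```

Ranking by confidence is why distantly supervised systems are compared at multiple cutoffs: a model can be precise on its most confident predictions (P@10%) yet degrade quickly further down the list (P@30%).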