- ERNIE: Enhanced Language Representation with Informative Entities. (ACL 2019). Uses Wikipedia as the text-corpus input and WikiData as the knowledge-graph input; the lower layers model the text, while the upper layers integrate the knowledge information.
- COMET: Commonsense Transformers for Automatic Knowledge Graph Construction. (ACL 2019)
- KnowBERT: Knowledge Enhanced Contextual Word Representations. (EMNLP 2019)
- WKLM: Pretrained Encyclopedia: Weakly Supervised Knowledge-Pretrained Language Model. (ICLR 2020). A weakly supervised approach: given text whose entity mentions are linked to WikiData, some mentions are replaced, and during training the model predicts whether each mention was replaced, with a cross-entropy loss (see the sketch after this list).
- K-Adapter: Infusing Knowledge into Pre-Trained Models with Adapters. (2020)
- KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation. (TACL 2020)
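A minimal sketch of the WKLM replacement-detection objective described above. It assumes an upstream encoder (e.g. BERT) has already produced hidden states; all names and dimensions here are illustrative, not from the paper's released code.

```python
import torch
import torch.nn as nn

class WKLMHead(nn.Module):
    """Sketch: classify each entity mention as replaced vs. original,
    trained with cross-entropy, as in WKLM's weak supervision setup."""

    def __init__(self, hidden_size: int = 768):
        super().__init__()
        self.classifier = nn.Linear(hidden_size, 2)  # replaced / original
        self.loss_fn = nn.CrossEntropyLoss()

    def forward(self, hidden_states, mention_spans, labels):
        # hidden_states: (batch, seq_len, hidden) from the encoder
        # mention_spans: list of (batch_idx, start, end) per mention
        # labels: (num_mentions,) long tensor, 1 = replaced, 0 = original
        pooled = torch.stack([
            hidden_states[b, s:e].mean(dim=0)  # mean-pool the mention span
            for b, s, e in mention_spans
        ])
        logits = self.classifier(pooled)       # (num_mentions, 2)
        return self.loss_fn(logits, labels)
```

Mean-pooling the span is one reasonable choice; the key point is only that the supervision signal is the binary replaced/original label per linked mention.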
ERNIE: Enhanced Language Representation with Informative Entities
From a Tsinghua team; learns pretrained representations by combining a knowledge graph with BERT, using a Transformer as the backbone. The T-Encoder takes ordinary tokens as input; the K-Encoder layers take both tokens and entities, fuse the token and entity representations, and feed the fused result to the next layer. During masking, entities are masked so the model learns to recover them.
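A sketch of the K-Encoder's token-entity fusion step, assuming the dimensions and a simplification: tokens without an aligned entity simply skip the entity term via a mask, whereas the paper applies a separate transform for that case. All module names are illustrative.

```python
import torch
import torch.nn as nn

class EntityFusion(nn.Module):
    """Sketch of ERNIE's K-Encoder fusion: tokens aligned with an
    entity combine both representations into a shared hidden state,
    which is projected back into separate token and entity outputs
    that serve as inputs to the next layer."""

    def __init__(self, d_token: int = 768, d_entity: int = 100):
        super().__init__()
        self.w_t = nn.Linear(d_token, d_token)    # token -> fused space
        self.w_e = nn.Linear(d_entity, d_token)   # entity -> fused space
        self.out_t = nn.Linear(d_token, d_token)  # fused -> token output
        self.out_e = nn.Linear(d_token, d_entity) # fused -> entity output
        self.act = nn.GELU()

    def forward(self, tok, ent, ent_mask):
        # tok: (batch, seq, d_token) token states after self-attention
        # ent: (batch, seq, d_entity) aligned entity embeddings
        #      (zeros where no entity is aligned)
        # ent_mask: (batch, seq, 1), 1.0 where a token has an entity
        fused = self.act(self.w_t(tok) + ent_mask * self.w_e(ent))
        new_tok = self.act(self.out_t(fused))  # next-layer token input
        new_ent = self.act(self.out_e(fused))  # next-layer entity input
        return new_tok, new_ent
```

The masked-entity objective then asks the model to predict the aligned entity for masked alignments, analogous to masked-token prediction but over the entity vocabulary.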