Article Outline
Introduction to BERT
BERT, or Bidirectional Encoder Representations from Transformers, is a new method of pre-training language representations which obtains state-of-the-art results on a wide array of Natural Language Processing (NLP) tasks.
The original BERT paper is linked below:
Our academic paper which describes BERT in detail and provides full results on a number of tasks can be found here: https://arxiv.org/abs/1810.04805.