Elasticsearch's N-gram tokenizer for digits, English letters, and similar text
Elasticsearch provides a tokenizer called the N-gram tokenizer. The official documentation introduces it as follows:
N-gram tokenizer
The ngram tokenizer first breaks text down into words whenever it encounters one of a list of specified characters, then it emits N-grams of each word of the specified length.
N-grams are ...
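To make the behavior described above concrete, here is a minimal Python sketch (not Elasticsearch's actual implementation) of the two steps: split the text into words on non-alphanumeric characters, then emit every substring of each word whose length falls between `min_gram` and `max_gram` (Elasticsearch's defaults are 1 and 2).

```python
import re

def ngram_tokenize(text, min_gram=1, max_gram=2):
    """Illustrative sketch of N-gram tokenization.

    Step 1: split the text into words on non-alphanumeric characters
    (a rough stand-in for the tokenizer's token_chars setting).
    Step 2: for each word, slide a window over every position and emit
    each substring whose length is between min_gram and max_gram.
    """
    tokens = []
    for word in re.findall(r"[A-Za-z0-9]+", text):
        for i in range(len(word)):
            for n in range(min_gram, max_gram + 1):
                if i + n <= len(word):
                    tokens.append(word[i:i + n])
    return tokens

# With the defaults, "Quick Fox" yields unigrams and bigrams of each word:
# Q, Qu, u, ui, i, ic, c, ck, k, F, Fo, o, ox, x
print(ngram_tokenize("Quick Fox"))
```

This is why the N-gram tokenizer works well for partial matching on digits and letters: every short substring of the original token becomes searchable.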
Original post · 2022-04-14 17:57:49