How to tokenize entire index automatically?

谎友^ 2021-02-07 00:14

I'd like to automatically apply n-gram tokenization on an entire Elasticsearch index.

The docs mention ultimately running an analysis to apply a tokenizer, but the analy
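Not an answer from this thread (the question is unanswered), but a minimal sketch of the usual approach: analyzers are applied at index time, so you define a custom `ngram` tokenizer and analyzer in the index settings when the index is created, attach the analyzer to the relevant field mappings, and reindex existing documents into it. All names here (`my_index`, `my_ngram_tokenizer`, `my_ngram_analyzer`, `title`) are hypothetical placeholders:

```json
PUT /my_index
{
  "settings": {
    "analysis": {
      "tokenizer": {
        "my_ngram_tokenizer": {
          "type": "ngram",
          "min_gram": 2,
          "max_gram": 3
        }
      },
      "analyzer": {
        "my_ngram_analyzer": {
          "type": "custom",
          "tokenizer": "my_ngram_tokenizer"
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "title": { "type": "text", "analyzer": "my_ngram_analyzer" }
    }
  }
}
```

Since you cannot change the analyzer of an existing field in place, data already in an old index would then be copied over with the `_reindex` API, after which every document's `title` field is tokenized into n-grams automatically.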
