Tokenizer vs token filters

Submitted by 时光毁灭记忆、已成空白 on 2019-11-28 17:39:33

Question


I'm trying to implement autocomplete using Elasticsearch, thinking that I understand how to do it...

I'm trying to build multi-word (phrase) suggestions by using ES's edge_n_grams while indexing crawled data.

What is the difference between a tokenizer and a token_filter? I've read the docs on both but still need a better understanding of them.

For instance, is a token_filter what ES uses to search against user input? Is a tokenizer what ES uses to make tokens? What is a token?

Is it possible for ES to create multi-word suggestions using any of these things?


Answer 1:


A tokenizer will split the whole input into tokens and a token filter will apply some transformation on each token.

For instance, let's say the input is The quick brown fox. If you use an edgeNGram tokenizer, you'll get the following tokens:

  • T
  • Th
  • The
  • The (last character is a space)
  • The q
  • The qu
  • The qui
  • The quic
  • The quick
  • The quick (last character is a space)
  • The quick b
  • The quick br
  • The quick bro
  • The quick brow
  • The quick brown
  • The quick brown (last character is a space)
  • The quick brown f
  • The quick brown fo
  • The quick brown fox
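The tokenizer case above can be sketched in a few lines of Python. This is a minimal illustration of the behavior, not ES's actual implementation; the function name and defaults are made up for the example:

```python
def edge_ngram_tokenize(text, min_gram=1, max_gram=None):
    """Sketch of an edge_ngram tokenizer applied to the WHOLE input:
    every prefix of the string becomes one token (spaces included)."""
    if max_gram is None:
        max_gram = len(text)
    return [text[:n] for n in range(min_gram, min(max_gram, len(text)) + 1)]

tokens = edge_ngram_tokenize("The quick brown fox")
print(tokens[:4])  # ['T', 'Th', 'The', 'The '] -- note the trailing space
```

Because the tokenizer sees the input before any word splitting, the grams cross word boundaries, which is exactly why the list above contains entries like "The q".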

However, if you use a standard tokenizer, which splits the input into words/tokens, and then an edgeNGram token filter, you'll get the following tokens:

  • T, Th, The
  • q, qu, qui, quic, quick
  • b, br, bro, brow, brown
  • f, fo, fox
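The filter case can be sketched the same way: split first, then apply the edge n-gram transformation to each token separately. Again, this is a rough Python analogy (whitespace splitting is a crude stand-in for ES's standard tokenizer), with invented function names:

```python
def standard_tokenize(text):
    """Crude stand-in for ES's standard tokenizer: split on whitespace."""
    return text.split()

def edge_ngram_filter(tokens, min_gram=1, max_gram=10):
    """Sketch of an edge_ngram token FILTER: expand each token into its
    prefixes, never crossing word boundaries."""
    out = []
    for tok in tokens:
        out.extend(tok[:n] for n in range(min_gram, min(max_gram, len(tok)) + 1))
    return out

grams = edge_ngram_filter(standard_tokenize("The quick brown fox"))
print(grams[:5])  # ['T', 'Th', 'The', 'q', 'qu']
```

Here no gram spans two words: the filter only ever sees one token at a time.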

As you can see, choosing between an edgeNGram tokenizer or token filter depends on how you want to slice and dice your text and how you want to search it.

I suggest having a look at the excellent elyzer tool which provides a way to visualize the analysis process and see what is being produced during each step (tokenizing and token filtering).

As of ES 2.2, the _analyze endpoint also supports an explain feature which shows the details during each step of the analysis process.
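For example, an _analyze request like the following (shown in the modern request-body syntax; the exact parameter names and inline-filter support vary by ES version, so treat this as an assumed sketch rather than the ES 2.2 syntax) returns a per-step breakdown when "explain" is set:

```json
GET /_analyze
{
  "tokenizer": "standard",
  "filter": [ { "type": "edge_ngram", "min_gram": 1, "max_gram": 5 } ],
  "text": "The quick brown fox",
  "explain": true
}
```

The response then lists the tokens emitted by the tokenizer and by each filter separately, which is handy for debugging exactly the tokenizer-vs-filter difference discussed above.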



Source: https://stackoverflow.com/questions/37168764/tokenizer-vs-token-filters
