Is there a way to “escape” ElasticSearch stop words?

Asked by 抹茶落季 · 2021-01-24 15:43

I am fairly new to ElasticSearch and have a question on stop words. I have an index that contains state names for the USA, e.g. New York/NY, California/CA, Oregon/OR. I belie…

1 Answer

    傲寒 · 2021-01-24 16:21

    You can (and definitely should) control the way you index data by modifying your mapping according to your data and the way you want to search against it.

    In your case I would disable stopwords for that specific field rather than modifying the stopword list, but you could do the latter too if you wish to. The point is that you're using the default mapping which is great to start with, but as you can see you need to tweak it depending on your needs.
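    A minimal sketch of that first option (a per-field analyzer with no stop filter), assuming a hypothetical index named "states", a field named "name", and the Elasticsearch 7.x Python client; none of these names come from the question, and the exact client call signature varies between client versions:

        from elasticsearch import Elasticsearch

        es = Elasticsearch("http://localhost:9200")

        # Custom analyzer: standard tokenizer + lowercase, but no "stop" token filter,
        # so abbreviations such as "OR" or "IN" are kept instead of being dropped.
        # (A custom analyzer of type "standard" with "stopwords": "_none_" works too.)
        es.indices.create(
            index="states",                       # hypothetical index name
            body={
                "settings": {
                    "analysis": {
                        "analyzer": {
                            "keep_stopwords": {
                                "type": "custom",
                                "tokenizer": "standard",
                                "filter": ["lowercase"],
                            }
                        }
                    }
                },
                "mappings": {
                    "properties": {
                        "name": {                 # hypothetical field name
                            "type": "text",
                            "analyzer": "keep_stopwords",
                        }
                    }
                },
            },
        )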

    For each field, you can specify which analyzer to use. An analyzer defines how your text is split into the tokens that get indexed (the tokenizer) and which additional transformations are applied to each token, such as removing tokens or adding new ones (token filters).
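    To see that pipeline in action, you can push sample text through the Analyze API and inspect the resulting tokens. A small sketch, reusing the hypothetical "states" index and "keep_stopwords" analyzer from above (the exact token output depends on your Elasticsearch version and stop word list):

        # An analyzer with an English stop filter drops "or"; the custom one keeps it.
        with_stop = es.indices.analyze(
            body={"analyzer": "english", "text": "Oregon OR"}
        )
        without_stop = es.indices.analyze(
            index="states",
            body={"analyzer": "keep_stopwords", "text": "Oregon OR"},
        )

        print([t["token"] for t in with_stop["tokens"]])     # e.g. ['oregon']
        print([t["token"] for t in without_stop["tokens"]])  # e.g. ['oregon', 'or']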

    You can specify your mapping either while creating your index or update it afterwards using the put mapping api (as long as the changes you make are backwards compatible).
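    For the update-afterwards route, a hedged sketch of the Put Mapping API using the same hypothetical names. The backwards-compatibility caveat matters here: you can add new fields or compatible options to an existing index, but you cannot switch the analyzer of an already-mapped field without reindexing:

        # Add a new (hypothetical) field that uses the stopword-free analyzer; changing
        # the analyzer of an existing field in place is not allowed.
        es.indices.put_mapping(
            index="states",
            body={
                "properties": {
                    "abbreviation": {             # hypothetical new field
                        "type": "text",
                        "analyzer": "keep_stopwords",
                    }
                }
            },
        )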
