Heap space error while executing logstash with a large dictionary (translate filter)


Question


I get a java.lang.OutOfMemoryError: Java heap space error while executing Logstash with a large dictionary (353 MB) in the translate filter.

I use it to do a lookup on my input data.

I tried to allow the JVM to use more memory (with java -Xmx2048m), but I suppose I am doing it wrong because it has no effect.

I tested my config file with a smaller dictionary and it worked fine. Any help please? How do I give Logstash enough RAM so it does not die?

My config file looks like this:

input {
  file {
    type => "MERGED DATA"
    path => "C:\logstash-1.4.1\bin\..."
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  grok {
    match => [ "message", "..." ]
  }

  if (...) {
    translate {
      dictionary_path => "C:\logstash-1.4.1\bin\DICTIONARY.yaml"
      field => "Contact_ID"
      destination => "DATA"
      fallback => "no match"
      refresh_interval => 60
    }

    grok {
      match => [ "DATA", "..." ]
    }

    mutate { remove_field => ... }
  }
  else if (...) {
    ...
  }
  else if (...) {
    ...
  }

  mutate { ... }
}

output {
  if [rabbit] == "INFO" {
    elasticsearch {
      host => "localhost"
    }
    stdout {}
  }
}
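For reference, the translate filter loads the whole dictionary file into an in-memory hash, which is why a 353 MB file puts heavy pressure on the heap. The dictionary itself is a flat YAML key/value mapping; a minimal hypothetical DICTIONARY.yaml (the real keys and values are not shown in the question) would look like:

# hypothetical entries -- the real file maps Contact_ID values to DATA strings
"100234": "details for contact 100234"
"100235": "details for contact 100235"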

Thanks a lot.


Answer 1:


To increase the heap size, set the LS_HEAP_SIZE environment variable before launching Logstash.

LS_HEAP_SIZE=2048m
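This variable is read by the launch scripts at startup, so it must be set in the environment of the shell you start Logstash from. A minimal sketch, assuming the stock 1.4.x launchers honor the variable on your platform:

# Linux / OS X (bash): set for a single run
LS_HEAP_SIZE=2048m bin/logstash -f logstash.conf

:: Windows (cmd): set before invoking the launcher
set LS_HEAP_SIZE=2048m
bin\logstash.bat -f logstash.conf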




Answer 2:


I was having a similar issue. Mine looked like:

logstash <Sequel::DatabaseError: Java::JavaLang::OutOfMemoryError: Java heap space 

To solve this, I had to add some settings to my Logstash config file. I added the settings below in the jdbc input section:

jdbc_paging_enabled => true
jdbc_page_size => 200000

You can have a look at the related thread for more details.
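For context, here is a sketch of where those two settings live. The connection details below are hypothetical placeholders, not taken from the original answer:

input {
  jdbc {
    # hypothetical connection settings -- replace with your own
    jdbc_driver_library => "mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    statement => "SELECT * FROM big_table"
    # fetch the result set in pages of 200000 rows instead of
    # loading everything into memory at once
    jdbc_paging_enabled => true
    jdbc_page_size => 200000
  }
}

Paging trades one huge result set (and one huge heap allocation) for many smaller queries, which keeps per-batch memory bounded.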



Source: https://stackoverflow.com/questions/27156145/heap-space-error-while-executing-logstash-with-a-large-dictionary-translate-fil
