How to specify a log file different from the daemon log file when submitting a Flink job to a standalone Flink cluster

Posted by 十年热恋 on 2020-01-05 05:10:28

Question


When I start a Flink standalone cluster, it writes the daemon logs to the file configured in conf/log4j.properties. When I then submit a Flink job to that cluster, the same properties file is used for the application logs, so they end up in the same log files on the TaskManagers. I want a separate log file for each application submitted to the standalone cluster. Is there any way to achieve that?
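For context, a log4j 1.x file appender of the kind referred to here typically looks something like the sketch below (the appender name and pattern are illustrative, not necessarily Flink's verbatim defaults). All processes write to the single file resolved from ${log.file}, which the startup scripts set per daemon, so job output lands in the same files as the daemon logs:

    # minimal sketch of a conf/log4j.properties file appender (illustrative)
    log4j.rootLogger=INFO, file
    log4j.appender.file=org.apache.log4j.FileAppender
    log4j.appender.file.file=${log.file}
    log4j.appender.file.append=false
    log4j.appender.file.layout=org.apache.log4j.PatternLayout
    log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n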


Answer 1:


When you submit the job using the ./bin/flink shell script, use the following environment variables to control the log file location:

  • FLINK_LOG_DIR specifies the directory where the log will appear
  • FLINK_IDENT_STRING allows you to make the filename unique

For example, if you start your job with

FLINK_LOG_DIR=/var/log FLINK_IDENT_STRING=my_app_id ./bin/flink run /path/to/the.jar

then the logs will appear in /var/log/flink-my_app_id-client-$HOSTNAME.log

Note that this only applies to messages logged via the logging frameworks, not to anything written directly to stdout.
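If you submit jobs regularly, you can wrap this in a small helper script so each submission gets its own log directory and identifier. This is a minimal sketch assuming the two environment variables behave as described above; the script name and directory layout are hypothetical:

    #!/usr/bin/env bash
    # submit_with_logs.sh (hypothetical helper)
    # Usage: ./submit_with_logs.sh <app_id> /path/to/the.jar [job args...]
    set -euo pipefail

    APP_ID="$1"; shift
    JAR="$1"; shift

    # Both variables are read by the ./bin/flink launcher:
    #   FLINK_LOG_DIR      -> directory the client log is written to
    #   FLINK_IDENT_STRING -> becomes part of the log file name
    export FLINK_LOG_DIR="/var/log/flink/${APP_ID}"
    export FLINK_IDENT_STRING="${APP_ID}"

    mkdir -p "${FLINK_LOG_DIR}"
    exec ./bin/flink run "${JAR}" "$@"

With this, each job's client log appears under its own directory, e.g. /var/log/flink/my_app_id/flink-my_app_id-client-$HOSTNAME.log.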



Source: https://stackoverflow.com/questions/40487401/how-to-specify-log-file-different-from-daemon-log-file-while-submitting-a-flink
