Question
When I start a Flink standalone cluster, it writes daemon logs to the file specified in conf/log4j.properties, and when I submit a Flink job to that cluster, it uses the same properties file for the application logs and writes to the same log file on the task managers. I want a separate log file for each application submitted to that Flink standalone cluster. Is there any way to achieve that?
Answer 1:
When you submit the job using the ./bin/flink shell script, use the following environment variables to control log file location:
- FLINK_LOG_DIR specifies the directory where the log will appear
- FLINK_IDENT_STRING allows you to make the filename unique
For example, if you start your job with
FLINK_LOG_DIR=/var/log FLINK_IDENT_STRING=my_app_id ./bin/flink run /path/to/the.jar
then the logs will appear in /var/log/flink-my_app_id-client-$HOSTNAME.log
Note that this only applies to messages logged via the logging framework, not to output printed directly to stdout.
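The approach above can be wrapped in a small submission script so every application automatically gets its own log file. This is only a sketch: the app id, log directory, and jar path below are placeholder values, not anything prescribed by Flink, and the actual flink run call is commented out since it requires a Flink installation.

```shell
#!/usr/bin/env bash
# Hypothetical wrapper: submit each Flink job with a per-application log file.
APP_ID="my_app_id"            # placeholder: unique identifier per application
JAR_PATH="/path/to/the.jar"   # placeholder: your job jar

# The ./bin/flink script reads these to decide where the client log goes.
export FLINK_LOG_DIR=/var/log/flink-apps
export FLINK_IDENT_STRING="$APP_ID"

# Resulting client log path follows the pattern described above:
#   $FLINK_LOG_DIR/flink-$FLINK_IDENT_STRING-client-$(hostname).log
echo "Client log: $FLINK_LOG_DIR/flink-$FLINK_IDENT_STRING-client-$(hostname).log"

# ./bin/flink run "$JAR_PATH"   # uncomment on a machine with Flink installed
```

Because the variables are exported before the flink invocation, each job started through a wrapper like this is isolated in its own log file without touching conf/log4j.properties.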
Source: https://stackoverflow.com/questions/40487401/how-to-specify-log-file-different-from-daemon-log-file-while-submitting-a-flink