Pass system property to spark-submit and read file from classpath or custom path

Posted by 拈花ヽ惹草 on 2019-12-05 03:27:39

1. Solving java.io.FileNotFoundException

This is probably unsolvable.

Simply put, SparkContext.addFile cannot read a file from inside the JAR. I believe the JAR is treated as if it were a zip archive or similar, so the file has no plain filesystem path that addFile could resolve.

Fine.
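As a workaround, a file bundled in the fat JAR can be read straight off the classpath via the classloader instead of through SparkContext.addFile. A minimal sketch, assuming a hypothetical logback.xml sitting at the root of the JAR:

```java
import java.io.InputStream;

public class ClasspathConfig {
    // Returns a stream for a resource bundled inside the application JAR,
    // or null when the resource is not on the classpath.
    static InputStream open(String resourcePath) {
        return ClasspathConfig.class.getResourceAsStream(resourcePath);
    }

    public static void main(String[] args) {
        // "/logback.xml" is a hypothetical resource at the JAR root;
        // adjust the path to wherever the file sits in your fat JAR.
        InputStream in = open("/logback.xml");
        System.out.println(in != null ? "found on classpath" : "not on classpath");
    }
}
```

Unlike addFile, the classloader can look inside the JAR, so this works regardless of how the archive is unpacked on the executors.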

2. Passing -Dlogback.configurationFile

This was not working due to my misunderstanding of the configuration parameters.

I am using the --master yarn parameter, but because I do not specify --deploy-mode, it defaults to client.

Reading https://spark.apache.org/docs/1.6.1/configuration.html#application-properties

spark.driver.extraJavaOptions

Note: In client mode, this config must not be set through the SparkConf directly in your application, because the driver JVM has already started at that point. Instead, please set this through the --driver-java-options command line option or in your default properties file.
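Per the note above, in client mode the option can also go into the default properties file instead of the command line. A sketch, assuming the standard conf/spark-defaults.conf location:

```
# conf/spark-defaults.conf
spark.driver.extraJavaOptions  -Dlogback.configurationFile=/path/to/my/logback.xml
```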

So passing this setting with --driver-java-options worked:

spark-submit \
  ...
  --driver-java-options "-Dlogback.configurationFile=/path/to/my/logback.xml" \
  --master yarn \
  --class com.company.Main \
  /path/to/my/application-fat.jar \
  param1 param2 

Note about --driver-java-options

In contrast to --conf, multiple JVM options have to be passed as a single parameter, for example:

--driver-java-options "-Dlogback.configurationFile=/path/to/my/logback.xml -Dother.setting=value" \

The following will not work:

--driver-java-options "-Dlogback.configurationFile=/path/to/my/logback.xml" \
--driver-java-options "-Dother.setting=value" \
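A quick way to confirm that the options actually reached the driver JVM is to read the system properties from the application itself. A sketch; the property names match the -D flags above:

```java
public class CheckDriverProps {
    public static void main(String[] args) {
        // Set by -Dlogback.configurationFile=... in --driver-java-options;
        // null means the flag never reached this JVM.
        String logback = System.getProperty("logback.configurationFile");
        String other = System.getProperty("other.setting");
        System.out.println("logback.configurationFile = " + logback);
        System.out.println("other.setting = " + other);
    }
}
```

If only one of the two prints non-null after using the repeated --driver-java-options form, the later occurrence replaced the earlier one instead of being appended.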