Can't override Typesafe configuration on the command line in Spark

Submitted by 此生再无相见时 on 2019-12-24 03:40:16

Question


I have a Typesafe Config file application.conf in the src/main/resources folder, which is loaded by default.

A single value can be overridden by specifying:

--conf spark.driver.extraJavaOptions=-DsomeValue="foo"
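A minimal sketch of why this single-key override works (plain Scala, no Spark and no Typesafe Config dependency, so the mechanism is visible on its own): extraJavaOptions are forwarded to the driver JVM as ordinary `-D` flags, i.e. system properties, and `ConfigFactory.load()` layers system properties on top of application.conf.

```scala
// Sketch: -DsomeValue="foo" passed via spark.driver.extraJavaOptions arrives
// in the driver JVM as a plain system property. ConfigFactory.load() merges
// system properties over application.conf, which is why one dotted key can be
// overridden this way. To keep the sketch dependency-free we set and read the
// raw property instead of calling the Typesafe Config API.
sys.props("someValue") = "foo" // what the JVM does with -DsomeValue=foo
println(sys.props("someValue")) // prints: foo
```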

However, specifying a completely new application.conf file that overrides the default one, like:

spark-submit \
    --class my.Class \
    --master "local[2]" \
    --files foo.conf \
    --conf spark.driver.extraClassPath="-Dconfig.file=file:foo.conf" \
    --conf spark.driver.extraJavaOptions=-Dvalue="abcd" \
    job.jar

will fail to load foo.conf. Instead, the original file from the resources folder is loaded. The tricks from Using typesafe config with Spark on Yarn did not help either.

Edit

Overriding multiple config values in Typesafe config when using an uberjar to deploy seems to be the answer for plain (non-Spark) programs. The question remains how to bring this to Spark.

Also passing:

--conf spark.driver.extraClassPath="-Dconfig.resource=file:foo.conf"
--conf spark.driver.extraClassPath="-Dconfig.resource=foo.conf"

fails to load my configuration from the command line.

Though, according to the docs:

From https://github.com/lightbend/config: For applications using application.{conf,json,properties}, system properties can be used to force a different config source (e.g. from the command line, -Dconfig.file=path/to/config-file):

  • config.resource specifies a resource name - not a basename, i.e. application.conf not application
  • config.file specifies a filesystem path, again it should include the extension, not be a basename
  • config.url specifies a URL

These system properties specify a replacement for application.{conf,json,properties}, not an addition. They only affect apps using the default ConfigFactory.load() configuration. In the replacement config file, you can use include "application" to include the original default config file; after the include statement you could go on to override certain settings.

it should be possible with these parameters.
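For completeness, a replacement config file following the include pattern the docs describe might look like this (foo.conf is the filename from the question; the keys shown are hypothetical examples):

```hocon
# foo.conf — a full replacement for application.conf (hypothetical keys)
include "application"        # pull in the packaged defaults first

# then override selected settings
someValue = "foo"
db.connection.timeout = 10s
```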


Answer 1:


spark-submit \
    --class my.Class \
    --master "local[2]" \
    --files foo.conf \
    --conf spark.driver.extraJavaOptions="-Dvalue='abcd' -Dconfig.file=foo.conf" \
    target/scala-2.11/jar-0.1-SNAPSHOT.jar

Changing spark.driver.extraClassPath to spark.driver.extraJavaOptions does the trick: -Dconfig.file is a JVM option, not a classpath entry, so it must be passed via extraJavaOptions.
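The same fix can also be expressed as defaults rather than command-line flags. A sketch of the corresponding spark-defaults.conf entries, assuming foo.conf is shipped with spark.files (the equivalent of --files) so that it lands in each container's working directory; the executor line matters once the job runs in cluster mode rather than local[2]:

```
# Sketch of spark-defaults.conf entries (assumes foo.conf is shipped via
# spark.files and is therefore present in the working directory of the
# driver and each executor)
spark.files                      foo.conf
spark.driver.extraJavaOptions    -Dconfig.file=foo.conf
spark.executor.extraJavaOptions  -Dconfig.file=foo.conf
```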



Source: https://stackoverflow.com/questions/47737103/cant-override-typesafe-configuration-on-commmandline-in-spark
