Typesafe Config in Spark

Submitted by 蓝咒 on 2019-12-21 17:42:06

Question


I've defined a default configuration for my Spark application, tucked into src/main/resources/reference.conf. I use ConfigFactory.load() to obtain the configuration.

When I run the application with spark-submit it picks up these defaults. However, when I only want to override a few of the configurations available in reference.conf and provide application.conf, it does not seem to pick up these overrides. From the documentation I thought that application.conf is merged with reference.conf when calling load(), so that it's not necessary to re-define everything in application.conf.

My reference.conf looks like this:

hdfs {
  rootDir: "/foo"
  dataDir: "hdfs://"${hdfs.rootDir}"/bar"
}

db {
  driver: "com.mysql.jdbc.Driver"
  ...
}

...

What I'd now like to do is have an application.conf with, say, only a custom hdfs section because the rest is the same.
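The merge behavior I'm expecting from the documentation can be sketched with parseString and withFallback (the "/custom" value below is made up, just to stand in for an application.conf that overrides only the hdfs section):

```scala
import com.typesafe.config.ConfigFactory

object MergeSketch extends App {
  // Stand-in for application.conf: override only hdfs.rootDir ("/custom" is hypothetical)
  val overrides = ConfigFactory.parseString("""hdfs { rootDir: "/custom" }""")

  // Stand-in for reference.conf, with the same substitution as shown above
  val defaults = ConfigFactory.parseString(
    """
      |hdfs {
      |  rootDir: "/foo"
      |  dataDir: "hdfs://"${hdfs.rootDir}"/bar"
      |}
    """.stripMargin)

  // Overrides layered on top of defaults, then substitutions resolved
  val merged = overrides.withFallback(defaults).resolve()
  println(merged.getString("hdfs.dataDir"))  // hdfs:///custom/bar
}
```

This is what I assumed load() does for me automatically: application.conf wins per key, reference.conf supplies everything else, and substitutions resolve against the merged result.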

I run my Spark app supplying application.conf via the --files parameter, on --driver-class-path, and in --conf spark.executor.extraClassPath. This may be overkill, but it works when I create a full copy of reference.conf and change a few of the fields.

What am I missing?

来源:https://stackoverflow.com/questions/38068927/typesafe-config-in-spark
