Question
I've defined a default configuration in my Spark application, tucked away in src/main/resources/reference.conf. I use ConfigFactory.load() to obtain the configuration.
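The loading code is essentially the following (a minimal sketch; AppConfig and the value names are just for illustration, but the keys match the reference.conf shown below):

import com.typesafe.config.{Config, ConfigFactory}

object AppConfig {
  // load() is documented to merge application.conf (when present on the
  // classpath) on top of the defaults declared in reference.conf
  val config: Config = ConfigFactory.load()

  val dataDir: String  = config.getString("hdfs.dataDir")
  val dbDriver: String = config.getString("db.driver")
}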
When I run the application with spark-submit it picks up these defaults. However, when I only want to override a few of the settings available in reference.conf and provide an application.conf, it does not seem to pick up these overrides. From the documentation I thought that application.conf is merged with reference.conf when load() is called, so that it's not necessary to re-define everything in application.conf.
My reference.conf looks like this:
hdfs {
  rootDir: "/foo"
  dataDir: "hdfs://"${hdfs.rootDir}"/bar"
}

db {
  driver: "com.mysql.jdbc.Driver"
  ...
}
...
What I'd now like to do is have an application.conf with, say, only a custom hdfs section, because the rest is the same.
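Something like this (a hypothetical application.conf; the override value is made up):

hdfs {
  rootDir: "/custom"
}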
I run my Spark app by supplying application.conf via the --files parameter, via --driver-class-path, and via --conf spark.executor.extraClassPath. This may be overkill, but it works when I create a full copy of reference.conf and change a few of the fields.
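For reference, the invocation looks roughly like this (a sketch; the main class and jar name are placeholders):

spark-submit \
  --class com.example.MyApp \
  --files application.conf \
  --driver-class-path application.conf \
  --conf spark.executor.extraClassPath=application.conf \
  my-app.jar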
What am I missing?
Source: https://stackoverflow.com/questions/38068927/typesafe-config-in-spark