Using typesafe config with Spark on Yarn

[愿得一人] 2021-02-06 11:20

I have a Spark job that reads data from a configuration file. This file is a typesafe config file.

The code that reads the config looks like this:

Config         
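
The snippet above is truncated; as a rough sketch, reading a Typesafe config inside a Spark job might look something like the following (the explicit ConfigFactory.load() call and the key names are assumptions for illustration, not taken from the original post):

    import com.typesafe.config.{Config, ConfigFactory}
    import org.apache.spark.{SparkConf, SparkContext}

    object MyJob {
      def main(args: Array[String]): Unit = {
        // ConfigFactory.load() resolves application.conf from the classpath;
        // a file shipped with --files is localized into the YARN container's
        // working directory, so it can be found in cluster mode.
        val config: Config = ConfigFactory.load()

        // Placeholder keys, for illustration only.
        val appName   = config.getString("job.app.name")
        val inputPath = config.getString("job.input.path")

        val sc = new SparkContext(new SparkConf().setAppName(appName))
        println(s"Lines in input: ${sc.textFile(inputPath).count()}")
        sc.stop()
      }
    }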


        
2 Answers
  •  别那么骄傲
    2021-02-06 11:29

    So with a little digging in the Spark 1.6.1 source code I found the solution.

    These are the steps you need to take to get both log4j and the application.conf picked up by your application when submitting to YARN in cluster mode:

    • When passing several files, as I was doing with both the application.conf and the log4j.xml file, submit them with a single --files argument and separate the paths with a comma: --files "$ROOT_DIR/application.conf,$LOG4J_FULL_PATH/log4j.xml"
    • That's it for the application.conf; there is no need for extraJavaOpts for it (as I had written in my question). The issue was that Spark only uses the last --files argument it is given, which is why only log4j.xml was being shipped. To get log4j.xml used as well, I also had to take the following step:
    • Add another line to the spark-submit like this: --conf spark.driver.extraJavaOptions="-Dlog4j.configuration=file:log4j.xml". Notice that once you pass a file with --files you can refer to it by its file name alone, without any path (a full invocation is sketched below).
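
    Putting those pieces together, a cluster-mode spark-submit would look roughly like this (the master/deploy-mode flags, main class, and jar name are placeholders; only the --files and --conf lines are the ones described above):

        spark-submit \
          --master yarn \
          --deploy-mode cluster \
          --files "$ROOT_DIR/application.conf,$LOG4J_FULL_PATH/log4j.xml" \
          --conf spark.driver.extraJavaOptions="-Dlog4j.configuration=file:log4j.xml" \
          --class com.example.MyJob \
          my-job-assembly.jar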

    Note: I haven't tried it, but from what I saw, if you're running in client mode the spark.driver.extraJavaOptions setting should probably be replaced by something like --driver-java-options. That's it. It's so simple, and I wish these things were documented better. I hope this answer helps someone.
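
    If that's right, the client-mode variant would just swap that one flag, roughly like this (untested; since the driver runs locally in client mode, the log4j.xml reference may need to be the full local path):

        spark-submit \
          --master yarn \
          --deploy-mode client \
          --files "$ROOT_DIR/application.conf,$LOG4J_FULL_PATH/log4j.xml" \
          --driver-java-options "-Dlog4j.configuration=file:$LOG4J_FULL_PATH/log4j.xml" \
          --class com.example.MyJob \
          my-job-assembly.jar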

    Cheers
