How to specify which java version to use in spark-submit command?

Asked by 走了就别回头了 on 2021-02-13 11:27

I want to run a Spark Streaming application on a YARN cluster on a remote server. The default Java version is 1.7, but I want to use 1.8 for my application, which is also there in

5 answers
  • 2021-02-13 11:56

Add the JAVA_HOME that you want in spark-env.sh (locate the file with sudo find / -name spark-env.sh ... e.g.: /etc/spark2/conf.cloudera.spark2_on_yarn/spark-env.sh)
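A minimal sketch of that edit, assuming a JDK 8 installed at /usr/java/jdk1.8.0_162 (substitute whatever path actually exists on your nodes):

```shell
# In spark-env.sh (sourced by Spark's launch scripts on each node):
# point Spark at the desired JDK instead of the system default.
export JAVA_HOME=/usr/java/jdk1.8.0_162
export PATH="$JAVA_HOME/bin:$PATH"
```

Since each Spark instance reads its own copy of spark-env.sh, the file must be edited on every node that launches Spark processes.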

  • 2021-02-13 11:59

Although you can force the driver code to run on a particular Java version (export JAVA_HOME=/path/to/jre/ && spark-submit ...), the workers will execute the code with the default Java version from the yarn user's PATH on the worker machines.

    What you can do is set each Spark instance to use a particular JAVA_HOME by editing the spark-env.sh files (documentation).

  • 2021-02-13 12:02

The Java version needs to be set for both the Spark Application Master and the Spark executors, which will be launched on YARN. Thus the spark-submit command must include two JAVA_HOME settings: spark.executorEnv.JAVA_HOME and spark.yarn.appMasterEnv.JAVA_HOME.

spark-submit \
  --class com.example.DataFrameExample \
  --conf "spark.executorEnv.JAVA_HOME=/jdk/jdk1.8.0_162" \
  --conf "spark.yarn.appMasterEnv.JAVA_HOME=/jdk/jdk1.8.0_162" \
  --master yarn \
  --deploy-mode client \
  /spark/programs/DataFrameExample/target/scala-2.12/dfexample_2.12-1.0.jar
    
  • 2021-02-13 12:03

If you want to set the Java environment for Spark on YARN, you can set it when invoking spark-submit:

    --conf spark.yarn.appMasterEnv.JAVA_HOME=/usr/java/jdk1.8.0_121 \
    
  • 2021-02-13 12:10

JAVA_HOME alone was not enough in our case: the driver was running on Java 8, but I later discovered that the Spark workers in YARN were launched with Java 7 (the Hadoop nodes have both Java versions installed).

I had to add spark.executorEnv.JAVA_HOME=/usr/java/<version available in workers> to spark-defaults.conf. Note that you can also provide it on the command line with --conf.
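The resulting spark-defaults.conf entries might look like this; the JDK path is an example and must match a JDK actually installed on the worker nodes:

```
# spark-defaults.conf
# Make YARN launch both the Application Master and the executors with this JDK.
spark.yarn.appMasterEnv.JAVA_HOME  /usr/java/jdk1.8.0_162
spark.executorEnv.JAVA_HOME        /usr/java/jdk1.8.0_162
```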

    See http://spark.apache.org/docs/latest/configuration.html#runtime-environment
