On Amazon EMR 4.0.0, setting /etc/spark/conf/spark-env.conf is ineffective

梦如初夏 asked 2021-01-23 03:56

I'm launching my Spark-based hiveserver2 on Amazon EMR, which has an extra classpath dependency, and I'm running into this bug in Amazon EMR:

https://petz2000.wordpress.com/2015/08/1

2 Answers
  •  栀梦
     栀梦 (OP)
     2021-01-23 04:07

    Have you tried setting spark.driver.extraClassPath in spark-defaults? Something like this:

    [
      {
        "Classification": "spark-defaults",
        "Properties": {
          "spark.driver.extraClassPath": "${SPARK_CLASSPATH}:${HADOOP_HOME}/*:${HADOOP_HOME}/../hadoop-hdfs/*:${HADOOP_HOME}/../hadoop-mapreduce/*:${HADOOP_HOME}/../hadoop-yarn/*:/home/hadoop/git/datapassport/*"
        }
      }
    ]
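    For context, a classification like the one above is supplied when the cluster is created. A minimal sketch using the AWS CLI (the cluster name, instance settings, and JSON filename here are placeholders, not from the original question):

    ```shell
    # Save the spark-defaults classification above as spark-classpath.json,
    # then pass it via --configurations at cluster creation time.
    # Name, instance type/count, and file path are illustrative placeholders.
    aws emr create-cluster \
      --name "spark-hiveserver2" \
      --release-label emr-4.0.0 \
      --applications Name=Spark Name=Hive \
      --instance-type m3.xlarge \
      --instance-count 3 \
      --configurations file://./spark-classpath.json \
      --use-default-roles
    ```

    Note that configurations applied this way take effect only for clusters launched with them; they do not retroactively change a running cluster.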
    
