When running with master 'yarn' either HADOOP_CONF_DIR or YARN_CONF_DIR must be set in the environment

Asked by 小鲜肉 · 2021-01-12 14:37

I am trying to run Spark using yarn and I am running into this error:

    Exception in thread "main" java.lang.Exception: When running with master 'yarn' either
    HADOOP_CONF_DIR or YARN_CONF_DIR must be set in the environment.

2 Answers
  • 2021-01-12 15:17

    Just an update to the answer by Shubhangi:

     cd $SPARK_HOME/bin
     sudo nano load-spark-env.sh
    

    Add the lines below, then save and exit:

     export SPARK_LOCAL_IP="127.0.0.1"
     export HADOOP_CONF_DIR="$HADOOP_HOME/etc/hadoop"
     export YARN_CONF_DIR="$HADOOP_HOME/etc/hadoop"
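    A quick sanity check can confirm that the directory these variables point at really contains the YARN client configuration Spark looks for. This is only a sketch: `check_yarn_conf` is a hypothetical helper, not part of Spark or Hadoop.

    ```shell
    # Hypothetical helper: succeeds only if DIR exists and holds yarn-site.xml,
    # the file Spark's YARN client reads to locate the ResourceManager.
    check_yarn_conf() {
      [ -d "$1" ] && [ -f "$1/yarn-site.xml" ]
    }

    if check_yarn_conf "$HADOOP_HOME/etc/hadoop"; then
      echo "YARN config looks OK"
    else
      echo "yarn-site.xml not found under $HADOOP_HOME/etc/hadoop" >&2
    fi
    ```

    Running this before launching `spark-shell --master yarn` catches a mistyped `HADOOP_HOME` early, instead of at job-submission time.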

  • 2021-01-12 15:31

    When running Spark on YARN, you need to add the following line to spark-env.sh:

    export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
    

    Note: check that $HADOOP_HOME/etc/hadoop is the correct path in your environment, and that spark-env.sh also exports HADOOP_HOME.
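    The export above can be made defensive, so that an unset or mistyped HADOOP_HOME fails loudly instead of silently exporting a bad path. A minimal sketch (not official Spark code) for spark-env.sh:

    ```shell
    # Defensive variant: only export HADOOP_CONF_DIR when HADOOP_HOME is set
    # and the derived configuration directory actually exists on disk.
    if [ -n "${HADOOP_HOME:-}" ] && [ -d "$HADOOP_HOME/etc/hadoop" ]; then
      export HADOOP_CONF_DIR="$HADOOP_HOME/etc/hadoop"
    else
      echo "spark-env.sh: HADOOP_HOME unset or \$HADOOP_HOME/etc/hadoop missing" >&2
    fi
    ```

    With this guard, a broken environment produces a clear message at shell startup rather than the generic "either HADOOP_CONF_DIR or YARN_CONF_DIR must be set" exception at submit time.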
