Apache Spark error on startup

有刺的猬 2020-12-15 10:32

I want to run a single-node cluster in Apache Spark. I installed Java and Scala, downloaded the Spark build for Apache Hadoop 2.6, and unpacked it. I'm trying to run spark-shell, but it fails with an error.

3 Answers
  • 2020-12-15 10:59

    The other solutions here did not work for me. I instead followed the steps in How to start Spark applications on Windows (aka Why Spark fails with NullPointerException)?

    and changed the HADOOP_HOME environment variable in the system variables. It worked for me.
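
    For reference, a minimal sketch of that change, assuming winutils.exe was downloaded into C:\hadoop\bin (that path is an assumption; use wherever you placed it). In a Unix-style shell such as Git Bash it would be:

    # assumption: winutils.exe sits in C:\hadoop\bin; adjust to your install
    export HADOOP_HOME=/c/hadoop
    export PATH=$PATH:$HADOOP_HOME/bin

    On Windows itself the equivalent is setting HADOOP_HOME under System Properties > Environment Variables, which is what the linked answer walks through.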

  • 2020-12-15 11:02

    It might be an ownership issue as well: make sure your user owns its HDFS home directory, for example:

    hadoop fs -chown -R deepdive:root /user/deepdive/
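
    To verify ownership before (or after) changing it, hadoop fs -ls prints the owner and group of each entry (deepdive here is just this answer's example user):

    hadoop fs -ls /user/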

  • 2020-12-15 11:23

    I've just begun to learn Spark, and I want to run it in local mode. I hit a problem like yours:

    java.net.BindException: Failed to bind to: /124.232.132.94:0: Service 'sparkDriver' failed after 16 retries!

    Since I only wanted to run Spark in local mode, the following solved it for me: edit spark-env.sh (you can find it in $SPARK_HOME/conf/) and add these lines:

    export SPARK_MASTER_IP=127.0.0.1
    export SPARK_LOCAL_IP=127.0.0.1
    

    After that, Spark ran fine in local mode. I hope this helps! :)
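
    As a quick sanity check after editing spark-env.sh, you can also request local mode explicitly on the command line; the --master flag is standard, and local[*] uses all available cores:

    # run from $SPARK_HOME; quoting prevents the shell from glob-expanding [*]
    ./bin/spark-shell --master "local[*]"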
