Mac spark-shell Error initializing SparkContext

礼貌的吻别 · 2020-12-07 09:26

I tried to start Spark 1.6.0 (spark-1.6.0-bin-hadoop2.4) on Mac OS Yosemite 10.10.5 using

    ./bin/spark-shell

It fails with the error below.

12 Answers
  • 2020-12-07 09:44

    Sometimes a firewall prevents Spark from creating and binding a socket. Make sure your firewall is disabled, check that your machine's IP entry in /etc/hosts is correct, and then try again:

    sudo ufw disable
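    Note that ufw is Ubuntu's firewall front end; on macOS the built-in application firewall is controlled from System Preferences > Security & Privacy instead. As a quick sanity check (a minimal sketch, not part of Spark's API), you can try to bind a socket on the address your hostname resolves to from the Scala REPL, which is roughly what the driver attempts when this error appears:

        // Hypothetical check: bind a throwaway socket on the locally resolved address.
        import java.net.{InetAddress, ServerSocket}
        val addr = InetAddress.getLocalHost            // the address your hostname resolves to
        val socket = new ServerSocket(0, 1, addr)      // port 0 = pick any free port
        println(s"Bound ${addr.getHostAddress}:${socket.getLocalPort}")
        socket.close()

    If the bind fails here, the firewall or the /etc/hosts entry is the likely culprit.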
    
  • 2020-12-07 09:46

    If you are running the code from an IDE in Scala, face the same issue, and are using SparkSession() rather than SparkConf() as pointed out above, note that set() only works on SparkConf(). With SparkSession() you should use .config() to set the Spark configuration and bind the localhost address, as shown below:

        import org.apache.spark.sql.SparkSession

        val spark = SparkSession
          .builder()
          .appName("CSE512-Phase1")
          .master("local[*]")
          .config("spark.driver.bindAddress", "localhost")  // bind the driver to localhost
          .getOrCreate()
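
    If the session comes up without the SparkContext error, a tiny smoke test (any small action will do; this one is just illustrative) confirms the bind address took effect:

        // Run a trivial job to verify the driver started correctly.
        spark.range(10).count()   // should return 10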
    
  • The following steps might help:

    1. Get your hostname with the hostname command.

    2. If it is not already present, add an entry for your hostname to the /etc/hosts file as follows:

      127.0.0.1      your_hostname
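
    To see which address your hostname currently resolves to (a minimal sketch you can run from the Scala or spark-shell REPL, not Spark-specific), you can ask the JVM directly:

        // Print the hostname and the address it resolves to; this is roughly
        // what Spark falls back to when SPARK_LOCAL_IP is not set.
        import java.net.InetAddress
        println(InetAddress.getLocalHost.getHostName)
        println(InetAddress.getLocalHost.getHostAddress)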
      

    Hope this helps!!

  • 2020-12-07 09:49

    If you don't want to change the hostname of your Mac, you can do the following:

    1. Find the template file spark-env.sh.template on your machine (it is probably in /usr/local/Cellar/apache-spark/2.1.0/libexec/conf/).
    2. cp spark-env.sh.template spark-env.sh
    3. Add export SPARK_LOCAL_IP=127.0.0.1 under the comment for local IP.

    Start spark-shell and enjoy it.
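
    Once spark-shell is up, you can confirm which address the driver actually bound to (a quick check using the predefined sc; treat it as a sketch):

        // In spark-shell, `sc` is the predefined SparkContext.
        println(sc.getConf.get("spark.driver.host", "<not set>"))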

  • 2020-12-07 09:50

    There are two problems, I think.

    1. Your Spark local IP was not correct and needs to be changed to 127.0.0.1.
    2. You didn't define sqlContext properly.

    For 1, I tried:

    • 1) exporting SPARK_LOCAL_IP="127.0.0.1" in ~/.bash_profile
    • 2) adding export SPARK_LOCAL_IP="127.0.0.1" in load-spark-env.sh under $SPARK_HOME

    But neither worked. Then I tried the following and it worked:

    val conf = new SparkConf().
        setAppName("SparkExample").
        setMaster("local[*]").
        set("spark.driver.bindAddress","127.0.0.1")
    val sc = new SparkContext(conf)
    

    For 2, you can try:

    val sqlContext = SparkSession.builder.config("spark.master","local[*]").getOrCreate()
    

    and then import sqlContext.implicits._

    The SparkSession builder will automatically use the SparkContext if one already exists; otherwise it will create one. You can also create both explicitly if necessary.
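
    A minimal sketch of that behaviour (assuming a local[*] setup like the one above; the final println is just for illustration):

        import org.apache.spark.{SparkConf, SparkContext}
        import org.apache.spark.sql.SparkSession

        val conf = new SparkConf()
          .setAppName("SparkExample")
          .setMaster("local[*]")
          .set("spark.driver.bindAddress", "127.0.0.1")
        val sc = new SparkContext(conf)

        // getOrCreate() picks up the SparkContext that already exists
        // instead of creating a second one.
        val spark = SparkSession.builder.config(conf).getOrCreate()
        println(spark.sparkContext eq sc)   // true: the existing context is reused
        import spark.implicits._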

  • 2020-12-07 09:51

    This happens when you switch between different networks (VPNs for PROD, CI, etc., depending on your company's networks for accessing different environments).

    I had the same issue whenever I switched the VPN.

    Update /etc/hosts (with sudo) with the hostname value of your Mac.
