Spark fails to start in local mode when disconnected [Possible bug in handling IPv6 in Spark??]

轻奢々 2021-02-05 16:53

The problem is the same as described here: Error when starting spark-shell local on Mac

... but I have failed to find a solution. I also used to get the malformed URI error.

6 Answers
  • 2021-02-05 17:18

    If you are using pyspark, use the config method to set the driver host to localhost.

    from pyspark.sql import SparkSession

    spark = (SparkSession
             .builder
             .appName("temp1")
             .config("spark.driver.host", "localhost")
             .getOrCreate())
    
  • 2021-02-05 17:21

    I am not sure if this will help you, but it solved my problem on Mac.

    1) Get your hostname. (In a terminal prompt it is usually shown after the @ on Linux and before the : on Mac.) On either system you can also just run hostname in a terminal to print it.

    2) In /etc/hosts add:

    127.0.0.1 whatever-your-hostname-is

    For me, I originally had

    127.0.0.1 localhost

    but I changed it to

    127.0.0.1 my-hostname

    Save this change and retry pyspark.

    I got this solution from this Stack Overflow question: Mac spark-shell Error initializing SparkContext

    I hope this helps you.
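    A quick way to verify the new mapping (a small sketch, not from the original answer) is to check that the JVM can resolve the machine's hostname, which is the lookup the Spark driver performs at startup:

    import java.net.InetAddress

    object HostnameCheck {
      def main(args: Array[String]): Unit = {
        // Throws UnknownHostException if the machine's hostname cannot be
        // resolved -- the same failure that stops the Spark driver.
        val local = InetAddress.getLocalHost
        println(s"${local.getHostName} -> ${local.getHostAddress}")
      }
    }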

  • 2021-02-05 17:23

    I faced the same issue while using SharedSparkContext with my tests. Adding the following (in my beforeAll method), as @dennis suggested, solved the problem for me:

      override def beforeAll(): Unit = {
        super.beforeAll()
        sc.getConf.setMaster("local").set("spark.driver.host", "localhost")
      }
    

    I hope this will be solved in the next versions of Spark.
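    For reference, here is a minimal self-contained sketch of the same fix with plain ScalaTest instead of SharedSparkContext (class and test names are illustrative); the point is that spark.driver.host is set before the context is created:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.scalatest.{BeforeAndAfterAll, FunSuite}

    class LocalSparkSuite extends FunSuite with BeforeAndAfterAll {

      @transient private var sc: SparkContext = _

      override def beforeAll(): Unit = {
        super.beforeAll()
        val conf = new SparkConf()
          .setAppName("LocalSparkSuite")
          .setMaster("local")
          .set("spark.driver.host", "localhost") // avoid the failing hostname lookup
        sc = new SparkContext(conf)
      }

      override def afterAll(): Unit = {
        if (sc != null) sc.stop()
        super.afterAll()
      }

      test("driver host is forced to localhost") {
        assert(sc.getConf.get("spark.driver.host") == "localhost")
      }
    }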

  • 2021-02-05 17:25

    To those who are working with Spark through sbt and having the same issue: just add .set("spark.driver.host", "localhost") to your SparkConf(), so the initialisation of the Spark context looks like this:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf =
      new SparkConf()
        .setAppName("temp1")
        .setMaster("local")
        .set("spark.driver.host", "localhost")

    val sc = SparkContext.getOrCreate(conf)
    

    This initial configuration must be done before any other call to SparkContext.getOrCreate.
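    If you are on Spark 2.x and use a SparkSession rather than a bare SparkContext, the same setting can be passed through the builder (a sketch reusing the app name above):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("temp1")
      .master("local")
      .config("spark.driver.host", "localhost")
      .getOrCreate()

    // The underlying SparkContext picks up the same configuration.
    val sc = spark.sparkContext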

  • 2021-02-05 17:34

    The first thing to check is probably /etc/hosts. Make sure that you have the following entry:

    127.0.0.1      localhost
    

    If the above does not work, then the following should do the trick:

     sudo hostname -s 127.0.0.1
    
  • 2021-02-05 17:37

    OK, I seem to be able to get around it by passing the configuration directly: --conf spark.driver.host=localhost

    So I run:

    ./bin/spark-shell --conf spark.driver.host=localhost
    

    Still, if there is a better solution, please let me know.
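    Once the shell is up, you can confirm the flag was picked up from the REPL (sc is the SparkContext that spark-shell creates):

    // Should print "localhost" if the --conf flag was applied.
    println(sc.getConf.get("spark.driver.host"))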


    [UPDATE]

    Jacek Laskowski confirmed this is probably the only available solution for now.
