Why do spark-submit and spark-shell fail with “Failed to find Spark assembly JAR. You need to build Spark before running this program.”?

一个人的身影 2020-12-17 07:33

I was trying to run spark-submit and I got "Failed to find Spark assembly JAR. You need to build Spark before running this program." I get the same error when I try to run spark-shell.

9 answers
  • 2020-12-17 08:12

    In my case, I installed Spark with pip3 install pyspark on macOS, and the error was caused by an incorrect SPARK_HOME variable. It works when I run a command like the one below:

    PYSPARK_PYTHON=python3 SPARK_HOME=/usr/local/lib/python3.7/site-packages/pyspark python3 wordcount.py a.txt
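
    If you are not sure where pip placed the package, one way to locate it and point SPARK_HOME at it (a sketch; the printed path will differ from machine to machine) is:

    # print the install location of the pip-installed pyspark package
    python3 -c "import pyspark; print(pyspark.__path__[0])"
    # point SPARK_HOME at that directory for the current shell
    export SPARK_HOME="$(python3 -c 'import pyspark; print(pyspark.__path__[0])')"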
    
  • 2020-12-17 08:13

    Just to add to @jurban1997's answer.

    If you are running Windows, make sure that the SPARK_HOME and SCALA_HOME environment variables are set up correctly. SPARK_HOME should point to the unzipped Spark folder, so that {SPARK_HOME}\bin\spark-shell.cmd exists.
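
    For example (a sketch only; C:\spark and C:\scala are placeholder paths, adjust them to wherever you unzipped Spark and Scala):

    REM for the current prompt session; use setx or the System Properties dialog to make them permanent
    set SPARK_HOME=C:\spark
    set SCALA_HOME=C:\scala
    REM quick check that the expected launcher script exists
    dir "%SPARK_HOME%\bin\spark-shell.cmd"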

  • 2020-12-17 08:20

    Try running mvn -DskipTests clean package first to build Spark.
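
    This applies when you are working from a source checkout (e.g. cloned from GitHub) rather than a pre-built binary. A typical build from the root of the source tree (a sketch; the bundled build/mvn script downloads a suitable Maven if you don't have one) looks like:

    # build Spark from source, skipping tests to save time
    ./build/mvn -DskipTests clean package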

  • 2020-12-17 08:24

    Spark installation for a Windows machine:

    Download spark-2.1.1-bin-hadoop2.7.tgz from https://spark.apache.org/downloads.html

    Unzip it, place the Spark folder on the C:\ drive, and set the environment variable.

    If you don't have Hadoop, create a Hadoop folder with a bin folder inside it, download winutils.exe from https://codeload.github.com/gvreddy1210/64bit/zip/master, copy it into Hadoop\bin, and set the environment variable for C:\hadoop\bin.

    Create a tmp\hive folder on the C:\ drive and give it full permissions:

    C:\Windows\system32>C:\hadoop\bin\winutils.exe chmod 777 /tmp/hive

    Open a command prompt, first run winutils.exe from C:\hadoop\bin, then navigate to C:\spark\bin and run spark-shell. A consolidated sketch of the environment-variable commands is shown below.
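
    Putting the steps above together, the environment setup might look like this (a sketch only, assuming Spark is unzipped to C:\spark and winutils.exe sits in C:\hadoop\bin; adjust the paths to your layout):

    REM user-level variables for future sessions (or set them via the System Properties dialog)
    setx SPARK_HOME "C:\spark"
    setx HADOOP_HOME "C:\hadoop"
    REM for the current prompt, extend PATH so spark-shell and winutils.exe are found
    set PATH=%PATH%;C:\spark\bin;C:\hadoop\bin
    REM give the Hive scratch directory full permissions
    C:\hadoop\bin\winutils.exe chmod 777 /tmp/hive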
    
    
  • 2020-12-17 08:30

    On Windows, I found that if Spark is installed in a directory with a space in the path (e.g. C:\Program Files\Spark), the installation will fail. Move it to the root or to another directory with no spaces.

  • 2020-12-17 08:31

    Your Spark package doesn't include compiled Spark code. That's why you got this error message from the spark-submit and spark-shell scripts.

    You have to download one of the pre-built versions from the "Choose a package type" section on the Spark download page.
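
    For example, with a pre-built package such as spark-2.4.5-bin-hadoop2.7.tgz (the version here is only an illustration), unpacking and a quick check on Linux/macOS might look like:

    # unpack the pre-built distribution and try the shell from its bin directory
    tar -xzf spark-2.4.5-bin-hadoop2.7.tgz
    cd spark-2.4.5-bin-hadoop2.7
    ./bin/spark-shell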
