ClassNotFoundException scala.runtime.LambdaDeserialize when spark-submit

南笙 2020-12-19 00:07

I followed the Scala tutorial at https://spark.apache.org/docs/2.1.0/quick-start.html

My Scala file:

/* SimpleApp.scala */
import org.apache.spark.Spark         
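
The snippet above appears to be cut off. For reference, the self-contained application in the 2.1.0 quick-start guide looks roughly like the sketch below; the log-file path is a placeholder, and the exact code may differ from what the original post contained:

/* SimpleApp.scala -- reconstructed along the lines of the quick-start guide */
import org.apache.spark.sql.SparkSession

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // Placeholder: point this at any text file available on your system.
    val logFile = "YOUR_SPARK_HOME/README.md"
    val spark = SparkSession.builder.appName("Simple Application").getOrCreate()
    val logData = spark.read.textFile(logFile).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println(s"Lines with a: $numAs, Lines with b: $numBs")
    spark.stop()
  }
}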


        
4 Answers
  • 2020-12-19 00:50

    As @Alexey said, changing the Scala version to 2.11 fixed the problem.

    build.sbt

    name := "Simple Project"
    
    version := "1.0"
    
    scalaVersion := "2.11.11"
    
    libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0" 
    

    Note that the Scala version MUST match the Scala version Spark was built with. Look at the artifactId: spark-core_2.11 means it is compatible with Scala 2.11 (there is no backward or forward compatibility).
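
    A related tip, not from the original answer: instead of hard-coding the _2.11 suffix, sbt can derive it from scalaVersion via the %% operator, so the artifact suffix cannot drift out of sync with the compiler version. A minimal build.sbt sketch:

    // %% appends the project's Scala binary version to the artifact name,
    // so with scalaVersion 2.11.11 this resolves spark-core_2.11.
    name := "Simple Project"
    version := "1.0"
    scalaVersion := "2.11.11"
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"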

  • 2020-12-19 00:52

    Following are the build.sbt entries for the latest Spark 2.4.1 release sample shown in the Spark/Scala online guide:

    name := "SimpleApp" 
    version := "1.0"
    scalaVersion := "2.12.8"
    libraryDependencies += "org.apache.spark"  %% "spark-sql" % "2.4.1"
    

    Though everything works fine inside the IntelliJ IDE, the application still fails with the following exception,

    Caused by: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
    

    after creating the package with the 'sbt package' command and running spark-submit from the command line as follows:

    spark-submit -v --class SimpleApp --master local[*] target\scala-2.12\simpleapp_2.12-1.0.jar
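
    For context (my addition, not part of this answer): under Scala 2.12 the closures in a Spark job are compiled to Java 8 lambdas, and the generated classes reference scala.runtime.LambdaDeserialize, which exists only in the Scala 2.12 library. If the Spark installation behind spark-submit ships a Scala 2.11 runtime, that reference cannot be resolved, which is exactly the error above. A minimal sketch of code that carries such a lambda:

    import org.apache.spark.sql.SparkSession

    object LambdaRepro {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder.appName("LambdaRepro").getOrCreate()
        import spark.implicits._
        // Compiled by Scala 2.12, this closure becomes a Java 8 lambda whose
        // deserialization bootstrap references scala.runtime.LambdaDeserialize.
        val evens = spark.range(100).as[Long].filter(n => n % 2 == 0).count()
        println(s"even numbers: $evens")
        spark.stop()
      }
    }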
    
  • 2020-12-19 00:55

    There is definitely some confusion regarding the Scala version to be used for Spark 2.4.3. As of today (Nov 25, 2019) the doc home page for Spark 2.4.3 states:

    Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.4.3 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x).

    Note that support for Java 7, Python 2.6 and old Hadoop versions before 2.6.5 were removed as of Spark 2.2.0. Support for Scala 2.10 was removed as of 2.3.0. Support for Scala 2.11 is deprecated as of Spark 2.4.1 and will be removed in Spark 3.0.

    Accordingly, the Scala version is supposed to be 2.12.
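
    One caveat worth adding (not part of the original answer): the quoted sentence is about the Spark 2.4.3 API, but as far as I know the default pre-built 2.4.x downloads are still compiled against Scala 2.11, with a separate Scala 2.12 packaging offered. A 2.12 build like the sketch below therefore only avoids the LambdaDeserialize error when the installation that runs spark-submit is itself a Scala 2.12 build; the version numbers are illustrative:

    // build.sbt sketch -- assumes the Spark 2.4.3 installation behind
    // spark-submit was itself built against Scala 2.12.
    name := "Simple Project"
    version := "1.0"
    scalaVersion := "2.12.8"
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.3"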

  • 2020-12-19 01:01

    I had a similar issue while following the instructions provided at https://spark.apache.org/docs/2.4.3/quick-start.html

    My setup details: Spark version 2.4.3, Scala version 2.12.8.

    However, when I changed my sbt file to the configuration below, everything worked fine (both compilation and running the application jar).

    name := "Simple Project"

    version := "1.0"

    scalaVersion := "2.11.11"

    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.3"

    It looks like Spark 2.4.3 is compatible with Scala 2.11.11 only. While compiling the sample project, sbt downloaded the Scala 2.11 library from "https://repo1.maven.org/maven2/org/scala-lang/scala-library/2.11.11".
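
    If it helps, here is my own sketch (not part of the answer) of a quick way to check which Scala runtime the installed Spark actually provides; spark-submit --version also prints the Scala version the distribution was built with:

    import scala.util.Properties

    // Prints the Scala library version visible to the driver, so it can be
    // compared with the version the application jar was compiled against.
    object VersionCheck {
      def main(args: Array[String]): Unit = {
        println(s"Scala runtime on the driver: ${Properties.versionString}")
      }
    }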
