spark sbt compile error libraryDependencies

无人及你 2021-01-25 00:18

My Spark version is spark-1.2.0-bin-hadoop2.4 and my Scala version is 2.11.7. I am getting an error, so I can't use sbt:

~/sparksample$ sbt

3 Answers
  • 2021-01-25 00:21

    As @Till Rohrmann suggested, there is no such thing as spark-core_2.11.7, and your build.sbt appears to reference that library.

    I suggest you edit the file /home/beyhan/sparksample/build.sbt and remove the reference to that library.

    The correct reference is:

    libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.0"
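
    Note that sbt's %% operator appends the Scala binary version to the artifact name for you, so an equivalent and less error-prone way to declare the same dependency is:

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"

    With scalaVersion := "2.11.7" in the build, %% resolves to the _2.11 artifact automatically.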
    

    Remember that it is not only spark-core that lacks a _2.11.7 version; the same applies to any other Spark libraries you might be using, as shown below.
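
    For illustration, assuming your build also uses spark-sql (a hypothetical example; substitute whatever Spark modules your project actually depends on), every Spark dependency needs the _2.11 suffix rather than _2.11.7:

    libraryDependencies ++= Seq(
      "org.apache.spark" % "spark-core_2.11" % "1.2.0",
      "org.apache.spark" % "spark-sql_2.11"  % "1.2.0"
    )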

  • 2021-01-25 00:24

    There exists no spark-core_2.11.7 jar file. You have to drop the maintenance version number .7 from the Spark dependencies, because the artifact that exists is spark-core_2.11. All Scala 2.11.x releases are binary compatible, so the _2.11 artifact works with Scala 2.11.7.

    Update

    A minimal sbt build file could look like this:

    name := "Simple Project"
    
    version := "1.0"
    
    scalaVersion := "2.11.7"
    
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
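
    To sanity-check the build, a minimal application along the following lines should then compile with sbt compile and run with sbt run. This is a sketch: the object name SimpleApp, the file location, and the local[*] master are illustrative, not taken from the original question.

    // src/main/scala/SimpleApp.scala
    import org.apache.spark.{SparkConf, SparkContext}

    object SimpleApp {
      def main(args: Array[String]): Unit = {
        // Run locally, using as many worker threads as there are logical cores.
        val conf = new SparkConf().setAppName("Simple Project").setMaster("local[*]")
        val sc = new SparkContext(conf)
        // Trivial job: sum the integers 1..100 to confirm Spark works end to end.
        val total = sc.parallelize(1 to 100).sum()
        println(s"Sum of 1..100 = $total")
        sc.stop()
      }
    }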
    
  • 2021-01-25 00:25

    [info] Updating {file:/home/beyhan/sparksample/}default-f390c8...
    [info] Resolving org.scala-lang#scala-library;2.11.7 ...
    [info] Resolving org.apache.spark#spark-core_2.11.7;1.2.0 ...
    [warn] module not found: org.apache.spark#spark-core_2.11.7;1.2.0
    [warn] ==== local: tried
    [warn] /home/beyhan/.ivy2/local/org.apache.spark/spark-core_2.11.7/1.2.0/ivys/ivy.xml
    [warn] ==== public: tried
    [warn]
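
    The "module not found: org.apache.spark#spark-core_2.11.7;1.2.0" warning shows that the build asks Ivy for an artifact suffixed with the full Scala version. A build.sbt line like the following (a hypothetical reconstruction, not quoted from the actual file) would produce exactly this failure:

    libraryDependencies += "org.apache.spark" % "spark-core_2.11.7" % "1.2.0"

    Changing the suffix to _2.11, as described in the other answers, fixes the resolution.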
