UNRESOLVED DEPENDENCIES error while trying to create jar

Backend · Unresolved · 5 answers · 1277 views
萌比男神i · 2021-02-04 02:14

I'm trying to build a Scala jar file to run in Spark.
I'm following this tutorial.
When I try to build the jar file using sbt as described there, I get the following error:

5 Answers
  • 2021-02-04 02:35

    How can you change the current dependencies? I mean, when you run `sbt package` for a build file like:

    name := "Simple Project"
    
    version := "1.0"
    
    scalaVersion := "2.10.4"
    
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
    

    sbt will start resolving and downloading all kinds of dependencies. But if you see that it is failing on a dependency that is no longer in the Maven repo, what can you do? Where can you change the dependencies it tries to resolve?

    @OP: The problem is that your sbt is outdated. If you installed it using apt, you can use apt to remove it as well. In any case, download the latest .tgz (not the .deb), simply unpack it, and then add the sbt/bin/ folder to your PATH in .bashrc. I noticed that older sbt releases (the .deb and apt-get versions) work with older Scala versions. You either need to manually add or change the dependencies that the older sbt is trying to find, or simply switch to the latest (not so old) sbt.
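    The manual install described above boils down to unpacking the tarball and putting its bin directory on your PATH. A sketch, assuming the archive unpacks to ~/sbt (the filename and unpack location are assumptions; adjust them to whatever you actually downloaded):

```shell
# Sketch of the manual install steps above; paths are assumptions.
tar -xzf sbt-*.tgz -C ~                                  # unpack the downloaded tarball into the home directory
echo 'export PATH="$HOME/sbt/bin:$PATH"' >> ~/.bashrc    # make sbt available in new shells
```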

  • 2021-02-04 02:43

    I had the same issue. It looks like there are bugs in certain combinations of versions.

    For me, the following build.sbt worked fine:

    name := "My Project"  
    version := "1.0"  
    scalaVersion := "2.11.8"  
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.2"  
    

    Hope it helps

  • 2021-02-04 02:44

    You have your dependency defined as

    "org.apache.spark" %% "spark-core" % "1.0.2"
    

    The %% instructs sbt to append the current Scala version to the artifact name. Apparently, Spark was built for the whole Scala 2.10 family, without specific jars for 2.10.1, 2.10.2, ...

    So all you have to do is redefine it as:

    "org.apache.spark" % "spark-core_2.10" % "1.0.2"
    
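    As a sanity check, the name expansion %% performs can be mimicked in plain Scala. This is only a sketch of modern sbt's behavior, not sbt's actual API; CrossVersionDemo and its methods are invented for illustration:

```scala
// Illustrative sketch of what %% does to an artifact name.
// Not sbt's real API: these names are made up for the example.
object CrossVersionDemo {
  // The Scala binary version is the first two segments: "2.10.4" -> "2.10"
  def binaryVersion(scalaVersion: String): String =
    scalaVersion.split('.').take(2).mkString(".")

  // "org.apache.spark" %% "spark-core" under scalaVersion 2.10.4
  // resolves the artifact "spark-core_2.10"; plain % uses the name verbatim.
  def crossArtifact(name: String, scalaVersion: String): String =
    s"${name}_${binaryVersion(scalaVersion)}"
}
```

    Writing "spark-core_2.10" with a single % pins that suffix by hand, which is why it sidesteps the broken expansion in older sbt versions.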
  • 2021-02-04 02:45

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.1.0",
      "org.eclipse.jetty.orbit" % "javax.servlet" % "3.0.0.v201112011016",
      "org.eclipse.jetty.orbit" % "javax.transaction" % "1.1.1.v201105210645",
      "org.eclipse.jetty.orbit" % "javax.mail.glassfish" % "1.4.1.v201005082020"
    )

  • 2021-02-04 02:48

    spark-core_2.10.4;1.0.2 means that it is built on top of Scala version 2.10, so you have to specify scalaVersion := "2.10.4" in your build file. Please check your .sbt file and change it accordingly.
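    Putting the settings together, a build.sbt in which the scalaVersion and the artifact's Scala suffix agree might look like this. This is a sketch using the version numbers from the question, not a definitive build file:

```scala
// Sketch: keep scalaVersion and the artifact's _2.10 suffix in agreement.
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"   // a 2.10.x release, matching the _2.10 artifact below

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.0.2"
```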
