IntelliJ IDEA 14: cannot resolve symbol spark

花落未央 2021-02-09 13:55

I added a dependency on Spark, which worked in my first project. But when I try to make a new project with Spark, sbt does not import the external jars of org.apache.spark. The

6 Answers
  • 2021-02-09 14:27

    This worked for me:

    name := "ProjectName"
    version := "0.1"
    scalaVersion := "2.11.11"
    
    libraryDependencies ++= Seq(
      "org.apache.spark" % "spark-core_2.11" % "2.2.0",
      "org.apache.spark" % "spark-sql_2.11" % "2.2.0",
      "org.apache.spark" % "spark-mllib_2.10" % "1.1.0"
    )
    
  • 2021-02-09 14:28

    I had a similar problem. It seems the reason was that the build.sbt file specified the wrong version of Scala.

    If you run spark-shell, it prints the Scala version Spark was built with at startup, e.g.

    Using Scala version 2.11.8
    

    Then I edited the scalaVersion line in the build.sbt file to point to that version and it worked, as sketched below.
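
    For example, if spark-shell reports Scala 2.11.8, the matching line in build.sbt would be (a minimal sketch; use whatever version your installation reports):

    // match the Scala version reported by spark-shell
    scalaVersion := "2.11.8"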

  • 2021-02-09 14:35

    name := "SparkLearning"

    version := "0.1"

    scalaVersion := "2.12.3"

    // additional libraries libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"

  • 2021-02-09 14:35

    I use

    scalaVersion := "2.11.7"
    
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"
    

    in my build.sbt and it works for me.

  • 2021-02-09 14:39

    Currently, spark-cassandra-connector is compatible with Scala 2.10 and 2.11.

    It worked for me when I updated the Scala version of my project as below:

    ThisBuild / scalaVersion := "2.11.12"
    

    and I updated my dependency to:

    libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.4.0",
    

    If you use "%%", sbt will add your project’s binary Scala version to the artifact name.

    From the sbt shell, run:

    sbt> reload
    sbt> compile
    
  • 2021-02-09 14:44

    Your library dependency conflicts with the Scala version you're using; you need to use 2.11 for it to work. The correct dependency would be:

    scalaVersion := "2.11.7"
    libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.4.1"
    

    Note that you need to change spark-parent to spark-core.
