IntelliJ IDEA 14: cannot resolve symbol spark

Backend · Unresolved · 6 answers · 1771 views
Asked by 花落未央 on 2021-02-09 13:55

I added a Spark dependency, which worked in my first project. But when I try to make a new project with Spark, my SBT does not import the external jars of org.apache.spark. The …
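
A typical way to declare such a dependency in build.sbt is sketched below; the module (spark-core), Spark version (2.4.0), and Scala version are assumptions for illustration, not values from the original project:

// build.sbt - minimal sketch with assumed versions
scalaVersion := "2.11.12"

// %% appends the Scala binary version, so this resolves spark-core_2.11
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0"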

6 Answers
  •  予麋鹿 (OP)
     2021-02-09 14:39

    Currently, spark-cassandra-connector is compatible with Scala 2.10 and 2.11.

    It worked for me when I updated the Scala version of my project as shown below:

    ThisBuild / scalaVersion := "2.11.12"
    

    and I updated my dependency like this:

    libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.4.0"
    

    If you use "%%", sbt will add your project’s binary Scala version to the artifact name.
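
    For example, the two declarations below are equivalent for the 2.11.12 project above (a small sketch; the explicit _2.11 suffix is what "%%" adds for you):

    // With %%, sbt picks the artifact matching the project's Scala binary version
    libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.4.0"

    // Equivalent explicit form using a single %
    libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.4.0"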

    Then, from the sbt shell, run:

    sbt> reload
    sbt> compile
    
