I added a dependency on Spark that worked in my first project. But when I try to create a new project with Spark, SBT does not import the external jars of org.apache.spark. The
I had a similar problem. It turned out that my build.sbt file was specifying the wrong version of Scala.

If you run spark-shell, at some point it prints the Scala version Spark was built with, e.g.

Using Scala version 2.11.8

I then edited the scalaVersion line in build.sbt to match that version and it worked.
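
For reference, here is a minimal build.sbt sketch; the project name and the Spark version (2.1.0) are assumptions, the key point is that scalaVersion matches what spark-shell reports:

```scala
// Minimal build.sbt sketch -- project name and Spark version are assumptions.
// What matters is that scalaVersion matches the version spark-shell reports
// (here 2.11.8, i.e. the Scala 2.11 series).
name := "my-spark-project"

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0",  // assumed Spark version
  "org.apache.spark" %% "spark-sql"  % "2.1.0"
)
```

Note that `%%` appends the Scala binary version (e.g. `_2.11`) to the artifact name, which is why a mismatched scalaVersion makes SBT look for jars that don't exist.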