I'm building an Apache Spark application in Scala, and I'm using SBT to build it. Here is the thing:
(Answering my own question with an answer I got from another channel...)
To be able to run the Spark application from IntelliJ IDEA, you simply have to create a main class in the `src/test/scala` directory (`test`, not `main`). IntelliJ will pick up the `provided` dependencies.
```scala
// Thin launcher placed in src/test/scala, so IntelliJ runs it with
// "provided" dependencies on the classpath. It just delegates to the
// real entry point in src/main/scala.
object Launch {
  def main(args: Array[String]): Unit = {
    Main.main(args)
  }
}
```
Thanks Matthieu Blanc for pointing that out.
For running the Spark jobs, the general solution of "provided" dependencies works: https://stackoverflow.com/a/21803413/1091436
You can then run the app from `sbt`, IntelliJ IDEA, or anything else.
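As a sketch of what the linked answer describes, the `build.sbt` entries mark the Spark artifacts as `provided` so they are excluded from the assembled jar but still available at compile time (the module names and version number here are assumptions; use the ones matching your cluster):

```scala
// build.sbt — Spark is supplied by the cluster at runtime, so mark it
// "provided" to keep it out of the assembly jar.
// (Version is an assumption; match it to your cluster's Spark version.)
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.5.1" % "provided",
  "org.apache.spark" %% "spark-sql"  % "3.5.1" % "provided"
)
```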
It basically boils down to this:
```scala
// Re-wire the run tasks so that "provided" dependencies are included
// on the classpath when running from sbt.
run in Compile := Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)).evaluated,
runMain in Compile := Defaults.runMainTask(fullClasspath in Compile, runner in (Compile, run)).evaluated
```
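For newer sbt versions (1.x), where the `key in Scope` syntax is deprecated, the same settings can be written with slash syntax; this should be equivalent:

```scala
// sbt 1.x slash-syntax form of the same run-task overrides.
Compile / run := Defaults.runTask(
  Compile / fullClasspath,
  Compile / run / mainClass,
  Compile / run / runner
).evaluated,
Compile / runMain := Defaults.runMainTask(
  Compile / fullClasspath,
  Compile / run / runner
).evaluated
```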