I'm building an Apache Spark application in Scala and I'm using SBT to build it. Here is the thing:
Why not bypass sbt and manually add spark-core and spark-streaming as libraries to your module dependencies?
org.apache.spark:spark-core_2.10:1.6.1
https://www.jetbrains.com/help/idea/2016.1/configuring-module-dependencies-and-libraries.html?origin=old_help#add_existing_lib
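For reference, and assuming the same Scala 2.10 / Spark 1.6.1 combination as above, the two coordinates to add (for example via the "From Maven..." option in the library dialog) would be:
org.apache.spark:spark-core_2.10:1.6.1
org.apache.spark:spark-streaming_2.10:1.6.1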
[Obsolete] See the newer answer, "Use the new 'Include dependencies with "Provided" scope' in an IntelliJ configuration", below.
The easiest way to add provided dependencies to debug a task with IntelliJ is to:
Right-click src/main/scala and choose Mark Directory as... > Test Sources Root.
This tells IntelliJ to treat src/main/scala as a test folder, for which it adds all the dependencies tagged as provided to any run config (debug/run).
Every time you do an SBT refresh, redo these steps, as IntelliJ will reset the folder to a regular source folder.
A solution based on creating another subproject for running the project locally is described here.
Basically, you would need to modify the build.sbt file with the following:
lazy val sparkDependencies = Seq(
  "org.apache.spark" %% "spark-streaming" % sparkVersion // sparkVersion is assumed to be defined elsewhere in the build, e.g. "1.6.1"
)

libraryDependencies ++= sparkDependencies.map(_ % "provided")

lazy val localRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile")
)
And then run the new subproject locally with "Use classpath of module: localRunner" under the Run Configuration.
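Note that when the job is launched from IntelliJ rather than through spark-submit, the master also has to be set in code (or via spark.master). A minimal sketch of such an entry point, with hypothetical class, app, and source names, could look like this:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Hypothetical entry point used only for local debugging from the IDE.
object LocalDebugApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("local-debug")  // placeholder app name
      .setMaster("local[*]")      // run on all local cores instead of a cluster
    val ssc = new StreamingContext(conf, Seconds(1))

    // Hypothetical source, just so the streaming context has an output operation.
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.print()

    ssc.start()
    ssc.awaitTermination()
  }
}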
Use the new 'Include dependencies with "Provided" scope' in an IntelliJ configuration.
The main trick here is to create another subproject that will depend on the main subproject and will have all its provided libraries in compile scope. To do this I add the following lines to build.sbt:
lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= spark.map(_ % "compile") // `spark` is the Seq of Spark dependencies declared as "provided" in the main project
)
Now I refresh the project in IDEA and slightly change the previous run configuration so that it uses the new mainRunner module's classpath (Use classpath of module: mainRunner).
Works flawlessly for me.
Source: https://github.com/JetBrains/intellij-scala/wiki/%5BSBT%5D-How-to-use-provided-libraries-in-run-configurations
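Putting it together, a minimal build.sbt sketch for this approach (assuming Spark 1.6.1 and a dependency list named spark, as in the snippet above) would be:

val sparkVersion = "1.6.1" // assumption: match the Spark version of your cluster

lazy val spark = Seq(
  "org.apache.spark" %% "spark-core"      % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion
)

// Root project: Spark stays "provided", so spark-submit supplies it at runtime.
libraryDependencies ++= spark.map(_ % "provided")

// Extra subproject used only from IntelliJ, with the same libraries in "compile" scope.
lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= spark.map(_ % "compile")
)

With this in place, select mainRunner under "Use classpath of module" in the run configuration, as described above.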
You should not be looking at SBT for an IDEA-specific setting. First of all, if the program is supposed to be run with spark-submit, how are you running it in IDEA? I'm guessing you run it standalone in IDEA, while normally running it through spark-submit. If that's the case, add the Spark libraries manually in IDEA, using File | Project Structure | Libraries. You'll see all dependencies listed from SBT, but you can add arbitrary jar/Maven artifacts using the + (plus) sign. That should do the trick.