I'm building an Apache Spark application in Scala and I'm using SBT to build it. Here is the thing:
The main trick here is to create another subproject that depends on the main project and has all of its provided dependencies in compile scope. To do this, I add the following lines to build.sbt:
lazy val mainRunner = project.in(file("mainRunner"))
  .dependsOn(RootProject(file(".")))
  .settings(
    // pull the Spark modules in with compile scope so they end up on the run classpath
    libraryDependencies ++= spark.map(_ % "compile")
  )
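For reference, this assumes spark is defined elsewhere in build.sbt as the shared sequence of Spark modules, with the root project depending on them in provided scope. A minimal sketch of that part might look like the following (the version number and module list are placeholders, not from the original build):

// assumed to live elsewhere in build.sbt: the shared list of Spark modules (version is a placeholder)
val sparkVersion = "2.4.8"
lazy val spark = Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql"  % sparkVersion
)

// the root project keeps them in provided scope so the assembled jar stays slim
libraryDependencies ++= spark.map(_ % "provided")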
Now I refresh the project in IDEA and slightly change the previous run configuration so that it uses the new mainRunner module's classpath.
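The run configuration still points at the application's existing main class; only the classpath module changes. As a rough illustration (class and app names here are made up, not from the original project), such a main class now finds the Spark classes at runtime because they sit on mainRunner's compile classpath:

// hypothetical main class targeted by the run configuration; names are illustrative only
import org.apache.spark.sql.SparkSession

object Main {
  def main(args: Array[String]): Unit = {
    // local[*] keeps the example runnable straight from the IDE
    val spark = SparkSession.builder()
      .appName("mainRunner-example")
      .master("local[*]")
      .getOrCreate()
    spark.range(10).show()
    spark.stop()
  }
}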
Works flawlessly for me.
Source: https://github.com/JetBrains/intellij-scala/wiki/%5BSBT%5D-How-to-use-provided-libraries-in-run-configurations