Question
I'm building an Apache Spark application in Scala, and I'm using SBT to build it. Here is the thing:
- when I'm developing under IntelliJ IDEA, I want Spark dependencies to be included in the classpath (I'm launching a regular application with a main class)
- when I package the application (thanks to the sbt-assembly plugin), I do not want Spark dependencies to be included in my fat JAR
- when I run unit tests through sbt test, I want Spark dependencies to be included in the classpath (same as #1 but from SBT)
To match constraint #2, I'm declaring the Spark dependencies as provided:
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
  ...
)
Then, sbt-assembly's documentation suggests adding the following line to include the dependencies for unit tests (constraint #3):
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
That leaves constraint #1 unfulfilled, i.e. I cannot run the application in IntelliJ IDEA as the Spark dependencies are not being picked up.
With Maven, I was using a specific profile to build the uber JAR. That way, I was declaring Spark dependencies as regular dependencies for the main profile (IDE and unit tests) while declaring them as provided for the fat JAR packaging. See https://github.com/aseigneurin/kafka-sandbox/blob/master/pom.xml
What is the best way to achieve this with SBT?
Answer 1:
Use the new 'Include dependencies with "Provided" scope' option in the IntelliJ run configuration.
Answer 2:
(Answering my own question with an answer I got from another channel...)
To be able to run the Spark application from IntelliJ IDEA, you simply have to create a main class in the src/test/scala directory (test, not main). IntelliJ will then pick up the provided dependencies.
// Thin wrapper that simply delegates to the real entry point defined under src/main/scala.
object Launch {
  def main(args: Array[String]): Unit = {
    Main.main(args)
  }
}
Thanks Matthieu Blanc for pointing that out.
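For reference, Main here is the application's regular entry point under src/main/scala. A minimal sketch of what it might look like (names and settings are illustrative assumptions, not from the original answer):

// Hypothetical entry point living in src/main/scala; app name and master are illustrative.
import org.apache.spark.{SparkConf, SparkContext}

object Main {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("my-app").setMaster("local[*]")
    val sc = new SparkContext(conf)
    // ... actual job logic goes here ...
    sc.stop()
  }
}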
Answer 3:
You need to make IntelliJ work with the provided dependencies.
The main trick here is to create another subproject that will depend on the main subproject and will have all its provided libraries in compile scope. To do this I add the following lines to build.sbt:
lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= spark.map(_ % "compile")
)
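Here spark is assumed to be a Seq of Spark module IDs declared elsewhere in build.sbt as provided, along these lines (a sketch, not taken from the original answer):

// Assumed definition of the spark value referenced above; modules and versions are illustrative.
lazy val spark = Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion
)

libraryDependencies ++= spark.map(_ % "provided")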
Now I refresh the project in IDEA and slightly change the previous run configuration so that it uses the new mainRunner module's classpath.
Works flawlessly for me.
Source: https://github.com/JetBrains/intellij-scala/wiki/%5BSBT%5D-How-to-use-provided-libraries-in-run-configurations
Answer 4:
A solution based on creating another subproject for running the project locally is described here.
Basically, you would need to modify the build.sbt file with the following:
lazy val sparkDependencies = Seq(
  "org.apache.spark" %% "spark-streaming" % sparkVersion
)

libraryDependencies ++= sparkDependencies.map(_ % "provided")

lazy val localRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile")
)
Then run the new subproject locally by selecting Use classpath of module: localRunner in the Run Configuration.
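If you also want to launch it from the sbt shell rather than the IDEA run configuration, note that the subproject has no sources of its own, so its run task will not discover a main class automatically. One way around that (a sketch, assuming sbt 0.13-style syntax and a hypothetical fully qualified class name) is to set the main class explicitly:

lazy val localRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile"),
  // Hypothetical: point run at the main class compiled in the root project.
  mainClass in (Compile, run) := Some("com.example.Main")
)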
Answer 5:
[Obsolete] See the newer answer above: use the 'Include dependencies with "Provided" scope' option in an IntelliJ run configuration.
The easiest way to add the provided dependencies to debug a task with IntelliJ is to:
- Right-click src/main/scala
- Select Mark Directory as... > Test Sources Root
This tells IntelliJ to treat src/main/scala as a test folder, for which it adds all the dependencies tagged as provided to any run config (debug/run).
Every time you do an SBT refresh, redo these steps as IntelliJ will reset the folder to a regular source folder.
Answer 6:
You should not be looking at SBT for an IDEA-specific setting. First of all, if the program is supposed to be run with spark-submit, how are you running it in IDEA? I am guessing you'd be running it standalone in IDEA, while running it through spark-submit normally. If that's the case, manually add the Spark libraries in IDEA, using File | Project Structure | Libraries. You'll see all the dependencies listed from SBT, but you can add arbitrary jar/Maven artifacts using the + (plus) sign. That should do the trick.
Answer 7:
For running Spark jobs, the general solution of "provided" dependencies works: https://stackoverflow.com/a/21803413/1091436
You can then run the app from either sbt, or IntelliJ IDEA, or anything else.
It basically boils down to this:
run in Compile := Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)).evaluated,
runMain in Compile := Defaults.runMainTask(fullClasspath in Compile, runner in(Compile, run)).evaluated
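Note the trailing comma: these two lines are meant to live inside a settings block. A sketch of how they could sit next to the provided Spark dependencies (the project layout and module list are assumptions, sbt 0.13-style syntax):

// Sketch: provided Spark dependencies plus run/runMain overrides in a single project definition.
lazy val root = (project in file("."))
  .settings(
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
    ),
    run in Compile := Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)).evaluated,
    runMain in Compile := Defaults.runMainTask(fullClasspath in Compile, runner in (Compile, run)).evaluated
  )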
Answer 8:
Why not bypass sbt and manually add spark-core and spark-streaming as libraries to your module dependencies?
- Open the Project Structure dialog (e.g. ⌘;).
- In the left-hand pane of the dialog, select Modules.
- In the pane to the right, select the module of interest.
- In the right-hand part of the dialog, on the Module page, select the Dependencies tab.
- On the Dependencies tab, click the + (add) icon and select Library.
- In the Choose Libraries dialog, select New Library > From Maven.
- Find spark-core, e.g. org.apache.spark:spark-core_2.10:1.6.1
- Profit
https://www.jetbrains.com/help/idea/2016.1/configuring-module-dependencies-and-libraries.html?origin=old_help#add_existing_lib
Source: https://stackoverflow.com/questions/36437814/how-to-work-efficiently-with-sbt-spark-and-provided-dependencies