How to work efficiently with SBT, Spark and “provided” dependencies?

野性不改 · 2021-01-31 02:47

I'm building an Apache Spark application in Scala, and I'm using SBT to build it. Here is the thing:

  1. when I'm developing under IntelliJ IDEA, I want the Spark dependencies to be on the classpath, so that I can run and debug a regular main class locally;
  2. but when I package the application for a cluster, the Spark dependencies should be in "provided" scope, so that they are not bundled into the fat JAR (a sketch of such a build.sbt follows below).
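
For context, here is a minimal sketch of the kind of build.sbt this question assumes; the Spark version, module list, and project name are illustrative assumptions, not taken from the original question:

    // build.sbt -- illustrative sketch, not the asker's actual file
    val sparkVersion = "2.4.8"  // assumed version

    // Declared "provided": on the compile classpath, excluded from the fat JAR
    lazy val spark = Seq(
      "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
      "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided"
    )

    lazy val root = (project in file("."))
      .settings(
        name := "my-spark-app",  // assumed name
        scalaVersion := "2.12.15",
        libraryDependencies ++= spark
      )

The answer below reuses such a spark Seq to re-add the same modules in compile scope for the IDE.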
8 answers
  •  慢半拍i (OP)
     2021-01-31 03:44

    The goal is to make the application runnable from IntelliJ IDEA while keeping the Spark dependencies in "provided" scope.

    The main trick here is to create another subproject that will depend on the main subproject and will have all its provided libraries in compile scope. To do this I add the following lines to build.sbt:

    // mainRunner depends on the root project and re-declares the same Spark
    // modules, re-scoped from "provided" to "compile", so that an IDEA run
    // configuration using mainRunner's classpath sees them at runtime.
    lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
      libraryDependencies ++= spark.map(_ % "compile")
    )
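
    The re-scoping works because applying % with a configuration string to a ModuleID replaces any configuration set earlier, so the same spark Seq declared as "provided" for the root project can be reused here in "compile" scope without being redefined.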
    

    Now I refresh the project in IDEA and slightly change the previous run configuration so that it uses the new mainRunner module's classpath ("Use classpath of module" in the run configuration dialog).

    Works flawlessly for me.
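
    The same trick works from the sbt shell as well: mainRunner/runMain com.example.Main (a hypothetical class name, not from the original answer) runs the application with the Spark dependencies on the classpath.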

    Source: https://github.com/JetBrains/intellij-scala/wiki/%5BSBT%5D-How-to-use-provided-libraries-in-run-configurations
