How to work efficiently with SBT, Spark and “provided” dependencies?

Asked 2021-01-31 02:47 by 野性不改

I'm building an Apache Spark application in Scala and I'm using SBT to build it. Here is the thing:

  1. when I'm developing under IntelliJ IDEA, I want Spark depend
8 answers
  • 2021-01-31 03:46

    (Answering my own question with an answer I got from another channel...)

    To be able to run the Spark application from IntelliJ IDEA, you simply have to create a main class in the src/test/scala directory (test, not main). IntelliJ will pick up the provided dependencies.

    object Launch {
      // Delegate to the real entry point in src/main/scala.
      // Because this class lives in src/test/scala, the "provided"
      // Spark jars are on the classpath when it runs.
      def main(args: Array[String]): Unit = {
        Main.main(args)
      }
    }
    

    Thanks Matthieu Blanc for pointing that out.
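
    For context, a minimal build.sbt sketch of the setup this trick assumes. The group and artifact IDs are the real Spark coordinates, but the Scala and Spark version numbers below are illustrative, not prescriptive:

    ```scala
    // build.sbt -- minimal sketch; adjust versions to your cluster.
    name := "my-spark-app"
    scalaVersion := "2.12.15"

    // "provided": available at compile time, but excluded from the
    // assembled jar, since the Spark cluster supplies these jars at runtime.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "3.3.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "3.3.0" % "provided"
    )
    ```

    The trick works because sbt (and IntelliJ) put "provided" dependencies on the *test* classpath even though they are excluded from the runtime classpath of src/main/scala.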

  • 2021-01-31 03:51

    For running the Spark jobs, the general solution for "provided" dependencies works: https://stackoverflow.com/a/21803413/1091436

    You can then run the app from sbt, IntelliJ IDEA, or anything else.

    It basically boils down to this:

    run in Compile := Defaults.runTask(
      fullClasspath in Compile,
      mainClass in (Compile, run),
      runner in (Compile, run)
    ).evaluated,
    runMain in Compile := Defaults.runMainTask(
      fullClasspath in Compile,
      runner in (Compile, run)
    ).evaluated
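
    Note that the snippet above uses the pre-1.x key syntax. On sbt 1.x the same recipe reads, in slash syntax (a sketch of the equivalent settings, untested against every sbt release):

    ```scala
    // build.sbt -- sbt 1.x slash syntax for the same idea:
    // re-derive run/runMain from the full compile classpath so that
    // "provided" dependencies are included when running locally.
    Compile / run := Defaults.runTask(
      Compile / fullClasspath,
      Compile / run / mainClass,
      Compile / run / runner
    ).evaluated

    Compile / runMain := Defaults.runMainTask(
      Compile / fullClasspath,
      Compile / run / runner
    ).evaluated
    ```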
    