How to work efficiently with SBT, Spark and “provided” dependencies?


I'm building an Apache Spark application in Scala and I'm using SBT to build it. Here is the thing:

  1. when I'm developing under IntelliJ IDEA, I want the Spark dependencies to be included in the classpath (so I can launch a regular application with a main class)
  2. when I package the application (with sbt-assembly), I want the Spark dependencies to be "provided", so they are not bundled into the fat JAR
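
For reference, a minimal build.sbt sketch of the kind of setup being described (the artifact names are the standard Spark modules; the project name and versions are illustrative, not from the original post):

    // build.sbt — minimal sketch; project name and versions are illustrative
    name := "my-spark-app"
    scalaVersion := "2.12.18"

    // Marked "provided" so sbt-assembly leaves Spark out of the fat JAR,
    // which is exactly what breaks launching the main class from the IDE.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "3.3.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "3.3.0" % "provided"
    )
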
8 Answers

    (Answering my own question with an answer I got from another channel...)

    To be able to run the Spark application from IntelliJ IDEA, you simply have to create a launcher main class in the src/test/scala directory (test, not main). IntelliJ puts "provided" dependencies on the test run classpath, so the application starts with the Spark jars available.

    // src/test/scala/Launch.scala
    // Delegates to the real main class; placed under test sources so that
    // IntelliJ adds the "provided" Spark dependencies to the run classpath.
    object Launch {
      def main(args: Array[String]): Unit = {
        Main.main(args)
      }
    }
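
    For completeness, here is a sketch of what the delegated-to Main might look like; it is hypothetical, not part of the original answer, and any Spark main class works the same way:

    // Hypothetical src/main/scala/Main.scala — illustrative only, not from the original post
    import org.apache.spark.sql.SparkSession

    object Main {
      def main(args: Array[String]): Unit = {
        // local[*] is convenient when launching from the IDE
        val spark = SparkSession.builder()
          .appName("my-spark-app")
          .master("local[*]")
          .getOrCreate()
        spark.range(10).show()
        spark.stop()
      }
    }

    With this in place, an IntelliJ run configuration targeting Launch (under test sources) starts the application with the provided Spark dependencies on the classpath, while sbt-assembly still produces a fat JAR without them.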
    

    Thanks to Matthieu Blanc for pointing that out.
